Security and feudalism: Own or be pwned

From The Cory Doctorow Wiki

Metadata

Published

December 1, 2016

Download

Ogg (20 MB).

About

«Cory Doctorow explains how EFF is battling the perfect storm of bad security, abusive business practices, and threats to the very nature of property itself, fighting for a future where our devices can be configured to do our bidding and where security researchers are always free to tell us what they've learned.»

License

Published by O'Reilly under a standard YouTube license

Transcript

So you will maybe know that last March HP pushed a security update to OfficeJet and OfficeJet Pro printers that people had bought and installed in the wild, and tens if not hundreds of millions of users were alerted to the presence of a security update by a notification on the little LCD on the front of the printer. And most of them ran the update. HP assumes that about two-thirds of users run updates when delivered this way, and I've spoken to some ex-HP people on background who think it's more like 95%. The update contained maybe some security code, we don't know, but it also contained a hidden counter that was ticking down to mid-September, a couple of months ago, and when it detonated it activated a self-destruct sequence in the printers that turned on an otherwise hidden feature: the cartridges and the printer would do a cryptographic handshake, so that the printer could identify and reject third-party ink cartridges and thus force people to buy HP's ink – at, you know, a one-million-percent markup, more expensive than vintage champagne.

And no one was really sure what was happening when these printers all started spitting out their pacifiers and stopped accepting third-party ink cartridges. Some people took several known-good cartridges and put them in, and when their printer wouldn't use any of them they assumed the printer was dead and threw it in the garbage. But after thousands of complaints flooded into the third-party ink cartridge companies, they started to figure out exactly what had happened with this "Manchurian Candidate" security update that was waiting to wake up in September. HP had deliberately reached into millions of customers' living rooms and offices and broken their lawfully acquired property to punish them for not ordering their affairs in the way that was most advantageous to HP shareholders.

Now you may think that this is just a kind of garden-variety corporate ripoff. But it has much deeper implications, thanks to a late-twentieth-century copyright law that has lain mostly dormant for the last fifteen years but has come into its own in this decade in a way that's really toxic. I'm talking about the Digital Millennium Copyright Act, or DMCA, which Congress passed in 1998. It's a kind of gnarly hairball of copyright law with a lot of different provisions, but the one I'm going to talk about today is Section 1201, called the 'anti-circumvention rule'. Under DMCA 1201, tampering with an access control that restricts access to a copyrighted work can potentially give rise to both civil and criminal liability. And not a little liability: the criminal provisions allow for a five-year prison sentence and a $500,000 fine, for a first offense, for tampering with access controls on copyrighted works.

Now when 1201 passed it was mostly designed to protect the business models of companies like Sega and people who made DVD players. DVD vendors could add region codes to their disks and then use license agreements to make every manufacturer check for those region codes and respond to them, so that you can't buy a DVD in one country and watch it in another. Now note here that buying a DVD from the company that made it, at the price that they charged, and playing it in a DVD player is not piracy. Right? It's the actual opposite of piracy: it's buying media and paying for it. And as for companies like Sega, well, DMCA 1201 protected the Dreamcast business model, which was kind of a forebear to the App Store business model. If you were a company and you wanted to sell a Sega game to a Sega owner, Sega wanted you to press the CD on their presses, which carried a very high markup and effectively gave them a royalty on every game sold for the platform. So everyone who played Sega games paid a little hidden Sega tax buried in the price of the disk, and it meant that Sega could decide who could make software for the platform they created. And once again, remember that buying a copyrighted game from the people who made it is the opposite of piracy. But because you had to break DRM to watch an out-of-region disc or to play a game that hadn't come from Sega's presses, those non-piracy acts could be punished under anti-piracy law.

Every business has a mix of commercial preferences and legal rights. Sega had the legal right to prevent you from making your own clone Sega and selling it at a price much cheaper than their own. It also had a commercial preference for being the gatekeeper and toll collector on every game that was ever played on the Sega console. And by designing the consoles so that you had to bypass the DRM in order to play a third-party game, they could convert that commercial preference into a legal right. And this is practically a license to print money. Every CEO has at some point in their existence said, 'Gosh, if only it was illegal to frustrate my business model. If only doing something that was suboptimal from my business perspective was a literal felony, the world would be an easier place to navigate.'

So companies started to wake up to the potential of the DMCA and poke around the edges of it. In 2002–2003 a couple of hardware companies tried to get in on the act. One was Chamberlain, who make garage door openers. They had a commercial preference for being the only company that could sell spare garage door clickers, so they designed their devices to only work with their original equipment and charged a very high margin on their little clickers. And competitors figured out that a high margin for one company represents an opportunity for another company that's willing to collect a lower margin. If they've got a 500% markup, maybe you can sell at a 100% markup. Still a pretty respectable business. Now that's normal. But when a competitor called Skylink made a compatible clicker, Chamberlain sued them under Section 1201 of the DMCA. They said that to make a compatible garage door clicker you had to break DRM, and thus this otherwise legal and normal competitive activity, a feature of all markets, became a literal federal crime. The court didn't buy it. They said the DMCA protects access controls on copyrighted works, and there isn't a copyrighted work in a clicker. So Chamberlain couldn't make breaking their business model into a felony.

Then came Lexmark, formerly IBM's printer division. They made laser toner cartridges that had a chip running a 12-byte program. Back then it was really expensive to put chips in cartridges, so you had very small programs in them, and this one recorded the toner level of the cartridge. When the cartridge ran dry, even if you refilled it, the printer wouldn't use it, because the chip was saying 'it's an empty cartridge'. So one of their competitors, a company called Static Control, reverse engineered that 12-byte program – not hard, a 12-byte program – and made a compatible product that reset the chip when you refilled the cartridge. And Lexmark, too, sued Static Control under Section 1201 of the DMCA. But they argued that, unlike the garage door case, they actually did have a copyrighted work in their cartridges that was being protected by their DRM: that 12-byte program. The court wasn't having it. They said, 'Yes, software can qualify for copyright, but 12 bytes is too little. It's below the threshold at which a new copyright is created.' As a funny coda, the way this ends is that Static Control was bought out by a big hedge fund, which they then asked to buy Lexmark, and so now Lexmark is a division of Static Control.
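The behavior of that tiny cartridge chip can be sketched in a few lines. This is a toy model only – the real 12-byte program's format and logic are not public, and the class and method names here are illustrative assumptions:

```python
class TonerChip:
    """Toy model of a cartridge chip whose counter latches 'empty'."""

    def __init__(self, pages: int):
        self.pages_left = pages

    def record_page(self):
        if self.pages_left > 0:
            self.pages_left -= 1

    def is_empty(self) -> bool:
        # Once the counter hits zero the chip reports empty forever,
        # even if the cartridge is physically refilled with toner.
        return self.pages_left == 0


chip = TonerChip(pages=2)
chip.record_page()
chip.record_page()
assert chip.is_empty()   # the printer now refuses the cartridge

# Static Control's product, in effect, reset this counter on refill:
chip.pages_left = 2
assert not chip.is_empty()
```

The point of the lawsuit was not the complexity of this logic – it is trivial – but whether those few bytes counted as a copyrighted work protected by DRM.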

That brings us back to HP. Back in 2003, even a company making the kind of crazy margins on cartridges that Lexmark was making couldn't afford to put a very powerful computer in them. But the cost of computing has crashed, and that's not true anymore. The cartridges in an HP printer don't have twelve bytes of code; they have thousands of lines of code. So much code that there's no question at all as to whether an access control on an HP cartridge restricts access to a copyrighted work. If the thousands of lines of code in HP's cartridges aren't copyrightable, then neither is half the GPL-licensed code on GitHub, and if that's the case, the GPL doesn't apply to it either.

[08:13] So when HP turns its commercial preference – for you spending more on ink than you'd spend on vintage champagne – into a legal right to reach into your house and reconfigure your legitimate, lawfully purchased property so that you have to do that, we're into some scary new legal territory. Now, it's not hard to figure out how to make a compatible cartridge that defeats this new DRM. They're doing something like an interactive protocol. What probably happens is that the printer generates a challenge, it sends it to the cartridge, the cartridge signs it with a key that's in the chip and sends it back, and they decide whether or not it's an original cartridge. So to make a compatible one that breaks that DRM, all you need to do is extract the secret from the chip. And if you want to have a crack at doing that, you just go to any office supply store, or even a recycling depot, and you can get a chip that you can bring home or to your lab, and you can decap it, you can fuzz it, you can stick it in an electron tunneling microscope and have a go at extracting those keys.
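The handshake described here is a standard challenge-response scheme. A minimal sketch in Python – HMAC stands in for whatever proprietary algorithm HP actually uses, and the secret, function names, and message sizes are all illustrative assumptions:

```python
import hashlib
import hmac
import os

# Assumed shared secret burned into every genuine cartridge chip.
# Extracting this one value from any single cartridge defeats the
# scheme for all of them -- the "safe in the bank robber's living
# room" problem described in the talk.
CARTRIDGE_SECRET = b"example-secret-not-the-real-one"


def printer_challenge() -> bytes:
    """Printer side: generate a fresh random challenge."""
    return os.urandom(16)


def cartridge_respond(challenge: bytes, secret: bytes) -> bytes:
    """Cartridge side: sign the challenge with the embedded secret."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()


def printer_verify(challenge: bytes, response: bytes) -> bool:
    """Printer side: accept the cartridge only if the response matches."""
    expected = hmac.new(CARTRIDGE_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


challenge = printer_challenge()
response = cartridge_respond(challenge, CARTRIDGE_SECRET)
assert printer_verify(challenge, response)          # genuine cartridge accepted
assert not printer_verify(challenge, b"\x00" * 32)  # unkeyed clone rejected
```

The cryptography itself is sound; the weakness is exactly the one the talk identifies. The verifying secret has to live inside hardware the adversary owns, so a single successful key extraction yields working third-party cartridges forever.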

[9:14] DRM systems are built by skilled engineers who spend millions of dollars and years on them, and then they're broken in days by amateurs with hobbyist equipment. It's not because those engineers are dumb. It's because they're doing something dumb. Only an idiot hides signing keys in equipment that you then hand to your adversary, for the same reason that only an idiot would design even a really good bank safe to be kept in the bank robber's living room instead of in the vault. So it's not hard to break HP's DRM. But it is legally terrifying. If you're an investor or a retailer or any other necessary party in the value chain, you should be justifiably afraid that this Fortune 100 company, with billions in cash and its business model on the line, will use the DMCA to punish you for helping its customers figure out how to defeat the business model on which its flagship product depends.

[10:11] Now, printer cartridges aren't the only software-equipped devices in our world today. Software, as Marc Andreessen reminded us, is "eating the world". It's hard to overstate the cheapness of computing power and the number of things that have software in them today. I just read Bunnie Huang's forthcoming book about hardware hacking, and he talks about what he learned when he was reverse engineering counterfeit SD cards. SD cards are made from tons of recycled flash, and that flash tends to be pretty janky and have lots of bad sectors. You could fix that at the factory in a quality assurance process, but it turns out that it's cheaper to put a whole system-on-a-chip in every one of those little crappy you-lose-them-you-don't-even-bother-looking-for-them SD cards to act as a drive controller and mark off the bad sectors as they emerge.
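The drive-controller trick described here amounts to bad-block remapping: the controller transparently redirects failing sectors to spares so the host never sees them. A toy model (not Huang's actual findings – the data structures and names are illustrative assumptions):

```python
class FlashController:
    """Toy bad-block manager: remap failing sectors to spare sectors."""

    def __init__(self, sectors: int, spares: int):
        self.data = {}     # physical sector -> contents
        self.remap = {}    # logical sector -> spare physical sector
        self.free_spares = list(range(sectors, sectors + spares))

    def mark_bad(self, sector: int):
        # When a sector starts failing, transparently redirect it.
        self.remap[sector] = self.free_spares.pop()

    def write(self, sector: int, value: bytes):
        self.data[self.remap.get(sector, sector)] = value

    def read(self, sector: int) -> bytes:
        return self.data[self.remap.get(sector, sector)]


ctrl = FlashController(sectors=100, spares=4)
ctrl.mark_bad(7)                 # physical sector 7 develops errors
ctrl.write(7, b"hello")          # the host still addresses sector 7...
assert ctrl.read(7) == b"hello"  # ...but the data lives in a spare
assert 7 not in ctrl.data        # physical sector 7 is never touched
```

This is why shipping janky flash can be cheaper than testing it: a few cents of embedded compute papers over the defects at runtime.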

[10:59] So embedded Linux has made this jump. It started in DVRs and it moved to home routers, then to network-attached storage, and now to medical implants and smart light bulbs. That means that we have intelligence in everything and copyrightable works in everything. And the bad news is that there is no Internet of Things hardware business model. Hardware starts at something like a two percent margin and declines steeply from there into a negative margin as soon as your product becomes popular enough for someone to clone in the Pacific Rim. So the only way that you can go to a venture capitalist (VC) and get capital for your hardware business is to convince them that you're going to make your money not on the hardware but on the ecosystem. You're going to be the only supplier of service or parts or consumables – or you'll be able to get data about the owners of these devices that no one else can get, by making them into privacy-sucking surveillance devices. Now obviously, making sure that people buy parts and services and consumables from you is not your legal right. That's just your commercial preference. But if you design the device so that adding third-party ink, or putting a third-party part in a car, or getting a third-party mechanic to figure out what's wrong with your tractor, requires bypassing DRM – you can convert your commercial preference into an ironclad legal right. And that has meant that DRM has metastasized into domains that we never even imagined it would show up in. That's why DRM is now in cars and tractors, and pacemakers, and implanted defibrillators, and cochlear implants, and insulin pumps, and thermostats, and voting machines, and this, which debuted at the Consumer Electronics Show (CES) last year: the Internet of Things rectal thermometer, which has put DRM up our literal asses.

[12:47] So, so far this is a consumer story. It's a story about whether or not you get what you think you paid for. But there is a security dimension here, and that's why I'm talking to you about it today. Because DRM works by hiding keys in user-accessible equipment, and that requires obfuscation. You can't show the user how their equipment works if you're hoping to prevent them from reconfiguring their equipment. We don't have the normal crypto situation in DRM. Normally in crypto it's Alice and Bob and Carol: Alice and Bob need to talk to each other, and Carol is trying to attack them to find out what they're saying. But in the DRM crypto model you've just got Alice and Bob. Netflix is Bob; it wants to send you – Alice – a video, and it wants to send you the key and a player for it. But it doesn't want you to figure out where the key is in that equipment it just sent you. We have a technical name for this in security circles: it's called wishful thinking.

[13:47] But even so, well-designed DRM becomes fertile soil for malware problems. In the mid-2000s Sony snuck rootkits onto six million CDs – 51 titles – that they shipped out to people. If you put one in a Windows machine, it had a second session that autoran and patched your kernel with a rootkit that hid certain files and processes from you, so that Sony could run a secret program that tried to stop you from ripping CDs. That quickly spread: 200,000 to 300,000 US government and military networks were infected with it. And just as quickly, malware writers realized that if they used the same trick Sony was using to disguise its anti-ripping software, then if their malware landed on a computer that had already been infected by Sony's rootkit, it could ride under the same cloak that Sony had designed. Because if you design a system that treats its owner as an attacker, that system necessarily prevents its owner from figuring out whether something bad is happening inside it.

[14:51] So remember that DRM doesn't actually work very well – this is the other side of the security picture. The DMCA necessarily has to prohibit disclosure of defects in DRM systems, because if you know about the errors the programmers made, you can figure out how to start unraveling the DRM and jailbreaking the system. So in the summer of 2015 the US Copyright Office held hearings to find out whether or not this was interfering with security researchers. After all, America put a security researcher in jail once for disclosing defects in Adobe's e-book reader software and its DRM. And a who's who of security research wrote to the Copyright Office to say that they had discovered ghastly, dangerous defects in systems that people relied on for life and limb, and that their general counsels hadn't allowed them to come forward because they were worried about DMCA liability. Researchers like Edward Felten, who's now the deputy CTO of the White House, Jay Radcliffe from Rapid7, Alex Halderman from the University of Michigan, Matt Green from Johns Hopkins, Bruce Schneier, and many others.

[16:00] Under the DMCA we've reached a situation where researchers need permission from companies to disclose, without legal risk, the defects they find in those companies' products. Which gives companies a veto over embarrassing news about their own products. And for obvious reasons, corporations are not the right custodians of facts that might embarrass them. And preventing disclosure does not prevent discovery. It just means that the vulnerabilities that researchers discover and can't tell us about don't become public knowledge until they're so widely exploited in the wild that you can't help but find out about them. This is why the Internet of Things dumpster fire has been allowed to rage. This is why Brian Krebs in September faced a 620-gigabit-per-second DDoS attack in retaliation for outing a couple of petty Israeli denial-of-service criminals, who were able to harness the Mirai Internet of Things worm to direct an attack that we would normally associate with nation-state actors. But it didn't originate from China or Russia. It originated from a couple of dumdums running a crappy crimeware company. Now, the source code for Mirai, the Internet of Things malware that was used to attack Krebs, was dumped a week later. And the analysts who looked at it said it was amateurish and clumsy. And a week after that, that amateurish and clumsy malware had found such a hospitable environment in the even more amateurish, even more clumsy environment of the Internet of Things that it had infected systems in every country on the planet with reliable electricity and internet service.

[17:45] Ten days after that, Mirai was used to direct floods of traffic against core infrastructure – Level 3, Dyn DNS, PayPal, Twitter, Netflix – knocking out some of the Internet's best-defended services. We're going to be fighting this fire for a long time. There is no obvious way to patch most of these systems, meaning that even if the next generation fixes these problems, and even if we recall the old ones, we will still struggle with the millions of installed stupid "smart bulbs", PVRs, CCTVs, and rectal thermometers.

[18:15] But attacks that harness insecure devices to attack people other than their owners, that's just the beginning. The real risk comes when these devices – these devices that are designed to treat their owners as attackers and obfuscate their operations, that are designed to gather as much information about their owners as possible in case that turns out to be a business model, that are designed to be illegal to report vulnerabilities in – what happens when those devices are used to direct attacks against their owners? We know what that looks like, because we've seen it for a long time.

[18:47] You'll remember that in 2013, Miss Teen USA, Cassidy Wolf, got drive-by malware on her computer: a remote access trojan that allowed her attacker to secretly capture incidental nude images of her as she walked in front of her laptop, as well as the passwords to her social media accounts. And he tried to blackmail her, an underage young woman, into performing live sex acts on camera, or he would put those nude photos online. When they finally arrested him – because she went to the FBI – they found out that most of his victims had not gone to the FBI. He had well over 100, in several countries, including many minor children.

[19:19] You'll remember that last summer, at Def Con, the belle of the ball was the Jeep hack: 1.4 million Chrysler Jeeps recalled because it turned out that their wifi-hotspot-on-demand could be harnessed to control their brakes, steering, transmission, and all other significant functions over the internet. In January 2016 in San Francisco, a mom's three-year-old kept saying that the phone in his room – that's what he called his baby monitor – was scaring him at night. One night as she passed by his room she heard a stranger's voice swearing at her child, and she walked into the room and the little camera on it swiveled around and some rando said, 'Uh oh, mommy's here', and the voices stopped.

[20:04] With DMCA 1201 we have given companies every incentive they need to use DRM in their products and prevent us from figuring out whether or not there's something wrong with them until it's too late. But we've also given those companies the ability to end private property as we've understood it for hundreds and thousands of years. Because if the dead hand of the manufacturer can rest on your lawfully acquired property even after you've purchased it in full, ready to slap you any time you commit the sin of not ordering your affairs to the maximal benefit of its shareholders, then you are not the owner of that property anymore. They are. We are one RFID system away from a dishwasher that won't take third-party dishes. We are one vision system away from a toaster that won't take unauthorized bread.

[20:56] Now, we have a name for systems where only one special class of people gets to own property and everyone else has to rent it from them. That's called feudalism. And in feudalism, everyone who's not a lord is a peasant or a vassal: a tenant farmer who is reliant on the mercy of the local lord for the ability to earn their living. In DRM feudalism the aristocracy aren't even flesh and blood. They're artificial, immortal, trans-human life forms called 'limited liability corporations' that see us alternately as their food source and inconvenient gut flora. And it's only going to get worse.

[21:33] The World Wide Web Consortium (W3C), once the great bastion of open web standards, is working to standardize digital rights management as part of the core suite of HTML5 browser standards. They're working on a project called "Encrypted Media Extensions". They're doing this as part of a wider project to make browsers into the control surface for the Internet of Things, to help sunset apps and the walled gardens that they represent, and to bring back the open, federated Internet. When we told the World Wide Web Consortium that they shouldn't do this, that they shouldn't make the control surface for the Internet of Things off limits to security research, they said, 'Your problem isn't with DRM, it's with the DMCA.' So we said, 'Fine, make W3C membership contingent on promising not to abuse the DMCA and laws like it to attack security researchers, to attack people making lawful, interoperable products, and to attack people who create accessibility features for products.' That came to a vote that closed last night.

[22:33] We proposed that the W3C should require its membership to promise not to turn this technical standard into an immortal, pluripotent legal weapon – that they'd have to promise not to abuse the DMCA and laws like it. And we were backed by some of the world's largest research organizations, like Oxford University, and some of the world's leading disability rights organizations, like the Royal National Institute of Blind People, and by a list of literally hundreds of the world's leading security researchers, including Ross Anderson, Bruce Schneier, Matt Green, Matt Blaze, and many others whose names you'll recognize.

[23:10] And now this is in the hands of the W3C executives who get to decide; Are they in the standards business, or are they in the business of arming the world's largest, most powerful corporations to decide who gets to improve their products, who can add accessibility features to them, and who gets to warn their customers about defects that put them at risk?

[23:30] So let's go back to printers. In 2011 a researcher named Ang Cui, who's at Columbia University here in New York, published a paper called "Print Me If You Dare" that detailed the research he'd done into HP's laser printers. The first thing he discovered is that the way you update the firmware on an HP printer is by sending it a document that has a hidden code that says "operating system starts here". Everything after that is not checked and is uploaded into the nonvolatile memory as the new operating system for the printer. So he created poisoned documents that would reflash any printer that printed them – documents with names like "resume.doc" that you could send to the HR department. And after a printer had been compromised by one of his documents, it would send him copies of everything that got sent to it. It would also separately collect and send to him anything that looked like a Social Security number or a credit card number. And it would also crawl the LAN for any machines with known vulnerabilities, compromise them, and then open a reverse shell to him so that he could control the entire LAN, having punched through the firewall.
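The flaw Cui exploited is an unauthenticated firmware-update path: anything after a magic marker is trusted and flashed. The standard fix is to refuse any image that doesn't carry a valid signature from the vendor. A sketch of both patterns – the marker string, the key, and every name here are illustrative assumptions, not HP's actual code, and a real design would use asymmetric signatures rather than the HMAC used here for brevity:

```python
import hmac
from hashlib import sha256

FIRMWARE_MARKER = b"@FIRMWARE_STARTS_HERE@"  # hypothetical magic marker
VENDOR_KEY = b"vendor-signing-key"           # stand-in for the vendor's key


def extract_firmware(print_job: bytes):
    """Pull out whatever follows the marker, if anything."""
    idx = print_job.find(FIRMWARE_MARKER)
    if idx == -1:
        return None
    return print_job[idx + len(FIRMWARE_MARKER):]


def flash_unsafe(print_job: bytes) -> bool:
    """The pattern the paper describes: no authentication at all."""
    return extract_firmware(print_job) is not None


def flash_safe(print_job: bytes, signature: bytes) -> bool:
    """Fixed pattern: flash only images signed by the vendor."""
    fw = extract_firmware(print_job)
    if fw is None:
        return False
    expected = hmac.new(VENDOR_KEY, fw, sha256).digest()
    return hmac.compare_digest(expected, signature)


evil = b"resume text..." + FIRMWARE_MARKER + b"malicious firmware image"
assert flash_unsafe(evil)                  # vulnerable printer accepts it
assert not flash_safe(evil, b"\x00" * 32)  # signature check rejects it
```

Note the asymmetry with cartridge DRM: here the verifying party (the printer checking the vendor's signature) never has to reveal a secret to the attacker, so this kind of authentication can actually work.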

[24:35] He picked on HP because they had tens of millions of units in the field. You rob banks because that's where the money is. And now HP has started using security updates to transmit these sneak attacks, these time bombs, against their own customers' property. That means that for the first time these tens or hundreds of millions of devices in the field are owned by people who have a damn good reason not to run security updates.

[25:03] HP is a dress rehearsal for what the future of the Internet of vulnerable, illegal-to-audit things on fire looks like. Every incentive that HP had to remotely break its customers' property is present for all of those IoT companies. So we need to fix this. We need to adopt principles that expand on that EFF proposal to the W3C – that as a condition of membership you agree not to use the DMCA to attack security research – into a set of principles that we bring into all of our work.

[25:32] I'm going to suggest two principles for you today. The first one is that any time a system gets conflicting instructions from a remote party and its owner, the owner always wins. The second one is that true facts about the security of systems that people rely on are always legal to disclose. I charge you to be hardliners on these principles. If they're not calling you a fanatic and an unrealistic puritan about this, you are not trying hard enough. If you aren't totally uncompromising on these principles, you are setting the stage for long-term harms that are worse than any short-term benefit you could gain by making that compromise. If you computerize the world and don't safeguard the users of computers from coercive control, history will remember you not as the heroes of progress, but as the blind handmaidens of tyranny.

[26:26] So how do we fix this? Well, we're not going to do it in onesies-twosies. No one of you is going to be able to solve this problem, just like no one of you can recycle your way out of climate change. It's not a matter of individual choices; it's a matter of collective action that can make deep, structural changes in the way that our information economy works. And at the EFF we're doing something about this. We have a project, Apollo 1201, whose goal is to kill all the DRM in the world within a decade. We started with a lawsuit against the US government, challenging the constitutionality of Section 1201 of the DMCA. And our two clients in this suit: one is a Johns Hopkins security researcher named Matthew Green, and the other is MIT Media Lab adjunct Bunnie Huang.

[27:10] And this lawsuit is going to run for years to come. In so doing it's creating a new era in the history of DMCA 1201: an era of indeterminacy, where it's not clear whether that law will be found to be enforceable. And while we're winding our way up to the Supreme Court, risk-tolerant designers, security researchers, and entrepreneurs can short DRM and go long on a technologically free future by taking action based on our legal theories. And as legal protection for DRM erodes in the US, all those other countries that the US arm-twisted into adopting their own versions of DMCA 1201 will have no good reason to keep the law on their books anymore. When Americans are jailbreaking and exporting jailbreaking tools, preserving legal protection for DRM in the UK or Canada or Hungary won't stop people in those countries from jailbreaking. It'll just mean that they buy their jailbreaking tools from America, the country that made those countries promise not to have an industry that does this themselves. Suicide pacts are mutual. If the US pulls out, no one else will stay in. And that's how we're going to kill DRM, not just here, but everywhere.

[28:18] Now, I'm a science fiction writer, and people ask me if I'm optimistic or pessimistic about the future. But if there's one thing being a science fiction writer has taught me, it's that trying to predict the future is an idiot's game. Science fiction writers are like Texas marksmen: we fire a shotgun into the side of a barn, then we draw a target around the place where the pellets have hit and tell everyone what a great shot we are. We ignore all of those predictions we made that never came true. But in the wider sense, who cares what we think the future is going to be? I mean, if you're optimistic, and you think that this is just a temporary speed bump on the way to a future in which technology allows us to work together to make a better world for everyone, then you should do everything you can, every day, to make sure that comes true. And if you're pessimistic, if you think that all of this stuff is only going to get worse – that entertainment law is going to usher in an era of unparalleled surveillance and control, that we will be Huxleyed into the full Orwell – then you should get up every morning and do everything you can to stop that from happening.

[29:20] So rather than being optimistic or pessimistic, I'm going to ask you to be hopeful. Hope is surveying the landscape for a step that you can take that makes things a little better, and taking that step to see if it brings you to a vantage point from which you can see another step. Hope is why when your ship sinks in the ocean you tread water, even if you don't think that you're going to be picked up, because everyone that was ever picked up treaded water until rescuers arrived. It is the necessary but insufficient precondition for survival.

[29:51] And I'm going to suggest some hopeful things you can do. The first one is financial. Danese Cooper is an open source theorist and activist – she's at PayPal now. And Danese said, 'You know, it can be really disheartening to wake up in the morning and realize that you're spending money every day with companies whose alpha and omega is destroying the future you want to live in. It can make you feel hopeless. But what I do is, at the end of every month, I add up all the money I've given to net-neutracidal telephone companies, to hardware companies whose alpha and omega is DRM, to online services that abuse the DMCA and the Computer Fraud and Abuse Act, and I take that money and I give it to an organization that's struggling to build the better future. I try to hedge my bet.'

[30:37] And I have a place where I think you should hedge your bet. Obviously I'm partisan here with the EFF. I've worked for them for fifteen years. They don't give me money; I get my money from MIT as an activist in residence. But I've watched how they spend their money, and I've never seen an organization be more efficient. They really know how to squeeze a dollar until it hollers. But the good news is that there are lots of organizations that you can support that do this work, even if it's not EFF, and you can spread your money around. There's the Free Software Foundation and Demand Progress and Creative Commons, the Software Freedom Conservancy, the Software Freedom Law Center, and let's not forget the ACLU, which has done so much important work in this election season. They are suing the US government to invalidate parts of the Computer Fraud and Abuse Act, which is the law that Aaron Swartz was prosecuted under.

So that's the financial thing that you can do. But I've got an even harder and more ambitious project that I'm going to ask you to think about undertaking. That's to find two deep nerds. Not civilians, not people who don't understand what DRM means or what the DMCA is or how crypto works. Find two deep nerds who already understand all that stuff, and explain what I've just told you to them. Because EFF has tens of thousands of members, Slashdot has hundreds of thousands of readers, Hacker News has millions of readers, and Reddit has tens of millions of readers. We have a lot of people who are ready to understand what the hell all this stuff means, people who you don't have to give the technical education to. And we can build a movement by bringing those people along and having them explain to the people they love what they can do too. So have that conversation with two people in the next week, and then one week later call them up and ask them if they've thought about it and if they're willing to have the same conversation with two more people.

[32:19] There's a lot on the line here. We're trying to figure out whether we're going to make a future where our devices are designed to obey us, where we're allowed to warn each other about the defects lurking in those devices. And none of us are going to get to choose individually whether we get that future but together we do have a chance. And there's too much at stake not to fight with everything we have. Thank you.