How Stupid Laws and Benevolent Dictators can Ruin the Decentralized Web, too
[Jump to 5:12:46] – Or watch on the Internet Archive.
June 9, 2016
«At yesterday’s Internet Archive Decentralized Web Summit, the afternoon was given over to questions of security and policy.
I gave the opening talk, “How Stupid Laws and Benevolent Dictators can Ruin the Decentralized Web, too,” which was about “Ulysses pacts“: bargains you make with yourself, when your willpower is strong, to prevent giving in to temptation later, when you are tired or demoralized; how these have benefited the web to date; and how new, better ones can protect the decentralized web of the future.»
Official article based on talk
«He called on the audience to act now to make a Ulysses pact for the decentralized web, because everything eventually fails or falls on hard times. If we want to make sure that the principles and values we hold dear survive, we need to design the systems that embody those principles so that they can’t be compromised or weakened. In other words, we need to build things now so that five or ten or twenty years from now, when what we’ve built is successful and someone asks us to add a backdoor or insert malware or track our users, it simply won’t be possible (for either technological or legal or monetary reasons)—no matter how much outside pressure we’re under.
After all, “The reason the web is closed today is because…people just like you made compromises that seemed like the right compromise to make at the time. And then they made another compromise, a little one. And another one.” He continued, pointing out that “We are, all of us, a mix of short-sighted and long-term…We must give each other moral support. Literal support to uphold the morals of the decentralized web, by agreeing now on what an open decentralized web is.” Only by doing this will we be able to resist the siren song of re-centralization.
And what sort of principles should we agree to? Cory suggests two. First, when a computer receives conflicting instructions from its owner and from a remote party, the owner’s wishes should always take precedence. In other words, no DRM (that means you, W3C). Second, disclosing true facts about the security of systems that we rely upon should never ever be illegal. In other words, we need to work to abolish things like the DMCA, which create legal uncertainty for security researchers disclosing vulnerabilities in systems locked behind DRM. The crowd’s response to this passionate call to action? A standing ovation.» By EFF’s Jeremy Gillula and Noah Swartz.
Please help
Feel free to help out with any of this on the transcript below:
- Add sub-headings to make it easier to read
- Add timestamps at the start of each paragraph to make it easier to use
- Check for typos and errors
- Fix tagged issues
- Add explanatory links to things that not everyone might understand
- Add more tags
[00:38] Thanks for having us here, Wendy and Brewster. This is fabulous. It's like being back at the O'Reilly P2P conference back in 1999, some of the same faces, too. So, as you might imagine, I'm here to talk to you about dieting advice. If you ever want to go on a diet, the first thing you should really do is throw away all your Oreos.
[01:01] It's not that you don't want to lose weight when you raid your Oreo stash in the middle of the night. It's just that the net present value of tomorrow's weight loss is hyperbolically discounted in favor of the carbohydrate rush of tonight's Oreos. If you're serious about not eating a bag of Oreos your best bet is to not have a bag of Oreos to eat. Not because you're weak willed. Because you're a grown up. And once you become a grown up, you start to understand that there will be tired and desperate moments in your future and the most strong-willed thing you can do is use the willpower that you have now when you're strong, at your best moment, to be the best that you can be later when you're at your weakest moment.
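The hyperbolic discounting Cory describes has a standard textbook form, V = A / (1 + kD): a reward A arriving after delay D is worth less the longer you have to wait, and the falloff is steep enough that a small immediate reward can beat a much larger delayed one. A minimal sketch (the reward amounts and the discount constant k are illustrative assumptions, not figures from the talk):

```python
# Hyperbolic discounting: V = A / (1 + k*D).
# All numbers here are illustrative assumptions for the Oreos example.

def hyperbolic_value(amount: float, delay_days: float, k: float = 1.0) -> float:
    """Perceived present value of a reward `amount` arriving after `delay_days`."""
    return amount / (1 + k * delay_days)

# Tonight's Oreos: a small reward with zero delay.
oreos_now = hyperbolic_value(amount=10, delay_days=0)           # 10.0
# Tomorrow's weight loss: a larger reward, but 30 days away.
weight_loss_later = hyperbolic_value(amount=50, delay_days=30)  # ~1.6

# The larger delayed reward is discounted below the smaller immediate one,
# which is why the 2 a.m. Oreo raid wins.
assert oreos_now > weight_loss_later
```

This is why the Ulysses pact works: you cannot argue yourself out of the discounting in the moment, so you change the payoffs in advance instead.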
[01:51] And this has a name: It's called a Ulysses pact. Ulysses was going into Siren-infested waters. When you go into Siren-infested waters, you put wax in your ears so that you can't hear what the Sirens are singing, because otherwise you'll jump into the sea and drown. But Ulysses wanted to hear the Sirens. And so he came up with a compromise: He had his sailors tie him to the mast, so that when he heard the call of the Sirens, even though he would beg and gibber and ask them to untie him, so that he could jump into the sea, he would be bound to the mast and he would be able to sail through the infested waters.
This is a thing that economists talk about all the time; it's a really critical part of how you build things that work well and fail well. Now, building a Web that is decentralized is a hard thing to do, and the reason that the web periodically ceases to be decentralized is that it's very tempting to centralize things. There are lots of short-term gains to be had from centralizing things, and you want to be the best version of yourself, you want to protect your present best from your future worst.
The reason that the Web is closed today is that people just like you, the kind of people who were at the P2P conference in 1999, the kind of people who went to Doug Engelbart's demo in 1968, the kind of people who went to the first Hackers conference, people just like you, made compromises, that seemed like the right compromise to make at the time. And then they made another compromise. Little compromises, one after another.
And as humans, our sensory apparatus is really only capable of distinguishing relative differences, not absolute ones. And so when you make a little compromise, the next compromise that you make, you don't compare it to the way you were when you were fresh and idealistic. You compare it to your current, "stained" state. And a little bit more stained hardly makes any difference. One compromise after another, and before you know it, you're suing to make APIs copyrightable, or you’re signing your name to a patent on one-click purchasing, or you're filing the headers off of a GPL library and hoping no one looks too hard at your binaries. Or you're putting a backdoor in your code for the NSA.
And the thing is: I am not better than the people who made those compromises. And you are not better than the people who made those compromises. The people who made those compromises discounted the future costs of the present benefits of some course of action, because it's easy to understand present benefits and it's hard to remember future costs.
You're not weak if you eat a bag of Oreos in the middle of the night. You're not weak if you save all of your friends' mortgages by making a compromise when your business runs out of runway. You're just human, and you're experiencing that hyperbolic discounting of future costs because of that immediate reward in the here and now. If you want to make sure that you don't eat a bag of Oreos in the middle of the night, make it more expensive to eat Oreos. Make it so that you have to get dressed and find your keys and figure out where the all-night grocery store is and drive there and buy a bag of Oreos. And that's how you help yourself in the future, in that moment where you know what's coming down the road.
The answer to not getting pressure from your bosses, your stakeholders, your investors or your members, to do the wrong thing later, when times are hard, is to take options off the table right now. This is a time-honored tradition in all kinds of economic realms. Union negotiators, before they go into a tough negotiation, will say: "I will resign as your negotiator, before I give up your pension." And then they sit down across the table from the other side, and the other side says "It's pensions or nothing". And the union leaders say: "I hear what you're saying. I am not empowered to trade away the pensions. I have to quit. They have to go elect a new negotiator, because I was elected contingent on not bargaining away the pensions. The pensions are off the table."
The GPL does this. Once you write code, with the GPL it's locked open, it's irrevocably licensed for openness and no one can shut it down in the future by adding restrictive terms to the license. The reason the GPL works so well, the reason it became such a force for locking things open, is that it became indispensable. Companies that wanted to charge admission for commodity components like operating systems or file editors or compilers found themselves confronted with the reality that there's a huge difference between even a small price and no price at all, or no monetary price. Eventually it just became absurd to think that you would instantiate a hundred million virtual machines for an eleventh of a second and get a license and a royalty for each one of them.
And at that point, GPL code became the only code that people used in cloud applications in any great volume, unless they actually were the company that published the operating system that wasn't GPL'd. Communities coalesced around the idea of making free and open alternatives to these components: GNU/Linux, Open- and LibreOffice, git, and those projects benefited from a whole bunch of different motives, not always the purest ones. Sometimes it was programmers who really believed ethically in the project and funded their own work (Zooko talked about that a little yesterday), sometimes talent was tight and companies wanted to attract programmers, and the way that they got them to come through the door is by saying: "We'll give you some of your time to work on an ethical project and contribute code to it."
Sometimes companies got tactical benefits by zeroing out the margins on their biggest competitor's major revenue stream. So if you want to fight with Microsoft, just make Office free. And sometimes companies wanted to use but not sell commodity components. Maybe you want to run a cloud service but you don't want to be in the operating system business, so you put a bunch of programmers on making Linux better for your business, without ever caring about getting money from the operating system. Instead you get it from the people who hire you to run their cloud.
Every one of those entities, regardless of how they got into this situation of contributing to open projects, eventually faced hard times, because hard times are a fact of life. And systems that work well but fail badly are doomed to die in flames. The GPL is designed to fail well. It makes it impossible to hyperbolically discount the future costs of doing the wrong thing to gain an immediate benefit. When your investor or your acquisition suitor or your boss says "Screw your ethics, hippie, we need to make payroll", you can just pull out the GPL and say: "Do you have any idea how badly we will be destroyed if we violate copyright law by violating the GPL?"
It's why Microsoft was right to be freaked out about the GPL during the Free and Open Source wars. Microsoft's coders were nerds like us: they fell in love with computers first and became Microsoft employees second. They had benefited from freedom and openness, they had cat'ed out BASIC programs, they had viewed sources, and they had an instinct towards openness. Combine that with the expedience of being able to use FLOSS (not having to call a lawyer before you could be an engineer), and with the rational calculus that if they made FLOSS, then when they eventually left Microsoft they could keep using the code they had made there, and it meant that Microsoft's coders and Microsoft were working toward different goals. And the way they expressed that was in how they used and licensed their code.
This works so well that for a long time, nobody even knew if the GPL was enforceable, because nobody wanted to take the risk of suing and setting a bad precedent. It took years and years for us to find out in which jurisdictions we could enforce the GPL.
That brings me to another kind of computer regulation, something that has been bubbling along under the surface for a long time, at least since the Open Source wars, and that's the use of Digital Rights Management (DRM) or Digital Restrictions Management, as some people call it. This is the technology that tries to control how you use your computer. The idea is that you have software on the computer that the user can't override. If there is remote policy set on that computer that the user objects to, the computer rejects the user's instruction in favor of the remote policy. It doesn't work very well. It's very hard to stop people who are sitting in front of a computer from figuring out how it works and changing how it works. We don't keep safes in bank robbers' living rooms, not even really good ones.
But we have a law that protects it, the Digital Millennium Copyright Act (DMCA). It's been around since 1998 and it has lots of global equivalents, like Article 6 of the EUCD in Europe, implemented all across the EU member states. In New Zealand they tried to pass a version of the DMCA and there were uprisings and protests in the streets; they actually had to take the law off the books because it was so unpopular. And then the Christchurch earthquake hit, and a member of parliament reintroduced it as a rider to the emergency relief bill to dig people out of the rubble. In Canada it's Bill C-11 from 2011. And what these laws do is make it a felony to tamper with those locks, a felony punishable by a $500,000 fine and five years in prison for a first offense. They make it a felony to do security auditing of those locks and publish information about the flaws that are present in them or their systems.
This started off as a way to make sure that people who bought DVDs in India didn't ship them to America. But it is a bad idea whose time has come. It has metastasized into every corner of our world. Because if you put just enough DRM around a product that you can invoke the law, then you can use other code, sitting behind the DRM, to control how the user uses that product, to extract more money. GM uses it to make sure that you can't get diagnostics out of the car without getting a tool that they license to you, and that license comes with a term that says you have to buy parts from GM, and so all repair shops for GM that can access your diagnostic information have to buy their parts from GM and pay monopoly rents.
We see it in insulin pumps, we see it in thermostats, and we see it in the "Internet of Things rectal thermometer" that debuted at CES this year, which means we now have DRM-restricted works in our asses. And it's come to the web. It's been lurking in the corners of the web for a long time, but now it's being standardized at the World Wide Web Consortium (W3C) as something called Encrypted Media Extensions (EME). The idea of EME is that there is conduct that users want to engage in that no legislature in the world has banned, like PVR'ing their Netflix videos, but there are companies that would prefer that conduct not be allowed. By wrapping the video with just enough DRM to invoke the DMCA, you can convert your commercial preference not to have PVRs (which are no more and no less legal than the VCR was when the Supreme Court said in 1984 that you can record video off your TV) into something with the force of law, whose enforcement you can outsource to national governments.
What that means is that if you want to do interoperability without permission, if you want to do adversarial interoperability, if you want to add a feature that the manufacturer or the value chain doesn't want, if you want to encapsulate Gopher inside of the Web to launch a web browser with content from the first day, if you want to add an abstraction layer that lets you interoperate between two different video products so that you can shop between them and find out which one has the better deal, that conduct, which has never been banned by a legislature, becomes radioactively illegal.
It also means that if you want to implement something that users can modify, you will find yourself at the sharp end of the law, because user modifiability of the core components of the system is antithetical to its goal of controlling user conduct. If there's a bit you can toggle that says "Turn DRM off now", then as soon as you flip that bit, the entire system ceases to work. But the worst part of all is that it makes browsers into no-go zones for security disclosures about vulnerabilities in the browser, because if you know about a vulnerability you could use it to weaken EME. But you could also use it to attack the user in other ways.
Adding DRM to browsers, standardizing DRM as an open standards organization, that's a compromise. It's a little compromise, because after all there's already DRM in the world, and it's a compromise that's rational if you believe that DRM is inevitable. If you think that the choice is between DRM that's fragmented or DRM that we get a say in, that we get to nudge into a better position, then it's the right decision to make. You get to stick around and do something to make it less screwed up later, as opposed to being self-marginalized by refusing to participate at all.
But if DRM is inevitable, and I refuse to believe that it is, it's because individually, all across the world, people who started out with the best of intentions made a million tiny compromises that took us to the point where DRM became inevitable, where the computers that are woven into our lives, with increasing intimacy and urgency, are designed to control us instead of being controlled by us. And the reasons those compromises were made is because each one of us thought that we were alone and that no one would have our back, that if we refuse to make the compromise, the next person down the road would, and that eventually, this would end up being implemented, so why not be the one who makes the compromise now.
They were good people, those who made those compromises. They were people who were no worse than you and probably better than me. They were acting unselfishly. They were trying to preserve the jobs and livelihoods and projects of people that they cared about. People who believed that others would not back their play, that doing the right thing would be self-limiting. When we're alone, and when we believe we're alone, we're weak.
It's not unusual to abuse standards bodies to attain some commercial goal. The normal practice is to get standards bodies to incorporate your patents into a standard, to ensure that if someone implements your standard, you get a nickel every time it ships. And that's a great way to make rent off of something that becomes very popular. But the W3C could not be arm-twisted into adding patents to its standards. That's because the W3C has the very best patent policy of any standards body in the world. When you come to the W3C to make a standard for the web, you promise not to use your patents against people who implement that standard. And the W3C was able to make that policy at a moment in which it was ascendant, in which people were clamoring to join it, in which it was the first moments of the Web and in which they were fresh.
The night they went on a diet, they were able to throw away all the Oreos in the house. They were where you are now, starting a project that people around the world were getting excited about, that was showing up on the front page of the New York Times. Now that policy has become the ironclad signifier of the W3C. What's the W3C? It's the open standards body that's so open, that you don't get to assert patents if you join it. And it remains intact.
How will we keep the DMCA from colonizing the Locked Open Web? How will we keep DRM from affecting all of us? By promising to have each others' backs. By promising that by participating in the Open Web, we take the DMCA off the table. We take silencing security researchers off the table, we take blocking new entrants to the market off the table, now, when we are fresh, when we are insurgent, before we have turned from the pirates that we started out as into the admirals that some of us will become. We take that option off the table.
The EFF has proposed a version of this at the W3C and at other bodies, where we say: To be a member, you have to promise not to use the DMCA to aggress against those, who report security vulnerabilities in W3C standards, and people who make interoperable implementations of W3C standards. We've also proposed that to the FDA, as a condition of getting approval for medical implants, we've asked them to make companies promise in a binding way never to use the DMCA to aggress against security researchers. We've taken it to the FCC, and we're taking it elsewhere. If you want to sign an open letter to the W3C endorsing this, email me: firstname.lastname@example.org
But we can go further than that, because Ulysses pacts are fantastically useful tools for locking stuff open. It's not just the paper that you sign when you start your job, the one that takes a little bit of money out of your bank account every month for your 401(k), although that works, too. The U.S. Constitution is a Ulysses pact. It understands that lawmakers will be corrupted, and it establishes a principled basis for repealing the laws that are inconsistent with the founding principles, as well as a process for revising those principles as need be.
A society of laws is a lot harder to make work than a society of code or a society of people. If all you need to do is find someone who's smart and kind and ask them to make all your decisions for you, you will spend a lot less time in meetings and a lot more time writing code. You won't have to wrangle and flame or talk to lawyers. But it fails badly. We are all of us a mix of short-sighted and long-term, depending on the moment, our optimism, our urgency, our blood-sugar levels...
We must give each other moral support. Literal moral support, to uphold the morals of the Decentralized Web, by agreeing now on what an open internet is and locking it open. When we do that, if we create binding agreements to take certain kinds of conduct off the table for anything that interoperates with or is part of what we're building today, then our wise leaders tomorrow will never be pressurized to make those compromises, because if the compromise can't be made, there is no point in leaning on them to make it.
We must set agreements and principles that allow us to resist the song of the Sirens in the future moments of desperation. And I want to propose two key principles, as foundational as life, liberty, and the pursuit of happiness or the First Amendment:
1) When a computer receives conflicting instructions from its owner and from a remote party, the owner always wins.
Systems should always be designed so that their owners can override remote instructions and should never be designed so that remote instructions can be executed if the owner objects to them. Once you create the capacity for remote parties to override the owners of computers, you set the stage for terrible things to come. Any time there is a power imbalance, expect the landlord, the teacher, the parent of the queer kid to enforce that power imbalance to allow them to remotely control the device that the person they have power over uses.
You will create security risks, because as soon as you have a mechanism, hidden from the user, for running code on the user's computer, anyone who hijacks that mechanism, whether by presenting a secret warrant or by exploiting a vulnerability in the system, will be running in a privileged mode that is designed not to be interdicted by the user.
If you want to make sure that people show up at the door of the Distributed Web asking for backdoors, to the end of time, just build in an update mechanism that the user can't stop. If you want to stop those backdoor requests from coming in, build in binary transparency, so that any time an update ships to one user that's materially different from the other ones, everybody gets notified and your business never sells another product. Your board of directors will never pressurize you to go along with the NSA or the Chinese secret police to add a backdoor, if doing so will immediately shut down your business.
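The binary-transparency mechanism described above can be sketched in a few lines. The idea is that every client hashes the update it receives and compares that hash against a shared, append-only public log; if one user is served a binary that differs from what everyone else got, the mismatch is detectable. This is a minimal illustration under assumed names (`digest`, `verify_update`, and the log structure are hypothetical, not any particular system's API):

```python
# Minimal sketch of a binary-transparency check: a client accepts an update
# only if its hash matches the digest publicly logged for that version.
import hashlib

def digest(update_bytes: bytes) -> str:
    """SHA-256 hex digest of an update binary."""
    return hashlib.sha256(update_bytes).hexdigest()

def verify_update(update_bytes: bytes, transparency_log: dict, version: str) -> bool:
    """Return True only if our copy matches the publicly logged digest."""
    logged = transparency_log.get(version)
    return logged is not None and logged == digest(update_bytes)

# The vendor publishes v1.2's digest to a log every client can read.
official = b"official update v1.2"
log = {"v1.2": digest(official)}

# Everyone who got the same bits verifies successfully; a targeted,
# tampered build fails the check and raises the alarm.
assert verify_update(official, log, "v1.2")
assert not verify_update(b"backdoored build", log, "v1.2")
```

Real systems (append-only Merkle logs, gossip between clients) add machinery to stop the log itself from lying, but the deterrent is the same: a targeted backdoor cannot ship without becoming visible.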
Throw away the Oreos now.
Let's also talk about the Computer Fraud and Abuse Act. This is the act that says if you exceed your authorization on someone else's computer, where that authorization can be defined as simply the terms of service that you click through on your way into using a common service, you commit a felony and can go to jail. Let's throw that away, because it's being used routinely to shut down people who discover security vulnerabilities in systems.
2) Disclosing true facts about the security of systems that we rely upon should never, ever be illegal.
We can have normative ways and persuasive ways of stopping people from disclosing recklessly, we can pay them bug bounties, we can have codes of conduct. But we must never, ever give corporations or the state the legal power to silence people who know true things about the systems we entrust our lives, safety, and privacy to.
These are the foundational principles: computers obey their owners, and true facts about risks to users are always legal to talk about. And I charge you to be hardliners on these principles, to be called fanatics. If they are not calling you puritans for these principles, you are not pushing hard enough. If you computerize the world and you don't safeguard the users of computers from coercive control, history will not remember you as the heroes of progress, but as the blind handmaidens of future tyranny.
This internet, this distributed internet that we are building, the Redecentralization of the Internet, if it ever succeeds, will someday fail, because everything fails, because overwhelmingly, things are impermanent. What it gives rise to next, is a function of what we make today. There's a parable about this:
The state of Roman metallurgy in the era of chariots determined the wheelbase of a Roman chariot, which determined the width of the Roman road, which determined the width of the contemporary road, because contemporary roads were built atop the ruins of Roman roads, which determined the wheelbase of cars, which determined the widest size you could have for a container that can move from a ship to a truck to a train, which determined the size of a train car, which determined the maximum size of the Space Shuttle's disposable rockets.
Roman metallurgy prefigured the size of the Space Shuttle's rockets.
This is not entirely true, there are historians who will explain the glosses in which it's not true. But it is a parable about what happens when empires fall. Empires always fall. If you build a glorious empire, a shining city on a hill, as a former governor of California liked to talk about, if you build a good empire, an empire we can all be proud to live in, it will someday fall. You cannot lock it open forever. The best you can hope for is to wedge it open until it falls, and to leave behind the materials, the infrastructure that the people who reboot the civilization that comes after ours will use to make a better world.
[27:50] A legacy of technology, norms and skills that embrace fairness, freedom, openness and transparency, is a commitment to care about your shared destiny with every person alive today and all the people who will live in the future.
Questions and Answers section
[28:17] Thank you.
[28:20] It's very nice of you, but I want to get some questions in and we have two and a half minutes. Can we start with a question from a woman, because it's always a sausage fest? We'll alternate if we have time. Long rambling statements followed by "what do you think of that?" are questions, but not good ones. Denise, please.
[28:45 - Denise:] So when you talked about the long-term effect of the choices we make now, I kept thinking about what happens when things are sold to something else. I work for Sun and we actually did kind of [DU - 29:03] and then we got sold to the most evil place we could possibly be sold to, and they just undermined everything. So how do we keep that from happening? I mean, there were valiant efforts: people went to the E.C.[? - 29:18] and said this is going to screw up MySQL, and they came up with an accommodation and another accommodation and another accommodation. How can we stop that stuff?
[29:27 - Cory:] So there are two levels at which you bind your conduct: the first is in the specific, and the second is at the level of principle. So in the specific, you might say a condition of using some library, participating in some consortium, or contributing to something is to irrevocably promise never to use your patents against people who use that thing. That's a really good start, and to the extent that you're in a firm that's joining, say, the Twitter open patent pool, that's great news. I'm glad for you to be doing that, and they're doing that.
[30:01 - Cory:] But the next piece is the harder piece, which is coming up with a principle that says "Any step that restricts the thing that using a patent would restrict is also off the table." And defining that is really hard, because people will argue about whether or not, for example, exercising DMCA rights to shut down interoperable implementations is the same as using patent rights to shut down interoperable implementations. I suspect if you're making the interoperable implementation, it doesn't matter which law that only your general counsel understands is being used to shut you down. What matters is that you're being shut down. But it invites endless wrangling about whether two things are equivalent.
[30:51 - Cory:] But if you look at the way that the Constitution is structured, and the way the laws that depend from it are structured: it starts with principles, then implements those principles in laws, and then those laws are referred back to the principles to see whether they match. So when the NSA prohibited the publication of strong crypto, we represented Daniel J. Bernstein in the Ninth Circuit, where we argued that the First Amendment protected his right to publish source code. And though the founders had never heard of source code, we were able to use the specific principle to overturn the law. And we have also been able to go in the opposite direction, where laws were enacted, like laws that protect equal access under the law, anti-discrimination laws, because they are consistent with those principles.
[31:43 - Cory:] So you have to do both: you need specific conduct that is taken off the table and you need principles to refer to for future conduct that will be taken off the table, and you need an amendment process to add that future conduct to it as circumstances change.
[31:57 - Cory:] David.
[32:01 - David:] I've been extremely interested in the DMCA, so since you know more about it than anybody I have a question that might be useful to other people. It seems that the DMCA is structured so that one person, the Librarian of Congress, who is appointed for life, is the only person who can decide when the DMCA does not apply. And I'm curious as to the structure of getting DMCA not to apply rather than removing it entirely.
[32:33 - Cory:] Right, how do we fix it structurally instead of onesies-twosies?
[32:36 - David:] In other words can we can we fix it by just getting someone other than the Librarian of Congress to be the…
[32:43 - Cory:] So, oh my god, do I wish the Librarian of Congress had the power to take the DMCA off the table. It's actually significantly worse than that. What the Librarian of Congress has the authority to do is, every three years, hear petitions for uses that you're allowed to make, but not to grant you the right to make the tool to make the use. So the Librarian of Congress just said that security researchers are allowed to break DRM to do security research, but they're not allowed to make tools, traffic in tools, or tell anyone how their tools work in order to do that security research, because that would break the DMCA.
[33:15] In Norway, their version of this, because remember this is global, and the fact that the copyright office took some stuff off the table doesn't help the Norwegians or anyone else. The U.S. Trade Representative is like Patient Zero in an epidemic of shitty Internet law and everyone who trades with the Internet is stuck in this horrible regime. And in Norway they said blind people are allowed to break the DRM on e-books. But no one is allowed to make a tool to help a blind person take the DRM off e-books, and you are not allowed to share a tool if you are blind and you make a tool to take the DRM off e-books – you're not allowed to share it with other blind people, right?
[33:50] So, like, holy crap is it much worse than you thought it was. Because those exemptions create facts and evidence about the harms of the DMCA, but they don't solve those harms, because the law still makes the tools, and trafficking in them, illegal.
[34:09] The way we are going to change this in legislatures is iteratively, by picking it off around the edges and using that to influence debates in the middle, and then using that debate to pick it off more around the edges and then going back into the middle. So we'll get the FCC and the FDA to take the DMCA off the table for medical implants and set-top boxes. And then we're going to go to Congress when they are, as they are right now, debating reform to Section 1201 of the DMCA, and we'll say "look at how stupid this law is, the FDA and the FCC just took it off the table," and Congress will table reforms to it that won't get passed, and then we'll use the fact that there were reforms tabled in Congress that had significant support but didn't pass to go to three more organizations and get it taken off the table. And then we'll use that to make Congress hold more hearings, right? And we're going to go back and forth, and while we're doing that, we have some pending litigation we're going to announce very soon now that will challenge the constitutionality of the DMCA altogether. And every single one of those pieces is going to go together.
[35:17] So it looks like I'm out of time. Thank you all very much.
[35:20] I'd be remiss if I failed to remind you that EFF is a member-supported nonprofit. I'd also like to remind you that if you'd like to sign onto a letter endorsing these principles for web standards, please send me an e-mail: email@example.com. Thank you.