1984 in the Making: Stealthy Invasion of Consumer Rights and Privacy by ICT Corporations
Published in 2007 at the annual conference of the Taiwan Academy for Information Society (台灣資訊社會研究學會年會暨論文研討會)
Abstract
For whom does your computer work? Against whom does your computer guard its content and services? "For me, and against unintended intruders, of course!" Exactly, except that the answers are attached to the wrong questions. Digital Rights Management guards the content that a computer holds against the computer's owner, whereas Trusted Computing works for an ICT mega-corporation and is designed to guard the ordinary services of the operating system against, again, the computer's owner whenever a competitor's program is brought into execution. The universities are largely silent about this invasion of consumer rights and privacy, while the mainstream media are in addition conspicuously supportive of these technologies. Combined with free-speech-challenging laws such as the anti-circumvention provision of the DMCA, these technologies are taking society into the Big-Brother world of George Orwell's "1984".
Keywords: DRM, TC, DMCA, privacy, human rights
Introduction
In George Orwell's famous novel "1984", society is under constant surveillance by Big Brother through omnipresent telescreens: devices that cannot be turned off, that force the government's agenda into individuals' eyes and ears, and that pick up their images and conversations wherever they go in the city. Such invasion of privacy and consumer rights is stealthily becoming reality under the camouflage of copyright protection and virus rejection, in the name of guarding the interests of consumers. Social movement workers in the free culture communities have cried out loud on the internet, but the universities and the media have largely not heeded the warning. The general public thus remains uninformed of the looming threats.
Specifically, Digital Rights Management (DRM) and Trusted Computing (TC) were initially designed to restrict unauthorized copying of digital materials, but are beginning to be used by the ICT mega-corporations to facilitate surveillance of, and even restrictions over, consumers. These designs are technically flawed because they work against the unstoppable free flow of information; they are as likely to fail in the face of nature's forces as a technological attempt to ensure that a profitable crop can be grown only by citizens holding the passport of a certain nation. Ethical controversies over their aims aside, such attempts cannot succeed without heavy legal intervention. Yet laws produced under such circumstances are likely to create inconveniences for citizens, or even criminalize otherwise perfectly ordinary activities. In the case of copy-fighting, it is freedom of expression that is first and most directly under attack, by the anti-circumvention provision of the Digital Millennium Copyright Act (DMCA), a US law that was proposed by the ICT mega-corporations and has caused many harms since 1998.
There has been little or no high-profile public discussion or debate surrounding these issues outside of the internet, despite their potentially far-reaching damage to society. It is important that these issues be brought to the attention of the general public in clearly understandable ways. Most importantly, they need to be cast in the light of human rights considerations instead of being hidden behind a veil of technical jargon that prevents non-computer-scientists from understanding and joining the discussions.
Digital Rights Management
DRM [1] is not one single piece of technology. In its most common form, it is a mechanism that involves cooperation between two application programs. Suppose that Steve creates a piece of digital document DD using an authoring application X. DD can be a text file, a picture, a sound clip, or a movie. Somehow DD ends up on Melinda's computer, and she fires up an application Y to display or play DD. The goal is startlingly naive: Y is to obey whatever restrictions X has placed on the way in which DD may be used, for example how many times DD can be played, into which other formats DD can be converted, and so on. If X dictates that DD cannot be printed, for example, then Y is supposed to follow X's order (which might or might not have been given by Steve), likely against the will of Melinda, Y's owner. If Melinda's copy of DD was illegally obtained, then Y is supposed to refuse to play DD at all.
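The arrangement just described can be caricatured in a few lines of code. This is an illustrative sketch only; the names (UsageRules, ObedientPlayer) are invented here and do not come from any real DRM system.

```python
from dataclasses import dataclass

@dataclass
class UsageRules:
    """Restrictions that the authoring program X attaches to DD."""
    max_plays: int        # how many times DD may be played
    printable: bool       # whether DD may be printed

@dataclass
class Document:
    content: str
    rules: UsageRules
    plays_so_far: int = 0

class ObedientPlayer:
    """The Y program: it obeys X's rules, not its owner's wishes."""

    def play(self, dd: Document) -> str:
        if dd.plays_so_far >= dd.rules.max_plays:
            return "refused: play limit reached"
        dd.plays_so_far += 1
        return dd.content

    def print_(self, dd: Document) -> str:
        # Melinda may want to print; Y answers to X, not to Melinda.
        if not dd.rules.printable:
            return "refused: printing forbidden"
        return dd.content

dd = Document("a poem", UsageRules(max_plays=1, printable=False))
player = ObedientPlayer()
print(player.play(dd))    # the first play succeeds
print(player.play(dd))    # refused: play limit reached
print(player.print_(dd))  # refused: printing forbidden
```

Note that nothing in this sketch consults Melinda: every refusal is decided entirely by the rules X attached to DD.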
Let's pause for a moment before further discussing the mechanism, and think about it in terms of consumer rights and justice. Is there any precedent, in any industry, of a product designed to work purposefully against the will of the consumer, as Y is supposed to do in the DRM case? Is it ever justifiable that a product be designed this way in order to prevent its possible illegal uses? Think about alcohol, which might be misused before taking the driver's seat. Think about a knife, which might be directly misused to injure or even kill. Some people might argue that such a design would be worth it if it were technically feasible. Others are quick to come up with scenarios that perfectly justify such "misuses" of these products or tools. Such a design would raise controversy even in life-and-death situations. What about water, then, which might be used illegally to grow marijuana? Is it therefore justified that a bottle of water be designed to work against the will of the buyer? It is important to emphasize at this point that there is no way for the conditions and restrictions to be stated perfectly enough to cover every imaginable reason why it might be not only legal but also morally imperative that this bottle of water be used to grow marijuana. The only reasonable way of doing things, for centuries and in most major civilizations, has been to let Melinda, the owner of the tool, decide for herself and be held responsible for the consequences, instead of letting another person (plus some hidden entities such as the author of software X) decide for Melinda by instructing Melinda's own tool to work against her will.
Back to the mechanism. There are many programmers in the world capable of creating Y or its equivalent, and not all of them will agree to abide by the dictatorship of X. In fact it is reasonable to conjecture that the more conscientious ones will refuse to do so, but conscience seems subjective and irrelevant in academic discussions, so let's not digress. Therefore every DRM technology has some "encryption" mechanism that locks DD in a "secret", mangled file format. The Y program must present a set of keys before being allowed to unlock (unmangle) DD, and this set of keys is not to be easily obtained by just any programmer. Note that we are not talking about keys that Melinda creates and distributes to her intended audience; that way of protecting documents is independent of the DRM mechanism and is irrelevant to our discussion. The fact that a DRM'ed file cannot possibly be stored in a transparent and open file format should be reason enough for big organizations and governments, and especially libraries, to reject its use. [2] There is no way to justify this way of storing archives that are supposed to last for decades or even centuries.
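The "lock" just described can be sketched as follows. This is a toy model only: the XOR "cipher" and the key value are stand-ins invented for illustration and bear no resemblance to any real DRM scheme, which would use serious cryptography.

```python
from typing import Optional

SECRET_KEY = b"club-members-only"   # shared among the privileged Y producers

def mangle(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for DRM encryption: XOR with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

unmangle = mangle  # XOR is its own inverse

# DD as stored on Melinda's disk: a "secret", mangled file format.
locked_dd = mangle(b"the actual document", SECRET_KEY)

def play(locked: bytes, presented_key: bytes) -> Optional[bytes]:
    """A club-member Y program: it unlocks DD only with the shared key."""
    if presented_key != SECRET_KEY:
        return None               # non-club Y programs are locked out
    return unmangle(locked, presented_key)

assert play(locked_dd, b"wrong key") is None
assert play(locked_dd, SECRET_KEY) == b"the actual document"
```

The whole scheme rests on SECRET_KEY staying secret while being shared by every company in the club, which is exactly the assumption examined next.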
But again let's not digress. The issue at hand now is not how DRM harms its users, but three of its unrealistic assumptions about the world.
Firstly, the design of DRM is based on the premise (A) that the set of keys to be presented by Y before it is allowed to unlock DD can be safely and secretly shared by all companies producing Y programs that agree to obey the dictatorship of X. Yes, there are complicated ways to distribute certain keys only to certain Y-producing companies, so that when certain keys are compromised (revealed to the world at large) they can be disabled, and thereafter only some of the competing Y programs become inoperable. [3] But the technical details are irrelevant. The point is this: the scheme requires cooperation, and yet it also requires keeping secrets within a large number of people, or keeping secrets between co-workers, or both. Consider some company that creates the Y software for sale. It receives its keys from some central controlling agency of the DRM technology, conceivably an organization with members from both the media industry and the ICT industry. It has to restrict knowledge of the keys to a minimum number of employees lest its keys be revoked by the controlling agency, and yet the Y programmers have to be able to test and debug the program. Conceivably some computer scientists may eventually come up with some way of minimizing the risk of key leakage while still allowing the Y programmers to do their job, and the company must post rules that prevent co-workers from revealing certain information to each other lest they figure out the keys by piecing their information together. Such a design, together with the rules, also has to take into account the facts that there are different versions of the Y product, that employees come and go, and that employees may engage in reverse-engineering activities for totally legitimate reasons, to be described in point (C). Technical difficulty aside, all this fosters an atmosphere of distrust; the product manager in particular has to be very wary of his subordinate Y programmers.
And consider the problem facing the central controlling agency: how can a set of keys be "secretly" shared by many Y-producing companies worldwide, unless writing Y programs becomes the privilege of a small club whose sharing of keys is closely monitored by the agency?
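The revocation arrangement sketched in [3] can be caricatured as follows. Everything here is hypothetical (the company names and key values are invented); the point is only the structural consequence: revoking a leaked key punishes that company's entire customer base at a stroke.

```python
# Each Y-producing company in the club receives its own key set from
# the central controlling agency (here simplified to one key each).
company_keys = {
    "PlayerCorpA": "key-A",
    "PlayerCorpB": "key-B",
    "PlayerCorpC": "key-C",
}
revoked = set()

def leak_detected(leaked_key: str) -> None:
    """The agency sees a key on the internet and revokes it."""
    for company, key in company_keys.items():
        if key == leaked_key:
            revoked.add(key)

def player_may_unlock(company: str) -> bool:
    """Whether a given company's Y program can still unlock DD."""
    return company_keys[company] not in revoked

leak_detected("key-B")   # key-B shows up on the internet one day

assert player_may_unlock("PlayerCorpA") is True
# Every customer who bought PlayerCorpB's product is now locked out,
# regardless of who actually leaked the key:
assert player_may_unlock("PlayerCorpB") is False
```

Note that the mechanism cannot distinguish a rogue employee from an honest customer: revocation operates on the company's key, so the innocent and the guilty are disabled together.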
The second unrealistic assumption (B) has to do with the non-club members: the conscientious programmers, or whichever competent programmers for some reason cannot be admitted into the privileged Y-programming club. DRM implicitly assumes that no one can possibly write (and publish on the internet) an equivalent of Y that bypasses the orders from X and hence invalidates its dictatorship, by reverse engineering, that is, by painstakingly studying and figuring out the behaviors of existing X's and Y's without being given the keys. Again, one is justifiably curious as to which successful engineering efforts in history have been founded on such a dangerously unrealistic presumption.
Finally, the third assumption (C) is that no one in the world is capable of figuring out any one of the "secret" keys and publishing it on the internet, so that the non-club Y programmers can plug it into their programs and have them work just like the Y programs produced by the club members. Like (B), this takes the ability to do reverse engineering; unlike (B), it typically does not require programming ability.
All three of (A), (B), and (C) must hold for DRM to be workable at all, and that is before worrying about how its misfeatures would drive away its potential users. DRM is touted as being able to protect Steve's intellectual property in our story, while the big price to pay (in document longevity) is omitted from its propaganda. It inevitably builds an atmosphere of distrust among co-workers. There is also the question, raised by Lawrence Lessig, of free culture being killed by DRM. [4] But these important questions aside, how can Steve trust that DRM is really effective at protecting his document at all, once he learns of these three unrealistic assumptions that are absent from the DRM advocates' propaganda?
Trusted Computing
The proposed solution to (B) is Trusted Computing. [5] In simple terms, it is the combination of an operating system and a hardware architecture that guards the user against the execution of "unauthorized" application programs. The hidden question is: "unauthorized" by whom? Again, the technical details are irrelevant. The key point is that the operating system, with the help of the hardware, will decide for Melinda which applications are allowed to execute and which are not. Melinda will participate in the process of authorizing application programs, which is especially appealing to her considering the large number of malicious Trojan horses and viruses on Windows. That much is advertised by the TC propaganda: you can finally trust your computer not to run inadvertently triggered programs, because each execution will be checked against a huge malware database, dutifully maintained by the OS manufacturer.
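The gatekeeping just described can be sketched in miniature. This is a hypothetical model, not any real TC implementation: real systems involve hardware attestation, but the essential logic is a check of each program's fingerprint against lists that the owner does not control.

```python
import hashlib

# The blacklist is maintained remotely by the OS manufacturer,
# not by the owner of the machine.
manufacturer_blacklist = set()

def fingerprint(binary: bytes) -> str:
    """Identify a program by the SHA-256 hash of its binary."""
    return hashlib.sha256(binary).hexdigest()

def os_allows_execution(binary: bytes) -> bool:
    """The TC-style gate: run only what the manufacturer permits."""
    return fingerprint(binary) not in manufacturer_blacklist

malware = b"some virus payload"
non_club_player = b"open source Y-equivalent program"

manufacturer_blacklist.add(fingerprint(malware))           # what is advertised
manufacturer_blacklist.add(fingerprint(non_club_player))   # what is not

assert os_allows_execution(b"ordinary word processor") is True
assert os_allows_execution(malware) is False
# The same switch that blocks malware blocks Melinda's chosen player:
assert os_allows_execution(non_club_player) is False
```

The design choice to notice is that the two final refusals are indistinguishable to the mechanism: blocking viruses and blocking a competitor's lawful program are the same operation, differing only in who adds the entry.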
The problem is that Melinda is not the only decision maker. In fact she does not have the final authority. The database maintainer, or equivalently the operating system manufacturer, gets the final say on which programs are unauthorized. In our DRM example, the Y-equivalent programs written by non-club members are definitely on the blacklist whether Melinda agrees or not. "This has to be enforced for good reasons", one would say if one supported DRM; see assumption (B) for how its "protection against piracy" would otherwise fall apart. "Sure, every user should sacrifice a small piece of freedom for the sake of the globally smooth operation of DRM. That's fair", some would say.
Except that the protection is still imperfect even if Steve were willing to sacrifice his own "small" piece of freedom in exchange for control over Melinda. What if some users refuse altogether to use TC-enabled operating systems?
Say Larry chooses to use the GNU/Linux operating system, which does not decide for users which programs can be executed, leaving the decision and the responsibility to the users. Larry also uses open source Y-equivalent programs written by non-club members. Now if either (A) or (C) fails to hold, Larry can completely disregard the constraints that X places on DD: he can play it an unlimited number of times or convert it to an open file format better suited for long-term archiving. Melinda may not use GNU/Linux herself, but she could ask Larry to convert DD into an open file format for her.
So what does the word "trust" describe in this system? Do Steve and Melinda trust their computers, the OS manufacturer, or the DRM controlling agency to enforce DRM? This use of "trust" is at best unconvincing for Steve, considering the alternative-operating-system scenario. It is completely ironic for Melinda, whose rights are invaded and whose role Steve may play in other circumstances. In fact, Steve will surely play Melinda's role in far more circumstances than those in which he plays the author of DD, even if Steve is a very productive writer, painter, composer, singer, player, and film clip producer and editor all at the same time. For Steve is inevitably far more a consumer of digital documents than a producer of them, even if we assume that he holds the copyright to his entire, incredibly large collection of creative works and hence somehow has some influence in the (politically, not just technically) complex DRM ecosystem.
Do the OS manufacturer and the DRM controlling agency trust Steve or Melinda? No; the mechanism is designed specifically because they do not trust Steve and Melinda to abide by copyright law.
Well, there is one place where the word "trust" fits. The OS manufacturer can now trust Steve's and Melinda's computers to work for it, spying on and regulating the copy-related activities of Steve and Melinda. That is the true meaning of Trusted Computing.
Up to now we have been assuming that the OS manufacturer maintains the moral high ground and refrains from instructing its worldwide troop of Trusted Computers to engage in further privacy- and human-rights-invading activities only marginally or remotely related to the great cause of policing copyright infringement. If this assumption is false, which is very likely considering the complete lack of any correcting power of the public over the dominant OS manufacturer in our present society, then the word "trust" becomes more synonymous with "betrayal".
An Aside: The Rise of the Big Brother
Most interestingly and intriguingly, do the DRM controlling agency and the OS manufacturer trust each other?
The Advanced Access Content System (AACS) [6] in a way combines the concepts of DRM and Trusted Computing to limit copying and other operations on the next-generation storage media, HD DVD and Blu-ray Discs. In its complicated system of restrictions management [7], there are three revocation lists: the HRL, DRL, and CRL, for revoking compromised hosts (the equivalent of the Y program), drives (the equivalent of the OS plus hardware), and contents (the equivalent of DD), respectively. Theoretically, a technique called traitor tracing [7] (no kidding, it's an academic term) is used to identify which hosts are compromised. Yet these lists could readily be used by the Y-producing companies, the OS manufacturer, and the media industry to hurt each other, should they for some reason decide to add entries blacklisting rival companies and/or industries without actually obtaining evidence against the victim software, device, or media.
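The mechanical role of the three lists can be sketched as follows. This is an illustrative toy only: real AACS revocation works through subset-difference key trees rather than simple ID lookups, and all identifiers below are invented.

```python
# Toy versions of the three AACS revocation lists described above.
HRL = {"compromised-host-42"}   # revoked hosts (player software)
DRL = {"leaky-drive-7"}         # revoked drives
CRL = {"pirated-disc-99"}       # revoked content

def playback_allowed(host_id: str, drive_id: str, content_id: str) -> bool:
    """Playback proceeds only if no party involved has been revoked."""
    return (host_id not in HRL
            and drive_id not in DRL
            and content_id not in CRL)

assert playback_allowed("host-1", "drive-1", "disc-1") is True
assert playback_allowed("compromised-host-42", "drive-1", "disc-1") is False

# Nothing in the mechanism itself verifies that an entry was added in
# good faith; an entry targeting a rival is processed identically:
HRL.add("rival-host")
assert playback_allowed("rival-host", "drive-1", "disc-1") is False
```

The last three lines are the point of the paragraph above: the list is the judgment, and whoever writes to the list holds the power, with or without evidence.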
Do these industries have motivation to do this? Quite a lot. Are they actually doing anything against each other yet? My guess is no, because the mechanism is not yet well accepted and because right now they are busy fixing problems caused by the failed assumptions (B) and (C). Yet it could be very instructive and revealing for social scientists to investigate the tension between the media companies, a few ICT companies, and in particular Microsoft, the dominant operating system manufacturer, in the competition for final control over both the TC and DRM mechanisms.
Eventually, the one company that controls both mechanisms will become the Big Brother of the twenty-first century, overpowering even the mightiest governments in the present world, because the operating system it provides is indispensable to everyday government functions.
DMCA, Tower of Babel, and Freedom of Speech
Back to our discussion of the broken mechanism: none of assumptions (A), (B), and (C) has been shown to hold. In fact we will see several instances of (B) and (C) failing.
To remedy this, the Big Brother has to make it illegal for non-club members to write Y-equivalent programs. This is the purpose of the anti-circumvention provision of the DMCA. [8] Let's leave the details to the lawyers and just state the main idea in plain English: translating from the (DRM or otherwise) secret file format into another file format is illegal. Writing Y-equivalent programs without the permission of the DRM controlling agency is illegal.
This is reminiscent of the Bible story of the Tower of Babel, in which human beings fail in their engineering attempt because of their inability to communicate.
So the LORD scattered them abroad from there over the face of the whole earth; and they stopped building the city. Therefore its name was called Babel, because there the LORD confused the language of the whole earth; and from there the LORD scattered them abroad over the face of the whole earth. -- "Tower of Babel", Genesis 11:1-9
Except that there is no need for an almighty God to confound our languages: our very own laws forbid translation, without which communication becomes impossible. With laws like the DMCA, we ourselves are defeating our own ICT engineering attempts, our modern-day Towers of Babel. One certainly cannot blame God for our failure if the general public allows laws like the DMCA to pass in other countries.
Seriously, let's consider the insanity of this law from an academic point of view. The DRM people try to hide the workings of their programs by scrambling their code and the data they produce. This strategy of "security by obscurity" has long been looked down upon as bad design, or at best a secondary security measure. That is fine if they don't mind creating bad solutions to their own problems; but why is the rest of the world forbidden to study their bad design and publish its weaknesses? In fact such a law directly challenges the way cryptographic research is done, where researchers proposing new encryption algorithms are expected to publish the details of their algorithms for other researchers to study and try to break. [9]
The implication in fact goes much deeper than academics' freedom of research alone. The freedom of speech of the general public is under attack when it becomes illegal even to spread the analyses of, and solutions to, encryption algorithms published by academicians or programmers. Freedom advocates made this point in the infamous case of the Motion Picture Association of America's (MPAA) legal attack on DeCSS. A DVD's region code is a primitive form of DRM. When Linux users found that they were denied access to their legally obtained DVD movies, some decided to write their own player (the Y-equivalent program). It had to contain a piece of code (named DeCSS) that unscrambles the DVD content in order to play the DVD; obviously, DeCSS could also be useful for making illegal copies of a DVD. The only non-anonymous author was the Norwegian programmer Jon Johansen. Probably because it was unable to sue Johansen directly, the MPAA started suing, on the basis of the DMCA, internet sites that distributed DeCSS. This caused great anger among freedom advocates, who came up with creative ways of distributing DeCSS in various forms, including singing the source code, encoding it as a prime number, and so forth. The question being asked was: "If distributing the source code with the consent of its author is a crime, then is singing also a crime? Is showing a certain prime number to people also a crime?" David Touretzky, a professor at Carnegie Mellon University, has even assembled a Gallery of CSS Descramblers web page showing creative ways of rephrasing this question. [10] One can also find an MPAA legal threat and Touretzky's reply there.
Such attacks on freedom of speech became ridiculously direct in the "09 F9" incident. The supposedly secret key for encrypting HD DVDs and Blu-ray Discs was somehow cracked open and published on the internet. In May 2007, the MPAA and AACS LA (the AACS Licensing Administrator) demanded that websites remove that key, a mere 32-digit hexadecimal number beginning with 09 F9. This infuriated netizens. People started creating pictures, songs, speeches, and articles embedding this number as a demonstration against the attack on freedom of speech by the MPAA and AACS LA. [11]
Does it thereby become legal to distribute this number? No. And yet it is impossible for any government or organization to stop its spread. This, exactly as in the DeCSS case, signifies the huge conflict between the draconian DMCA and the general public's sense of what is morally right and wrong. When a law goes against the opinion of the general public and is unenforceable, which one is in the wrong? The law, or the public?
"If I ordered a general to fly from one flower to another like a butterfly, or to write a tragic drama, or to change himself into a sea bird, and if the general did not carry out the order that he had received, which one of us would be in the wrong?" the king demanded. "The general, or myself?" -- Antoine de Saint-Exupéry, "The Little Prince"
Conclusions
The big media companies and the ICT corporations began their quest with an attempt to prevent illegal copying. Their technical solution, besides invading consumer privacy and rights, is impractical. It also fosters distrust among co-workers and among the companies themselves, which they seem happy to ignore. They then resorted to lobbying for draconian laws. This makes things worse, because now it is the laws that directly infringe on the freedom of speech of the general public, whether they are experts involved in these technologies, consumers, or simply anybody posting on the internet. As demonstrated in quite a few DMCA cases, it takes public awareness to fight these bad laws. If the public stays silent, the result will be the dystopian world described in the book "1984", where citizens are constantly monitored and speech is regulated. We hope that academicians will take on the responsibility of making such information more readily available to the public, to prevent the worst-case scenario from happening.
References
- [1] Digital rights management. (2007). In Wikipedia, The Free Encyclopedia. Retrieved August 14, 2007, from http://en.wikipedia.org/wiki/Digital_rights_management
- [2] Hung, Chao-Kuei and Shen, Yen-Liang. (2005). "Social Considerations in Choosing Digital Archiving Technologies". In International Conference on Digital Archive Technologies. Available http://people.ofset.org/~ckhung/dl/p/05a.pdf
- [3] arnezami. "Understanding AACS (including Subset-Difference)". Retrieved August 14, 2007, from http://forum.doom9.org/showthread.php?t=122363
- [4] Lessig, Lawrence. (2004). Free Culture: How Big Media Uses Technology and the Law to Lock Down Culture and Control Creativity. Available http://www.free-culture.cc/
- [5] Trusted Computing. (2007). In Wikipedia, The Free Encyclopedia. Retrieved August 14, 2007, from http://en.wikipedia.org/wiki/Trusted_computing
- [6] Advanced Access Content System. (2007). In Wikipedia, The Free Encyclopedia. Retrieved August 14, 2007, from http://en.wikipedia.org/wiki/Advanced_Access_Content_System
- [7] FoxDisc. "Clarification on the state of AACS". Retrieved August 14, 2007, from http://forum.doom9.org/showthread.php?t=124505
- [8] Electronic Frontier Foundation. Digital Millennium Copyright Act (DMCA). Retrieved August 14, 2007, from http://www.eff.org/IP/DMCA/
- [9] Youll, Jim. "RE: Arrest of Dimitri Sklyarov". Retrieved July 30, 2007, from http://anti-dmca.org/letter-aap.html
- [10] Touretzky, D. S. (2000). Gallery of CSS Descramblers. Retrieved August 12, 2007, from http://www.cs.cmu.edu/~dst/DeCSS/Gallery
- [11] AACS encryption key controversy. (2007). In Wikipedia, The Free Encyclopedia. Retrieved August 14, 2007, from http://en.wikipedia.org/wiki/AACS_encryption_key_controversy
- Author: Chao-Kuei Hung at Chaoyang University Information Management Department
- You are welcome to distribute this document in accordance with the Creative Commons Attribution-ShareAlike License or the Free Document License.