All our Eggs in One Cloud: The International Risk to Private Data and National Security, A Study of United States’ data protection law using the International Telecommunication Union Legislative Toolkit
Two technological trends have been heading for an international collision in the 21st century. High speed internet has transformed cloud computing from the realm of science fiction to the commonplace in the lives of
consumers. The rising prominence of cloud computing has offered corporate actors the benefit of a cheaper and less cumbersome alternative to maintaining and upgrading their own servers.
As individuals and businesses rely less on personal storage to maintain personal data, the majority of digitized information is shifting from the hands of individuals to large conglomerates. All of the data “eggs” are being placed into a very small number of baskets.
At the same time, states are investing heavily in the development of defensive and offensive “cybertechnology.” States such as China and Russia have demonstrated a keen interest in developing virtual weapons capable of penetrating protected systems and retrieving protected information. The rising threats of international cyber terrorism and corporate theft demand both an international and a domestic response. The United Nations has turned to a specialized agency, the International Telecommunication Union (“ITU”), which it hopes will coordinate international efforts. Based in Geneva, Switzerland, the ITU includes 192 member nations. The incompatibility between American and European privacy standards underscores the necessity of a coordinated international effort to regulate and protect private data.
Cloud computing giants, such as Google, now contemplate abandoning the confines of national boundaries to move their operations into international waters. This proposal illustrates the international nature of cloud computing security concerns. When nations place their technological wealth in third party servers, what role should those nations play in regulating and policing the information industry? I intend to provide a glimpse into the international landscape of cloud-computing and data transfers in the context of the history of technological theft. In piecing together the system and safeguards as they now stand, I also hope to uncover a guide to the international legal hurdles of cloud computing.
This note will first examine the technological background of cloud computing and its apparent benefits and inherent dangers. Following the technological background is a brief comparative examination of data regulation in the United States and the European Union to provide context for the discussion of the ITU’s toolkit as an international set of model rules. The analysis will focus on the model toolkit and its strengths and shortcomings primarily through comparison to the laws of the United States.
I. Technological Background
- a. Cloud Computing
The National Institute of Standards and Technology (“NIST”) defines cloud computing as “a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released . . . .” There are five essential characteristics of cloud computing: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service.
- 1. The Five Essential Characteristics
On demand self-service is defined as the ability to “provision computing capabilities, such as server time and network storage, as needed automatically.” In other words, the service is available to the consumer when needed. It is this convenience of usage that makes cloud computing so appealing to a wide consumer base versus the inconvenience of sorting through and manually downloading data from remote sources.
Broad network access requires that the “capabilities are network accessible by various client platforms.” Client platforms are the devices used to access the network and include everything from PCs to smartphones.
Resource pooling is the heart of both the efficiency and the danger of cloud computing: “The Provider’s computing capabilities are pooled to serve multiple consumers using a multi-tenant model. Resources are dynamically assigned and
reassigned according to consumer demand. In most cases the customer generally has no control or knowledge over the exact location of the provided resources.” (emphasis added). One of the essential characteristics of cloud computing is a lack of control, or even knowledge, of the location of “resources” used.
Rapid elasticity refers to a lack of clear limitations on capabilities. “The consumer capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.” The purpose of cloud computing is to offer a massive computing resource to many clients. An apt analogy is to a utility company: from the consumer’s perspective, the water and electricity received through utility companies appear unlimited (though they are, of course, finite).
The final essential characteristic is Measured Service. Again using the utility analogy, the services can be optimized by “leveraging a metering capability.” In other words, consumers can pay for computing the same way they pay for water or electricity: by measuring use.
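The utility analogy can be made concrete with a short sketch of metered billing. The rates and resource categories below are hypothetical, chosen only to illustrate the principle that a consumer pays per measured unit rather than for owned capacity:

```python
# Toy sketch of "measured service" billing: the provider meters
# resource consumption and bills per unit, as a utility would.
# All rates and category names here are hypothetical illustrations.

HYPOTHETICAL_RATES = {
    "storage_gb_month": 0.10,   # dollars per GB-month stored
    "compute_hours": 0.05,      # dollars per server-hour used
    "bandwidth_gb": 0.02,       # dollars per GB transferred
}

def metered_bill(usage: dict) -> float:
    """Compute a monthly bill from metered usage, utility-style."""
    return round(sum(HYPOTHETICAL_RATES[k] * qty for k, qty in usage.items()), 2)

# The consumer pays only for what the meter recorded.
bill = metered_bill({"storage_gb_month": 500, "compute_hours": 200, "bandwidth_gb": 50})
print(bill)  # 500*0.10 + 200*0.05 + 50*0.02 = 61.0
```

Just as a water bill scales with gallons consumed, the cloud bill scales with the resources actually measured, which is what makes the service model viable for consumers of vastly different sizes.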
- 2. The Three Service Models
Cloud computing currently operates under three “service model” categories:
“The first, Software-as-a-Service (“SaaS”), provides software to users. The user does not manage or control the actual physical computer networks belonging to the provider. The second, Platform-as-a-Service (“PaaS”) gives users limited control over the software so long as it does not interfere with the physical infrastructure of the provider’s network. The third, Infrastructure-as-a-Service (“IaaS”), provides users control over limited parts of the cloud infrastructure.”
The important distinction is simply one of control, from minimal control under SaaS to maximum control under IaaS. Some definitions of cloud computing are restricted to only SaaS. This is because SaaS is the dominant form of cloud computing and most consumers use cloud services through SaaS. It is therefore the form of cloud computing with which this article is most concerned.
SaaS products are web-based and (usually) web-distributed software. Examples of SaaS products include (but are nowhere near limited to): “document management, payroll, human resources, tax systems, accounting, timekeeping, e-mail spam filtering, litigation support, data hosting, and office productivity.” A business that uses a suite of these programs will have nearly all of its data, communications, and personal information pass through a remote destination and computing hub.
- 3. The Four Deployment Models
The four deployment models are differentiated primarily by who uses the services. The four models are the private cloud model, the community cloud model, the public cloud model, and the hybrid cloud model. The names themselves suggest the differences: a private cloud is restricted to a single organization and may even be maintained by that organization; a community cloud is shared by several consumers with “shared concerns”; public clouds are open to either the public at large or a large industry; and finally, hybrid clouds are a combination of two or three of the other models.
As private and community cloud models are inherently proprietary, the public cloud model is the focus of this paper. However, knowledge of the other types of models is relevant as they do have important roles to play in articulating
concerns regarding cloud computing.
- 4. The Industry
The cloud computing industry has attracted some of the most competitive and cutting edge corporations in the world. Amazon, Microsoft, Google and others have entered the field to make sure they are not left behind. Industry experts expect cloud computing based revenues to reach anywhere from $40 to $160 billion by 2012.
The appeal of cloud computing services to businesses and individuals at large is enormous. An estimated $800 billion a year is spent on purchasing and maintaining enterprise software, 80% of which goes to installation and maintenance. That 80%, or $640 billion a year, is where cloud computing comes in. Instead of paying the high costs of maintenance, doled out in electric bills for servers and salaries for in-house IT employees, the cloud provider centralizes all those functions. A business (like a law firm) can then focus on its industry (legal services) instead of also having to act as a network administrator.
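The arithmetic behind the claimed savings opportunity, using the figures cited above, works out as follows:

```python
# The savings opportunity claimed in the text: of the ~$800 billion
# spent annually on enterprise software, roughly 80% goes to
# installation and maintenance, the portion cloud providers centralize.
total_enterprise_spend = 800e9   # dollars per year (figure from the text)
maintenance_share = 0.80         # share spent on installation/maintenance

addressable_by_cloud = total_enterprise_spend * maintenance_share
print(f"${addressable_by_cloud / 1e9:.0f} billion")  # $640 billion
```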
- b. The Dangers of Cloud Computing
- 1. General Security Concerns
Cloud computing has been called “by default, insecure” because several security concerns stem from its basic structure. The data is all kept in a single place, giving potential hackers a one-stop shop for valuable information and creating a highly incentivized market that encourages greater sophistication and organization among computer hackers. The necessity of sending information back and forth from remote servers gives potential data thieves additional opportunities to retrieve the information. The lack of encryption found in many cloud-based data transfers exacerbates that problem and creates large security loopholes that benefit hackers and data thieves.
These concerns take on a greater dimension when applied to larger corporations. The real targets for cloud computing are those businesses that make up the bulk of the $800 billion enterprise software market.
Security secrets and even the legal system could be at risk as large law firms and private government contractors move to cloud computing. The United States Government itself has announced an initiative to move toward cloud computing. The Government’s initiative is itself illustrative of both the benefits and the dangers of cloud computing; although the Government will save a great deal of its $76 billion IT budget, centralizing the government’s information also makes it easier to target.
- 2. Cyberwarfare and Espionage in the Digital Age
International cyber security has been increasingly threatened not from just independent actors, such as hackers, but from state-sponsored organizations. Certain states have been particularly involved in the web-security battle.
- a. China
Chinese-backed hackers are now at the forefront of cyber security and have launched attacks against both public and private targets. The growing virtual battle for information between private corporations, western governments, and China has even been referred to as a “Cold Cyberwar.” In December 2009, a China-based attack on Google was discovered. Google believes the attack was designed to gain access to the email accounts of Chinese human rights activists.
China has been involved in cyber-attacks against the United States government as well. Attacks on traditional targets of espionage, such as military contractors, have expanded to include corporate secrets. The cyber-war is a total war in the sense that foreign enemies and competitors do not delineate between governments and private actors.
- b. Russia
Russia has also been suspected of involvement in highly aggressive cyber attacks. Unlike the Chinese attacks, however, these have destabilized and undermined neighboring states. Russia has also been implicated in international data theft from both governments and industries. Russia’s highly effective use of cyber attacks not just to steal information but to disable foreign governments is especially troubling. It demonstrates that even actors with large pools of resources, such as states, can suffer serious consequences when their data and technology are functionally centralized. It is vital to point out that, at least in the case of the attacks on Estonia, the targets have not been unsophisticated. Russia is also no stranger to cyber attacks, and was in fact the victim of one of the world’s first cyber attacks in 1982.
II. Comparative Legal Background
a. The United States
The United States has several pieces of legislation which impact cloud computing. Individual states have also taken up the call with their own cloud computing laws, but they fall beyond the national, international, and supranational focuses of this note. The Obama administration is also currently developing a national cybersecurity strategy that will rely heavily on coordination with the private sector.
1. Stored Communications Act
The Stored Communications Act was first passed in 1986 and has been criticized for its lack of cohesion and clarity. It does, however, provide the backbone of American regulation of internet and digital service providers. The Act created two differentiated categories for regulation, and understanding the law’s implications for cloud computing requires analysis to “begin by classifying the data in question within one of these two categories.” Those two categories are Electronic Communication Services (“ECS”) and Remote Computing Services (“RCS”).
i. Electronic Communication Services
The Act itself defines an ECS as “any service which provides to users thereof the ability to send or receive wire or electronic communications.” The definition of ECS was written particularly with e-mail and direct person-to-person communication in mind. To qualify as an ECS, a provider must satisfy two requirements:
First, the service provider must offer users “the ability to send or receive . . . electronic communications.” Electronic communications is broadly defined to mean nearly any form or style of communication, including “signs, signals,
writings, images, sounds, data or intelligence of any nature.” Second, the service provider must hold the electronic communication in “electronic storage.”
If a provider meets the above standards, then it falls under the Act’s privacy requirements. However, the definition of “electronic storage” in the Act is archaic, rooted in the technology of the time.
ii. Remote Computing Services
The Act defines a Remote Computing Service as “the provision to the public of computer storage or processing services by means of an electronic communications system.” The language defining what constitutes an electronic communications system is as archaic as the explanatory language defining the components of an ECS. The more frustrating attribute of the Act is its very limited language, tailored specifically to the commercial models of the 1980s computer market. That specificity has made it difficult to apply the Act to technology as it has developed, and Congress has failed to drag the statute into the 21st century.
- b. Europe
The primary law in the European Union protecting and regulating the transfer of data is EU Directive 95/46/EC. European Union member states were required to adopt the Directive in 1998, making it more than a decade newer than the Stored Communications Act. The Directive is aimed mostly at personal data, but it is instructive in certain ways. The requirements placed on data transfers are especially important: data cannot be transferred to a third party outside of the EU unless the “third country in question provides an adequate level of protection.” The European Union does not provide a clear definition of “adequate protection,” but the Directive establishes a framework requiring businesses that collect and hold data to take certain measures to protect that data.
The European Union also passed the ePrivacy Directive, Directive 2002/58/EC, in 2002. The Directive regulates the use of data by private actors in a number of ways, including requiring the unambiguous consent of the data subject for the processing of personal data. In addition to setting standards for how private data is managed by private actors, the Directive also seeks to protect private actors from data theft by regulating the use of spyware and internet “cookies” to retrieve information from private computers clandestinely.
- c. United Nations and Other Supranational Involvement
The primary body under the U.N. concerned with cybersecurity (other than, arguably, the Security Council) is the International Telecommunication Union (“ITU”). The ITU articulates its relationship with cybersecurity in its mission statement:
“A key priority lies in bridging the so called Digital Divide by building information and communication infrastructure, promoting adequate capacity building and developing confidence in the use of cyberspace through enhanced online security. Achieving cybersecurity and cyberpeace are amongst the most critical concerns of the information age, and ITU is taking concrete measures through its landmark Global Cybersecurity Agenda.”
The ITU provides a legislative toolkit for cybercrime legislation. This note will focus primarily on the sections relevant to cloud computing. The legislative toolkit will act as a basic touchstone by which to compare American and European privacy and data protection law, providing a useful “optimal policy” position against which to test current law and find where it is adequate and where it falls short.
III. The ITU’s Legislative Toolkit: How far away are Europe and the U.S.?
a. Section 2: Unauthorized Access to Computers, Computer Systems, and Networks.
Section 2 of the Toolkit creates four subsections of types of unauthorized access to computers and networks. Those categories are 1) a general category with the same title as the Section title, 2) a category for unauthorized access
to government computers, 3) unauthorized access to “critical” infrastructure, and 4) unauthorized access for the purpose of terrorism. The legislative toolkit curiously provides no guidelines for the size of penalties. The toolkit leaves blanks for fines and/or imprisonment for all four categories and provides no suggested ranges. This means anything from a $10 fine to a 100-year prison sentence could be inserted as the legislatively prescribed penalty for an unauthorized access of computers and networks. Clearly this total lack of suggested penalties means that the real purpose of the suggested legislation is to provide language and definitions of terms. Therefore, that is where the analysis should focus.
Most important to Section 2 is the Toolkit’s definition of “access”:
Access means to make use of; to gain entry to; to view, display, instruct, or communicate with; to store data in or retrieve data from; to copy, move, add, change, or remove data; or otherwise make use of, configure, or reconfigure
any resources of a computer program, computer, computer system, network, or their accessories or components, whether in whole or in part, including the logical, arithmetical, memory, transmission, data storage, processor, or memory
functions of a computer, computer system, or network, whether by physical, virtual, direct, or indirect means or by electronic, magnetic, audio, optical, or other means.
Most notable is the definition’s breadth. Gaining entry to data is alone enough to qualify as access. A prosecutor would not necessarily have to prove that information was viewed or manipulated as an element of the crime. Unlocking the door is enough, whether or not the defendant walks through it.
The law of the United States is not as explicitly expansive. The Stored Communications Act does not define the term “access” in its list of definitions, but uses the word. The comparable passage from the Act is:
(a) Offense. Except as provided in subsection (c) of this section whoever—
(1) intentionally accesses without authorization a facility through which an electronic communication service is provided; or
(2) intentionally exceeds an authorization to access that facility;
and thereby obtains, alters, or prevents authorized access to a wire or electronic communication while it is in electronic storage in such system shall be punished as provided in subsection (b) of this section.
The toolkit makes a distinction for government computers. Without suggested penalties, the most important note to make of the ITU’s suggested legislation is that it includes any private computers that are “used on behalf of the government.” The scope of that language is ambiguous. A private contractor working on a research project for a government contract could be included, and a cloud computing service like Google that has taken on the role of server provider for a publicly-funded university could potentially fall within the scope of this language as well.
The United States laws protecting government computers and networks are more robust than those that protect private data and fall outside the scope of this note. The relevant point is that laws mirroring the toolkit could extend to private computers and networks the same protections given to government bodies.
The toolkit also provides a subsection for “Unauthorized Access to Critical infrastructure.” The definition of “critical infrastructure” provided in the toolkit is as follows:
Critical infrastructure means the computers, computer systems, and/or networks, whether physical or virtual, and/or the computer programs, computer data, content data and/or traffic data so vital to this country that the incapacity or destruction of or interference with such systems and assets would have a debilitating impact on security, national or economic security, national public health and safety, or any combination of those matters.
The purpose of carving out separate language for critical infrastructure would logically be to allow for more severe penalties for damaging critical infrastructure. The toolkit provides a broad definition of critical infrastructure that does not articulate precisely which industries, when disrupted, would have a “debilitating impact on security, national or economic security, national public health and safety, or any combination of those matters.” The inclusion of “economic security” seems to imply that any private concern with considerable importance in the state that adopts the language of the toolkit will receive stronger protection than smaller players in the industry. In the pre-September 11th United States, “critical infrastructure” was limited to eight specific industries. Under the auspices of the Department of Homeland Security’s Critical Infrastructure Program, however, the term has become far less concisely defined.
The final subsection of Section 2 concerns terrorism. It is easy to imagine a combination of the terrorism provision and the critical infrastructure provision; indeed, it is difficult to imagine any act of terrorism that would not fall under the critical infrastructure provision.
b. Section 3: Unauthorized Access to or Acquisition of Computer Data, Content Data, and Traffic Data
Section 3 of the toolkit is nearly identical to Section 2, except that it focuses on data instead of hardware (computers and networks). The most notable difference is the inclusion of a fifth category in addition to the four found in Section 2: a category protecting financial institutions from data theft and cyber threats. The offenses listed in Section 3 cover not just access to data belonging to financial institutions but a variety of “illegal acts” as well, including “facilitating, advancing, assisting, conspiring, or committing extortion, identity theft, or any other illegal act not covered by provisions within this Law, whether or not via a computer program, computer, computer system, or network, a criminal offense shall have been committed.” It is interesting that the ITU and ABA included a comprehensive law protecting financial institutions within a cybercrime toolkit. In fact, the specific language “whether or not via a computer program, computer, computer system, or network” reads as a policy statement from the drafters that reaches explicitly beyond the toolkit’s stated scope.
The United States’ criminal justice system penalizes this wide array of crimes under different statutory frameworks. The Hobbs Act provides the basis of federal extortion law, while identity theft statutes are largely left to individual states to craft. The ABA and ITU should be commended for their desire to create a simple and comprehensive model code, but here the inclusion of penalties for “illegal acts” (which, one would presume, must already be punishable to be considered “illegal”) is superfluous and likely already covered in most legal systems.
The interesting element of the financial data category is that the category limits itself to financial data “of a financial institution.” Unauthorized access to financial data would have to fall under another category if that financial data was not the data of a financial institution.
The area of financial data is one the toolkit most desperately needs to address. By comparison, financial data in the United States is protected and regulated irrespective of the nature of the business that holds it.
c. Section 4: Interference and Disruption
Section 4 of the toolkit focuses on interference with and disruption of computer systems and networks. Section 4 is very similar to Section 2, except the language of access has been changed to the more malicious crime of purposely damaging a computer system or network. Also added to Section 4 is a distinction between the crime of disrupting a “computer program, computer data, content data, traffic data” and disrupting the above with “knowledge of or intent to cause serious harm or threaten public safety.”
Providing a criminal punishment for intentional disruption is not surprising, since it is an act of destruction of property, even if that property is virtual and the destruction is temporary. While the separate categories initially seem to imply separate criminal punishments for disruption and for disruption with intent, the language of intent appears in the body of both subsections, if not in the titles.
The biggest difference is the “serious harm” language. The extension of harm to include harm to “life, limb, or property” suffers from a lack of explanation. More confusing is the separate language for threatening public safety: it is difficult to imagine a situation where disrupting data would be likely to cause harm to life or limb without also endangering public safety. Without commentary explaining how close the connection must be between the intentional disruption of data and the consequential damage to life or limb, it is unclear whether intent alone is sufficient mens rea regardless of the proximity between the act and the harm. The toolkit’s failure to delve into these actus reus and mens rea requirements makes it difficult to compare the limits and similarities between the toolkit’s recommended legislation and American law.
Section 4, like the previous sections, includes a provision for terrorism. The toolkit does not take it upon itself to define a term as difficult as terrorism, but there are some conclusions that can be drawn from the ITU’s opinion on how to deal with cyberterrorism. The most striking inclusion is that the ITU has left fines on the table for punishing terrorism. Applying fines to deal with terrorist activities not only demonstrates a clear departure from more combative strategies of fighting terrorism, but is actually fairly shocking. While the fines may be intended to be used in tandem with prison sentences, there is little doubt that U.S. penalties for terrorism are more draconian.
Without even a basic definition of terrorism, it is difficult to see how a terrorism statute would differ from a punishment for intentionally endangering public health and safety. The New York antiterrorism statute defines an act of terrorism as an act “intended to (i) intimidate or coerce a civilian population; (ii) influence the policy of a unit of government by intimidation or coercion; or (iii) affect the conduct of a unit of government by murder, assassination or kidnapping.” The United States Code defines “international terrorism” as activities that “(A) involve violent acts or acts dangerous to human life that are a violation of the criminal laws of the United States or of any State, or that would be a criminal violation if committed within the jurisdiction of the United States or of any State” and
(B) appear to be intended—
(i) to intimidate or coerce a civilian population;
(ii) to influence the policy of a government by intimidation or coercion; or
(iii) to affect the conduct of a government by mass destruction, assassination, or kidnapping.”
The primary difference between intentionally endangering public safety and health and committing terrorism lies in the intent. If the intent is to influence government or intimidate the population, the act is terrorism; if the public health is intentionally endangered out of purposeless malice, without a political motive, the punishment falls under subsection (c) instead of (f). Presumably a terrorism category exists to provide harsher punishments for acts of cyberterrorism above and beyond ordinary criminal acts. Without a clear explanation of how an act of data disruption or destruction would be distinguished as an act of terrorism rather than mere malice, the toolkit’s criminal categorization is inadequate.
d. Section 11. Corporate Liability
The ITU expands liability for the criminal acts of Sections 1-10 to “legal persons,” or corporate entities. The toolkit assigns to corporations the same liability given to individuals in certain situations: liability applies if the act is conducted by someone in a leadership position at the “corporation, association, or other legal entity,” acting under authority to represent the legal person or under authority vested in them on the legal person’s behalf, and if the offense benefitted the legal entity. This provision is clearly intended to prevent an organization from escaping liability by having liability focused solely on the specific individual who conducted the act.
Corporate criminal liability has long existed in the United States at common law and would extend to any provisions adopted from or similar to Sections 1-10 of the toolkit without the need for a separate corporate liability code section.
In addition to the expansion of liability based on leadership, the toolkit includes liability for corporate actors based on the principles of negligence and agency. These legal principles are already well developed in the United States and, like the rest of Section 11, would not need to be separately incorporated into American law.
c. Section 22. International Cooperation: General Principles
Section 22 is notable for its objective of promoting international cooperation among nations and supranational organizations. The United States is a party to many treaties articulating the particulars of cooperation with other states in criminal proceedings, but it lacks a single comprehensive approach to doing so in the field of cybercrime. The ITU is one of a set of disparate organizations attempting to create a greater degree of cooperation. Codifying a willingness to work with other states on cybercrime hardly seems necessary for the United States, however, as it remains a major force for international cooperation.
IV. Conclusion: A Lack of Substance and Depth Harms the Toolkit’s Functionality
The toolkit fails as a set of model rules in key areas. Most importantly, it lacks definitions for key terms or implies definitions that are contrary to common definitions already in place. The toolkit may be useful for developing nations that have no legislation in place to protect data and punish criminals. But even in that case, sophistication and subtlety are what those legislatures most need, not broad rules that could only be interpreted and implemented by an already sophisticated legal system. Even by U.S. standards the rules are weak and provide little guidance.
The greatest fault of the toolkit is its lack of any guidance on sentencing. Model rules must be broad enough to apply across different legal cultures, but the absence of even suggested ranges of severity is deeply disappointing. By permitting any penalty, from a small fine to life in prison, the rules give no sense of purpose to their categorization of crimes. A state adopting the toolkit could always alter the penalties on its own anyway. The lack of penalty ranges also undermines uniformity among adopting states, and uniformity was one of the major impetuses behind creating the toolkit in the first place. If one state adopts small fines and another long prison sentences as the penalty for data theft or disruption, interstate crime will likely produce acrimonious results.
The codification of the principles of international cooperation is encouraging but lacks substance. The toolkit should have provided at least a barebones recommended process for cooperating on this inherently international issue. Instead, the section on international cooperation seems rushed and tacked onto the end. Any nation willing to import legislative code from an international body has already demonstrated a willingness to cooperate with supranational actors in an attempt to forge uniformity of law with other nations. Such a nation needs not a reaffirmation of that cooperation but concrete and uniform model solutions to the problem.
The toolkit also fails to address the regulation of privacy by private actors. While it imposes liability on corporate actors whose employees commit the crimes it outlines, it provides no guidance on the negligent handling of data that gives data thieves easier access.
This lack of sophistication and depth in tackling a hugely complicated and difficult issue leaves the toolkit woefully inadequate. It might be a start for a state with nothing on the books regarding cybercrime, but otherwise it reads as a willing sidestep of the responsibility of providing model rules. The ABA’s work here is a far cry from its Model Rules. The United States does not have the most comprehensive or sophisticated set of data legislation, but even a comparison to American law raises serious questions about the adequacy of the toolkit.
*J.D. Candidate 2012, University of Minnesota; M.A., George
Mason University, 2008; B.A., George Washington University, 2006.
 See David
Narkiewicz, Legal Tech Forecast: Cloudy With Only a Chance of Purchasing New
Software, 32 Pa. Law. 56 (2010) (“Instead of hosting
expensive servers, networking devices and data backup devices, purchasing
software and renewal licenses, and paying for an information technology (IT)
staff, cloud computing moves the servers and backup off site and the law firm
pays only for the use of the servers and software applications . . . .”).
 See Natasha Solce, The
Battlefield of Cyberspace: The Inevitable New Military Branch —The Cyber Force,
18 Alb. L.J. Sci. & Tech.
293, 297-300 (2008) (detailing the rising use of cyber attacks by more than
half a dozen different states); see also
White Paper on Critical Infrastructure Protection, PDD-63, Section VI, Annex A
(May 22, 1998), available at
http://www.justice.gov/criminal/cybercrime/white_pr.htm (an order to increase
investment in cyber defense).
 See, e.g., Tony Halpin, Putin Accused of
Launching Cyber War, The Times (London), May 18, 2007,
(discussing Russia’s use of cyber warfare); Dep’t of Def., Annual Report to Congress: Military Power of the People’s Republic of
China 13-14 (2007), available at
http://www.defenselink.mil/pubs/pdfs/070523-China-Military-Power-final.pdf (regarding China’s cyber warfare program).
 See Francis Lyall, International
Communications: The International Telecommunication Union and the Universal
Postal Union 127 (2011) (“The modern ITU remains the
sole international agency through which international electrical
communications, whether wireless or wired, are regulated by agreement of its
members, which include all members of the UN.”).
 Id. at 149.
 Ashlee Vance, Google’s Search Goes Out to Sea, N.Y. Times Bits Blog (Sept. 7, 2008, 9:59 PM) (describing the details of Google’s patent application for a “water-based data center”).
 The National Institute of Standards and Technology is a
government agency within the United States Department of Commerce. See National
Institute of Standards and Technology (NIST) General Information, http://www.nist.gov/public_affairs/general_information.cfm (last visited Oct. 16, 2010).
 Nat’l Inst. of Standards & Tech., U.S. Dep’t of Commerce, The NIST Cloud Computing Project, http://csrc.nist.gov/nice/states/maryland/posters/cloud-computing.pdf (last visited Oct. 16, 2010) [hereinafter NIST Project].
 Janine A. Bowen, Overview
of Cloud Computing, in Cloud
Computing 2010: Is Your Company Ready? 37, 41 (Peter Brown & Leonard T. Nuara, eds., 2010).
 See id.
 Id. at 41–42.
 Id. at 42.
 See Peter Rogers
& Susan Leal, Running Out of Water: The Looming Crisis and Solutions to
Conserve our Most Precious Resource 123
(2010) (explaining that both energy and water are finite resources).
 The NIST
Project, supra note 8.
 Bowen, supra
note 10, at 42.
 Shahid Khan, “Apps.Gov”:
Assessing Security in the Cloud Computing Era, 11 N.C. J.L.
& Tech. On. 259, 265 (2010).
 Id. at 265–266.
 See, e.g., Diane Murley, Technology for Everyone . . .: Law Libraries in the Cloud, 101 Law
Libr. J. 249, 250 (2009) (explaining that PC
Magazine’s Tech Encyclopedia defines cloud computing as limited to SaaS).
 Khan, supra note 20, at 265 n.24.
 Murley, supra note
22, at 251.
 Bowen, supra
note 10, at 42.
 See id.
 See, e.g., Mark Radcliffe, Analyzing Cloud Contracts and Deal Structure, in Cloud Computing 2010: Is Your Company Ready? 99, 102 (Peter Brown & Leonard T. Nuara,
eds., 2010) (providing the details
of Microsoft and Amazon’s cloud computing plans and contracts); see also William
J. Robison, Free at What Cost?: Cloud
Computing Privacy Under the Stored Communications Act, 98 Geo. L.J.
1195 (2010) (analyzing
Google’s ad-based approach to cloud computing).
 Christopher Soghoian, Caught
in the Cloud: Privacy, Encryption, and Government Back Doors in the Web 2.0 Era,
8 J. on Telecomm. & High Tech. L.
359, 361 (2010) (incorporating assessments by Merrill Lynch and the Wall Street Journal).
 Radcliffe, supra note 31.
 See id.(claiming that data centers use 100
times per square foot the amount of resources that normal office space does).
 Murley, supra note
22, at 251.
 Soghoian, supra note
32, at 372 (arguing that many or most cloud computing servers
leave data particularly vulnerable to hackers).
 See Timothy D.
Martin, Hey! You! Get off of My Cloud:
Defining and Protecting the Metes and Bounds of Privacy, Security, and Property
in Cloud Computing, 92 J. Pat. & Trademark Off. Soc’y 283, 296–97 (2010) (“Because a cloud provider might store data
from many users–especially corporate users implementing a PaaS or IaaS
structure–on the same physical hardware, there is a risk of isolation
failure. That is, a provider could
inadvertently or intentionally commingle data from multiple clients (a
“guest-hopping” attack). There is also a risk that a provider’s
employees (malicious insiders) could damage a client’s data or infrastructure. Cloud computing opens new points of entry for malicious
attacks as well.”).
 See Andrew
Devore, Cloud Computing: Privacy Storm on
the Horizon?, 20 Alb. L.J. Sci. & Tech.
365, 369 (2010) (arguing that hackers are increasingly organized,
sophisticated, professional, and methodical).
 See id. (“it’s
also difficult to encrypt data in the cloud”); see also Robison, supra note
31, at 1239 n.219 (explaining that a lack of either the
ability or willingness to provide or support encryption may allow information
stored in cloud computing to be retrievable by unauthorized actors).
 See Radcliffe, supra note 31, at 101 (estimating the market size for
enterprise software at $800 billion); but see generally SaaS Market Size, SaaS Business Blog (Sept. 27, 2010, 9:47 PM) (estimating the market size for enterprise software at a much lower $90 billion).
 See Daniel
Terdiman, White House Unveils Cloud
Computing Initiative, CNET (Sept. 15, 2009, 12:24 PM),
http://news.cnet.com/8301-13772_3-10353479-52.html (reporting that the Obama
administration’s move towards cloud computing is an attempt to increase
efficiency and reduce energy costs).
 See Matthew
Hoisington, Cyberwarfare and the Use of
Force Giving Rise to the Right of Self-Defense, 32 B.C.
Int’l & Comp. L. Rev. 439, 443 (2009) (detailing the
Pentagon’s assertions that hackers affiliated with the People’s Liberation Army
of China were involved in hacking into Department of Defense computers); but see Eric Talbot Jensen, Cyber Warfare and Precautions Against the
Effects of Attacks, 88 Tex. L. Rev.
1533, 1538 (2010) (“The attacks on Google and others also highlight another
significant problem that plagues cybersecurity, or at least responses to cyber
invasions – the inability to attribute cyber attacks. Attribution is the ability to know who is
actually conducting the attacks.”).
 See DK Matai, China’s Cold Cyberwar: Rise of 5th
Dimension Red Army and Economic Pearl Harbor?, Asymmetric
Threats Contingency Alliance and mi2g (Jan. 18, 2010, 7:10 PM),
 See id. (DK Matai, founder of mi2g, argues
that China’s attacks on corporations and governments over the past years
represent a turning point in geopolitical security concerns). But see
John Leyden, Why is mi2g so
Unpopular?, The Register (Nov.
21, 2002, 9:18 PM)
(claiming that mi2g overstates the risks of cyber attack and plays the role of
a modern day “chicken little”).
 See Jeanne Meserve & Mike M. Ahlers, Google Reports China-Based Attack, Says Pullout Possible, CNN (Jan. 12, 2010).
 See id.
 See Josh Rogin, The Top 10 Chinese Cyber Attacks (That We
Know About), Foreign Policy: The Cable (Jan. 22, 2010, 8:57 PM),
(listing the “top” cyber attacks, including attacks on the Commerce Department,
the State Department, and Congressional Offices).
 See Siobhan
Gorman, August Cole & Yochi Dreazen, Computer
Spies Breach Fighter-Jet Project, Wall St. J. (Apr. 21, 2009)
http://online.wsj.com/article/SB124027491029837401.html (reporting that the
military’s costliest weapons program of all time was compromised by computer spies).
 See Jensen, supra note 43, at 1536 (discussing the
China-based attack on Google, wherein Google’s search code was potentially stolen).
 See Chloe Arnold,
Russian Groups Claims Reopen Debate on
Estonian Cyberattacks, Radio Free Europe Radio Liberty
(Mar. 30, 2009),
 See id.
(reporting that while Russia denies any involvement in the attacks, they came after a hotly divisive diplomatic dispute between Estonia and Russia over a Soviet-era monument, and noting that a former Kremlin official claimed his office was involved in the attacks, to mixed responses regarding the credibility of the claim); see also John Markoff, Before
the Gunfire, Cyberattacks, N. Y. Times (Aug.
12, 2008) http://www.nytimes.com/2008/08/13/technology/13cyber.html (describing
a series of cyber attacks against the state of Georgia leading up to the
Russian invasion in 2008).
 See Tim Shipman, China and Russia have launched ‘many’ cyber
attacks on British industry and state, warn MPs, The Daily
Mail, Mar. 12, 2010,
(reporting the British House of Commons Intelligence and Security Committee
found a systemic threat from Russian and Chinese cyber attacks on British
government and industry); Alex Spillius, Russian
Hackers penetrate Pentagon Computer system in cyber attack, The Daily Telegraph, Nov. 30, 2008,
(reporting on the Pentagon’s belief that a malware attack was based out of Russia).
 See Martin, supra note 37, at 298–99.
 Indeed, a cyber attack against a technologically
unsophisticated enemy would serve little purpose as they would have little
digital information to disrupt or destroy. See
Rex Hughes, Bits, Bytes, and Bullets,
63 The World Today 20, 20–22 (2007) (detailing a cyber
attack, most likely perpetrated by Russia, on Estonia’s complex governmental
and communications infrastructures).
 See Kim Zetter, Future of Cyber Security: What are the Rules of Engagement?, Wired (July 28, 2009), http://www.wired.com/dualperspectives/article/news/2009/07/dp_security_ars0728 (explaining that the CIA planted a “logic bomb” in technology used to help
operate a Siberian gas pipeline, resulting in the buildup of pressure and
destruction of the pipeline).
 See Yasuhide
Yamada, Atsuhiro Yamagishi, & Ben Katsumi, National Leadership, Individual Responsibility: A Comparative Study of
the Information Security Policies of Japan and the United States, 4 J. Nat’l Sec.
L. & Pol’y 217, 220 (2010) (contemplating the
larger role states play in the United States in regulating the private sector and the impact of that role on the lack of federal regulation of information security).
 Amy E. Bivins, Administration
Officials Say Private Sector Will Play Central Role in Cybersecurity Efforts,
15 Elec. Comm. & L. Rep. 1709, 1709
 See Orin Kerr, Reshaping the Framework: A User’s Guide to the Stored Communications
Act, and a Legislator’s Guide to Amending It, 72 Geo. Wash. L. Rev. 1208, 1208 (2004) (arguing that the dense and confusing language of the act has left typical
interpreters of the law, scholars and judges, unable and unwilling to tackle it).
 Robison, supra note
31, at 1205.
 18 U.S.C. § 2510(15) (2000).
 Robison, supra note
31, at 1205.
Id. at 1206 (describing the Act). See also 18 U.S.C. §§ 2510(15), 2702(a)(1), 2703(a) (2006).
 Robison, supra note
31, at 1206 (“This requirement is commonly
misunderstood because the statutory definition of ‘electronic storage’ is much
narrower than its name suggests. The Act limits ‘electronic storage’ to mean
(1) ‘temporary, intermediate storage . . . incidental to the electronic transmission of the communication’ and (2) copies made by the service provider
for ‘backup protection.’ This rather odd definition is better understood in
light of the e-mail delivery system in place at the time, which required
multiple service providers to store communications briefly before forwarding
them on to their next destination or while awaiting download by the recipient”). See 18 U.S.C. § 2510(17).
 An “electronic communications system” is defined as
“any wire, radio, electromagnetic, photooptical or photoelectronic
facilities for the transmission of electronic communications, and any computer
facilities or related electronic equipment for the electronic storage of such
communications.” 18 U.S.C. §§ 2510(14) (2000), 2711(2) (2000 & Supp. I 2001).
 See supra note 66.
 Robison, supra note
31, at 1207.
 See Maeve Z.
Miller, Note, Why Europe is Safe from
Choicepoint: Preventing Commercialized Identity Theft Through Strong Data
Protection and Privacy Laws, 39
Geo. Wash. Int’l L. Rev. 395, 405 (2007).
 See Alexander
Zinser, International Data Transfer out
of the European Union: The Adequate Level of Data Protection According to
Article 25 of the European Data Protection Directive, 21 J. Marshall J. Computer & Info. L. 547, 548 (2003).
 Parliament and Council
Directive 95/46/EC, art. 25(1), 1995 O.J. (L 281) 45 (“The Member States shall provide that the transfer to a
third country of personal data which are undergoing processing or are intended
for processing after transfer may take place only if, without prejudice to
compliance with the national provisions adopted pursuant to the other provisions
of this Directive, the third country in question ensures an adequate level of protection.”).
 See Zinser, supra note 71, at 549.
 See Council Directive 2002/58/EC, 2002 O.J. (L 201) 37.
 Daniel Garrie & Rebecca Wong, Spyware Technologies: Limiting the Horizons of Digital Privacy, 23 T.M. Cooley
L. Rev. 473, 486 (2006).
 Id. at 486-88.
 The ITU mission: bringing the benefits of ICT
to all the world’s inhabitants, International Telecommunication Union, http://www.itu.int/net/about/mission.aspx (last visited Oct. 17, 2010).
 ITU Toolkit For
Cybercrime Legislation, American Bar Association Privacy &
Computer Crime Committee, (Jody R. Westby ed., Draft Rev. Feb. 2010) http://www.itu.int/ITU-D/cyb/cybersecurity/docs/itu-toolkit-cybercrime-legislation.pdf [hereinafter Toolkit].
 The sections analyzed in this note include the following:
Section 2. Unauthorized Access to Computers, Computer Systems,
and Networks; Section 3. Unauthorized Access to or Acquisition of Computer
Data, Content Data, Traffic Data; Section 4. Interference and Disruption; Section
11. Corporate Liability; Section 22. International Cooperation: General
Principles. Toolkit, supra note 78.
 Id. at 13-14.
Id. at 11 (emphasis added).
 The use of “or” throughout the toolkit’s definition of
“access” could mean that a prosecutor would only have to satisfy one of the
elements of the definition in order to show that a defendant “accessed” data. See id.
18 U.S.C. § 2711 (2006) (providing definitions for the Stored
Communications Act which do not include the term “access,” and citing the
definitions given in 18 U.S.C. § 2510 (2006), also not defining the term
“access”). See also 18 U.S.C. § 2701 (using the term “access” but not defining it).
 18 U.S.C. § 2701.
 See Toolkit, supra note 78, at 13 (treating
government computers separately from private computers).
 See, e.g., Google Apps for the University of Minnesota,
Regents of the University of Minnesota,
http://www.oit.umn.edu/google/ (last visited Oct. 31, 2010) (describing the
partnership between Google and the University of Minnesota which provides
university students, faculty, and staff access to a specific suite of Google’s
cloud-based software applications). Whether Google is using a computer “on
behalf of the Government” through this partnership is ambiguous.
 For a discussion of the laws providing protection to
government information, computers, and networks, particularly in the context of
national security see generally David
B. McGinty, The Statutory and Executive
Development of the National Security Exemption to Disclosure Under the Freedom
of Information Act: Past and Future, 32 N. Ky. L. Rev. 67 (2005).
 See Toolkit, supra note 78, at 14. The full text of this
subsection is as follows:
Whoever commits unauthorized access pursuant to
paragraph (a) of this Section to a computer, computer system and/or connected
system, or network that is exclusively for the use of critical infrastructure
operations, or in the case which such is not exclusively for the use of
critical infrastructure operations but the computer, computer system and/or
connected system, or network is used for critical infrastructure operations and
such conduct is intended to affect that use or impact the operations of
critical infrastructure, shall have committed a criminal offense punishable by
a fine of [amount]_______ and/or imprisonment for a period of ________.
 Id. at 12.
 See, e.g., Richard Downing, Thinking Through Sentencing in Computer Hacking Cases: Did the US
Sentencing Commission Get It Right?, 76 Miss. L.J. 923, 942-46 (2007) (examining the
heightened sentencing guidelines in hacking cases with a focus on an analysis
of economic harm as a basis for sentencing, and arguing that in the case of
critical infrastructure the harm is greater, and therefore the sentencing should be greater as well).
 See Toolkit, supra note 78, at 12 (defining the term “critical infrastructure”).
 Critical Foundations:
Protecting America’s Infrastructures, President’s Comm’n on Critical
Infrastructure Protection 3-4 (Oct. 1997) (naming the eight
categories as transportation, oil and gas production, water supply, emergency
services, government services, banking and finance, electrical power, and telecommunications).
See National Infrastructure Protection
Plan: Partnering to enhance protection and resiliency, United
States Department of Homeland Security (2009) (outlining an extensive plan for coordinating
the protection of critical infrastructure and key resources (CIKR)). This
report bases its authority, and definitions of CIKR, on Homeland Security
Presidential Directive 7. Homeland Security Presidential Directive 7, Critical
Infrastructure Identification, Prioritization, and Protection (Dec. 17, 2003), available at http://www.dhs.gov/xabout/laws/gc_1214597989952.shtm#1
(using the definitions of critical infrastructure and key resources from 42
U.S.C. § 5195c(e) and 6 U.S.C. § 101(10), respectively).
 See Toolkit, supra note 78, at 14. The full text of this subsection is as follows:
Whoever commits unauthorized access pursuant to
paragraph (a) of this Section and such conduct is with the intention of
developing, formulating, planning, facilitating, assisting, informing,
conspiring, or committing acts of terrorism, not limited to acts of cyberterrorism,
shall have committed a criminal offense punishable by a fine of [amount]_______
and imprisonment for a period of ___________.
Compare Toolkit, supra note 78, at 13-14 (containing the
prohibitions against unauthorized access of computers, appearing in Section 2
of the suggested legislation of the toolkit)
with id. at 14-15 (containing the prohibitions against unauthorized access
of data, appearing in Section 3 of the suggested legislation of the toolkit,
and adding a prohibition against the unauthorized access of financial data).
Toolkit, supra note 78, at 14-15 (outlining five
categories of prohibitions for the unauthorized access of data).
Id. (outlining the additional category in (d) of Section 3 of the suggested legislation in the toolkit).
 Id. at 15.
 See id. (suggesting
acts committed without the use of a computer and associated paraphernalia could
trigger the prohibitions of this suggested legislation).
 Hobbs Act, 18 U.S.C. § 1951(a) (2006) (penalizing “robbery or
extortion” that interferes with commerce).
 Michael Stephan et al., Note, Identity Burglary, 13 Tex.
Rev. Law & Pol. 401, 403-04 (2009) (comparing the inadequacies of
state identity theft statutes).
 Toolkit, supra note 78, at 15.
 See generally Michael
L. Rustad & Thomas H. Koenig, Cybersecurity
Policy: Extending Learned Hand’s Negligence Formula to Information Security
Breaches 3 IJLSP 237 (2007)
(finding that no state statute forces financial institutions that suffer
security breaches to notify the victims of the breach, and suggesting Learned
Hand’s risk/utility test be extended to this area).
 See generally Gramm-Leach-Bliley
Act, 15 U.S.C. §§ 6801-6809 (2006) (obligating financial institutions to establish safeguards for the protection of the financial data of their customers).
 See Toolkit, supra note 78, at 15–16.
 Compare Section 2
in Toolkit, supra note 78, at 13–14, with
Section 4 at 15–16.
Id. at 15.
 Id. at 15–16.
(a) Interference or Disruption of Computers,
Computer Systems, Networks
Whoever, without authorization or in excess of
authorization or by infringement of security measures, intentionally causes
interference or disruption of a computer, computer system and/or connected systems,
or networks shall have committed a criminal offense punishable by a fine of
[amount]_______ and/or imprisonment for a period of __________.
(b) Interference or Disruption of Computer
Program, Computer Data, Content Data, Traffic Data
Whoever, without authorization or in excess of
authorization or by infringement of security measures, intentionally causes
interference or disruption of a computer program, computer data, content data,
or traffic data shall have committed a criminal offense punishable by a fine of
[amount]_________ and/or imprisonment for a period of _____________.
(c) Interference or Disruption With Knowledge of
or Intent to Cause Serious Harm or Threaten Public Safety
Whoever commits interference or disruption
pursuant to paragraphs (a) or (b) of this Section with the intent to cause or with
knowledge that such conduct could cause serious harm to life, limb, or property
or threaten public health and/or safety, shall have committed a criminal offense
punishable by a fine of [amount]________ and/or imprisonment for a period of __________.
(d) Knowledge of or Intent to Cause Interference
or Disruption of Government Computers, Systems, Networks, Data
Whoever commits interference or disruption
pursuant to paragraphs (a) or (b) of this Section with the intent to cause or
with knowledge that such conduct could cause interference and/or disruption of
computers, computer systems and/or connected systems, networks, computer programs,
computer data, content data, or traffic data used by the Government in
furtherance of the administration of justice, national security, or national
defense shall have committed a criminal offense punishable by a fine of
[amount]_______ and imprisonment for a
period of ______________.
(e) Knowledge of or Intent to Cause Interference
or Disruption of Critical Infrastructure
Whoever commits interference or disruption
pursuant to paragraphs (a) and (b) of this Section with the intent to cause or
with knowledge that such conduct could cause interference and/or disruption of
the computers, computer systems and/or connected systems, networks, computer
programs, computer data, content data, or traffic data used by critical
infrastructure, shall have committed a criminal offense punishable by a fine of
[amount]_________ and imprisonment for a period of ________________.
(f) Intent to Cause Interference or Disruption
for Purposes of Terrorism
Whoever commits interference or disruption
pursuant to paragraphs (a) and (b) of this Section with the intent of developing,
formulating, planning, facilitating, assisting, informing, conspiring, or committing
acts of terrorism, not limited to acts of cyberterrorism, shall have committed
a criminal offense punishable by a fine of [amount]_______ and imprisonment for
a period of ______________.
 See Sanford Kadish, Stephen Schulhofer & Carol
Steiker, Criminal Law and its Processes 213 (Vicki Been et al. eds., 8th ed. 2007) (“[A]lmost
invariably, the concern of the criminal law is limited to determining whether a defendant intended, expected, or should have expected his actions to produce . . . .”).
 Toolkit, supra note 78, at 16.
 See Louis Jim,
Note, “Over-Kill”: The Ramifications of
Applying New York’s Anti-Terrorism Statute Too Broadly, 60 Syracuse
L. Rev. 639, 657 (2010) (examining New York’s anti-terrorism statute and its severe punishments, including capital punishment, without any use of the
word “fine”); accord 18 U.S.C.
§ 1992 (including the use of fines as a possible alternative to incarceration,
but with much stronger sentencing guidelines and a differentiation between
terrorist attacks with and without fatalities). But see Hiram Chodosh, Reflections
on Reform: Considering Legal Foundations for Peace and Prosperity in the Middle
East, 31 Case W. Res. J. Int’l L.
427, 451 (1999) (discussing the benefits of using fines to “reduce delays and
the public cost incurred by the prison system” as part of a suggested criminal
procedure reform to better handle the demands placed on it by terrorist acts); see also Amanda Silverman, Note, Draconian or Just? Adopting the Italian Model of Imposing
Administrative Fines on the Purchasers of Counterfeit Goods, 17 Cardozo J.
Int’l & Comp. L. 175, 189, 196 (2009) (finding that the use of fines to discourage illegal activities that support terrorism has been a
useful tool in diminishing the financial resources of terrorist organizations).
 N.Y. Penal Law § 490.05 (Consol. 2008).
 18 U.S.C. § 2331 (2006).
 Toolkit, supra note 78, at 16.
 Id. at 18.
 See id.
 See New York
Cent. & H.R.R. Co. v. United States, 212 U.S. 481 (1909) (holding that a corporation can be held criminally liable for the conduct of
employees who break the law for the benefit of the corporate actor).
 See Toolkit, supra note 78, at 18–19.
 Restatement (Second)
of Torts § 324A (1965) (“Liability to Third Person for Negligent
Performance of Undertaking”).
 See Toolkit, supra note 78, at 23–24.
 Salil K. Mehra, Law
and Cybercrime in the United States Today, 58 Am. J. Comp. L. 659, 679 (2010) (“Consensus on a framework
for international cooperation in cybersecurity is
lacking, with the ratification of the Council of Europe’s Convention on
Cybercrime as an exception. There is no single holistic office for evaluating
international cybersecurity policy or threats. Instead, a collection of
agencies and departments coordinate and participate in international fora.”).
 Id. at 680.