Trusted computing

From Wikipedia, the free encyclopedia


Trusted Computing (TC) refers to a technology developed and promoted by the Trusted Computing Group (TCG). The term is taken from the field of trusted systems and has a specialized meaning. In this technical sense, "trusted" does not necessarily mean the same as "trustworthy" from a user's perspective. Rather, it means that the machine can be relied upon more fully to follow its intended programming, with a lower possibility of it performing activities that are forbidden by its designers and other software writers.

Trusted Computing is not without controversy. Advocates of the technology claim that it will make computers safer, less prone to viruses and malware, and thus more reliable from an end-user perspective. Further, they state that Trusted Computing will allow computers and servers to offer improved computer security over that which is currently available. Opponents believe that trust in the underlying companies is not deserved and that the technology puts too much power and control into the hands of those who design systems and software. They also believe that it may cause consumers to lose anonymity in their online interactions, and that it mandates technologies that many users have no pressing need for. Finally, TC is seen as a possible enabler for future versions of document protection and copy protection - features which are of value to corporate and other users in many markets.

Most computer security experts distrust TC[citation needed], believing it will provide computer manufacturers and software authors with increased control to impose restrictions on what users are able to do with their computers. There are concerns that TC would have (or may even covertly be intended to have) a large anti-competitive effect on the free software market, private software development, and the IT market in general. Some, such as Richard Stallman, have suggested the backronym "treacherous computing" for these reasons[1].

Regardless of the debate and the form of the final products, major influences in computing, such as chip manufacturers Intel and AMD and systems software developers such as Microsoft, plan to include TC in coming generations of products.

Synopsis

The basic system concepts in Trusted Computing, illustrated in the code sketch following this list, are:

  1. A machine-specific public key and certificate chain.
  2. Cryptographic functionality implemented in hardware.
  3. Data can be signed with the machine's identification.
  4. Data can be encrypted so that only the machine holding the secret key can decrypt it.
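
The following is a minimal software simulation of these four concepts, using Python's pyca/cryptography package. It is purely illustrative: on a real TC platform the key pair is generated and held inside a tamper-resistant hardware module and the private key never leaves it, whereas here everything (including the key names and sample data) is invented for the sketch.

```python
# Software-only sketch of the four synopsis concepts (hypothetical names).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# 1-2. A machine-specific key pair; in real TC this is generated and held in
#      hardware, and the public key is published with a certificate chain.
machine_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
machine_pub = machine_key.public_key()

data = b"some document contents"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# 3. Data signed with the machine's identity: anyone holding the certificate
#    chain can verify which machine produced the signature.
signature = machine_key.sign(data, pss, hashes.SHA256())
machine_pub.verify(signature, data, pss, hashes.SHA256())  # raises on forgery

# 4. Data encrypted so that only the machine's secret key can recover it.
ciphertext = machine_pub.encrypt(data, oaep)
assert machine_key.decrypt(ciphertext, oaep) == data
```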

The nature of trust

Unlike the everyday meaning of 'trust', security experts define a trusted system as one which must be trusted for the security of a larger system to hold. For example, the United States Department of Defense's definition of a trusted system is one that can break your security policy; i.e., "a system that you are forced to trust because you have no choice." Cryptographer Bruce Schneier observes, "A 'trusted' computer does not mean a computer that is trustworthy." Using these definitions, a hard drive controller must be trusted by its users to genuinely save to the drive, in every case, the data it is given to save, and a secure website must be trusted to be secure, because a user cannot verify this for themselves. Trust in security parlance is always a kind of compromise or weakness - sometimes inevitable, but never desirable as such.

As another analogy, your best friend cannot share your medical records, since he or she does not have them. Your doctor, on the other hand, does have them and can share them (legal issues with doing so aside). It is possible that you trust your doctor and think he or she is a fine person; it is also possible that there is only one doctor in your town, so you are forced to trust him or her.

The main controversy around trusted computing concerns this meaning of trust. The Trusted Computing Group describes "technical trust" as follows: "an entity can be trusted if it always behaves in the expected manner for the intended purpose." Critics characterize a trusted system as one that users are forced to trust, rather than one which is particularly trustworthy.

There is also concern among critics that it will not always be possible to examine the hardware component on which Trusted Computing relies - the Trusted Platform Module - which is by necessity where the core 'root' of trust in the platform has to lie. If implemented incorrectly, it presents a security risk to overall platform integrity and to protected data. The specifications, as published by the Trusted Computing Group, are open and available for anyone to review. However, the final implementations by commercial vendors will not necessarily be subjected to the same review process.

A final concern is that the world of cryptography can often move quickly, and that hardware implementations of algorithms might create an inadvertent obsolescence.

While proponents claim that trusted computing increases security, critics counter that not only will security not be helped, but trusted computing will facilitate mandatory digital rights management (DRM), harm privacy, and impose other restrictions on users. Trusting networked computers to controlling authorities rather than to individuals may create digital imprimaturs. Contrast trusted computing with secure computing in which anonymity, not disclosure, is the main concern. Advocates of secure computing argue that the additional security can be achieved without relinquishing control of computers from users to superusers.

Proponents of trusted computing argue that privacy complaints have been addressed in the existing specifications - possibly as a result of criticism of early versions. There is a degree of end-user choice in the way in which the Trusted Platform Module can be used; however, it is suspected that third parties might mandate the use of various options, thus undoing any benefit of user choice.

Key concepts

Trusted computing encompasses four key technology concepts, all of which are required for a fully trusted system:

  1. Secure Input and Output
  2. Memory curtaining / Protected execution
  3. Sealed storage
  4. Remote attestation

Secure I/O

Secure input and output (I/O) refers to a protected path between the computer user and the software with which they believe they are interacting. On current computer systems there are many ways for malicious software to intercept data as it travels between a user and a software process - for example, keyboard loggers and screen-scrapers. Secure I/O provides a channel that is protected and verified in both hardware and software, using checksums to verify that the software used to perform the I/O has not been tampered with. Malicious software injecting itself into this path could thus be identified.

Although it protects against software attacks, secure I/O does not help against hardware-based attacks, such as a key-capture device physically inserted between the user's keyboard and the computer.
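
The checksum idea can be sketched in a few lines of Python. This is a software-only illustration with driver binaries simulated as byte strings; a real implementation would measure the actual driver code in a hardware-verified way, and all names and values here are invented.

```python
# Sketch: decide whether to trust an input path by measuring its software.
import hashlib

def measure(binary: bytes) -> str:
    """Return the SHA-256 digest of a binary - its 'measurement'."""
    return hashlib.sha256(binary).hexdigest()

pristine_driver = b"keyboard driver v1.0"
KNOWN_GOOD = measure(pristine_driver)  # recorded when the path was verified

def channel_trusted(driver_binary: bytes) -> bool:
    # A tampered driver (e.g. one with a keylogger injected) produces a
    # different digest and is rejected.
    return measure(driver_binary) == KNOWN_GOOD

assert channel_trusted(pristine_driver)
assert not channel_trusted(b"keyboard driver v1.0 + keylogger")
```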

Memory curtaining

Memory curtaining extends current memory protection techniques to provide full isolation of sensitive areas of memory - for example, locations containing cryptographic keys. Even the operating system does not have full access to curtained memory, so the information would be secure from an intruder who took control of the OS.

Sealed storage

Sealed storage protects private information by allowing it to be encrypted using a key derived from the software and hardware being used. This means the data can be read only by the same combination of software and hardware. For example, users who keep a private diary on their computer do not want other programs or other computers to be able to read it. Currently, a virus can search for the diary, read it, and send it to someone else; the Sircam virus did something similar to this. Even if the diary were protected by a password, the virus might run a dictionary attack. Alternatively, the virus might modify the user's diary software to have it leak the text once the user unlocked the diary. Using sealed storage, the diary is securely encrypted so that only the unmodified diary program on that user's computer can read it.
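
The derivation can be sketched as follows, simulated in software with Python's pyca/cryptography package. The 'measurements' of hardware and software are invented byte strings here; in a real system they would be taken by the security chip, which would also perform the key derivation internally.

```python
# Software-only sketch of sealed storage: the key is derived from platform
# and program measurements, so any other program or machine derives a
# different key and cannot unseal the data.
import hashlib
import os

from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

hardware_id = b"machine-unique-secret"  # held by the security chip
diary_program = hashlib.sha256(b"diary program binary").digest()

def sealing_key(hw: bytes, sw: bytes) -> bytes:
    """Derive a 256-bit key bound to this hardware/software combination."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"sealed-storage").derive(hw + sw)

nonce = os.urandom(12)
sealed = AESGCM(sealing_key(hardware_id, diary_program)).encrypt(
    nonce, b"Dear diary...", None)

# The unmodified program on the same machine re-derives the key and unseals:
assert AESGCM(sealing_key(hardware_id, diary_program)).decrypt(
    nonce, sealed, None) == b"Dear diary..."

# A modified program measures differently, derives a different key, and
# decryption fails:
tampered = hashlib.sha256(b"diary program binary + virus").digest()
try:
    AESGCM(sealing_key(hardware_id, tampered)).decrypt(nonce, sealed, None)
except InvalidTag:
    pass  # unsealing is refused, as intended
```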

Remote attestation

Remote attestation allows changes to the user's computer to be detected by the user and by others. That way, the user can avoid having private information sent to, or important commands sent from, a compromised or insecure computer. It works by having the hardware generate a certificate stating what software is currently running. The user can present this certificate to a remote party to show that their computer has not been tampered with.

Remote attestation is usually combined with public-key encryption so that the information sent can only be read by the programs that presented and requested the attestation, and not by an eavesdropper.
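
The mechanics can be sketched as a signed "quote" of the software measurement, checked by a remote verifier. This simulation uses Python's pyca/cryptography package and an Ed25519 key as a stand-in for the hardware attestation key; the measurement strings and the nonce-based freshness scheme are illustrative simplifications.

```python
# Software-only sketch of remote attestation: the platform signs a digest of
# its running software plus a verifier-chosen nonce (preventing replay), and
# the verifier checks it against the software it expects.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

attestation_key = Ed25519PrivateKey.generate()  # would live inside the TPM
attestation_pub = attestation_key.public_key()  # certified by the vendor

def quote(running_software: bytes, nonce: bytes) -> bytes:
    return attestation_key.sign(
        hashlib.sha256(running_software + nonce).digest())

def verify(sig: bytes, expected_software: bytes, nonce: bytes) -> bool:
    try:
        attestation_pub.verify(
            sig, hashlib.sha256(expected_software + nonce).digest())
        return True
    except InvalidSignature:
        return False

nonce = b"fresh challenge from verifier"
sig = quote(b"diary program v1", nonce)
assert verify(sig, b"diary program v1", nonce)              # accepted
assert not verify(sig, b"diary program v1 + virus", nonce)  # tampered: no
```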

To take the diary example again, the user's diary software could send the diary to other machines, but only if they could attest that they were running a secure copy of the diary software. Combined with the other technologies, this provides a more secured path for the diary: secure I/O protects it as it is entered on the keyboard and displayed on the screen, memory curtaining protects it as it is being worked on, sealed storage protects it when saved to the hard drive, and remote attestation protects it from unauthorized software even when it is used on other computers.

Criticism

Opponents of trusted computing point out that the security features that protect computers from viruses and attackers also restrict the actions of their owners. They argue that this makes new anti-competitive techniques possible, which may hurt the people who buy trusted computers.

The Cambridge cryptographer Ross Anderson has great concerns that "TC can support remote censorship [...] In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored (as at present) [...] So someone who writes a paper that a court decides is defamatory can be compelled to censor it — and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress everything from pornography to writings that criticise political leaders." He goes on to state that:

"[...] software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor."
"The [...] most important benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as OpenOffice). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices."

Anderson summarizes the case by saying "The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused."

Users can't change software

In the diary example, sealed storage protects the diary from malicious programs like viruses, but it doesn't distinguish between those and useful programs, like ones that might be used to convert the diary to a new format, or provide new methods for searching within the diary. A user who wanted to switch to a competing diary program might find that it would be impossible for that new program to read the old diary, as the information would be "locked in" to the old program. It could also make it impossible for the user to read or modify his or her diary except as specifically permitted by the diary software. If he or she were using diary software with no edit or delete option then it could be impossible to change or delete previous entries.

Remote attestation could cause other problems. Currently, websites can be visited using a number of web browsers, though certain websites may be formatted (intentionally or not) such that some browsers cannot render their pages. Some browsers have found a way to get around that problem by emulating other browsers. For example, when Microsoft's MSN website briefly refused to serve pages to non-Microsoft browsers, users could access those sites by instructing their browsers to emulate a Microsoft browser. Remote attestation could make this kind of emulation irrelevant, as sites like MSN could demand a certificate stating that the user was actually running an Internet Explorer browser.
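
Why emulation stops working can be seen in a short sketch of the server-side check. This is hypothetical: the hash values and function names are invented, and a real deployment would verify a signed attestation quote rather than a bare hash.

```python
# Sketch: a site that checks an attested browser measurement instead of the
# (trivially spoofable) User-Agent header.
import hashlib

APPROVED_BROWSERS = {hashlib.sha256(b"internet explorer 6.0").hexdigest()}

def serve_page(user_agent: str, attested_hash: str) -> bool:
    # The User-Agent string is ignored; emulation changes what the browser
    # claims to be, but not the measurement of what is actually running.
    return attested_hash in APPROVED_BROWSERS

opera_emulating_ie = hashlib.sha256(b"opera 8.0").hexdigest()
assert not serve_page("MSIE 6.0", opera_emulating_ie)  # emulation rejected
```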

Users don't control information they receive

One of the early motivations behind trusted computing was a desire by media and software corporations for stricter Digital Rights Management (DRM): technology to prevent users from freely sharing and using potentially copyrighted or private files without explicit permission. Microsoft has announced a DRM technology that it says will make use of trusted computing.

Trusted computing can be used for DRM. An example could be downloading a music file from a band: the band's record company could define rules for how the band's music can be used. For example, they might want the user to play the file only three times a day without paying additional money. Also, they could use remote attestation to send their music only to a music player that enforces their rules: sealed storage would prevent the user from opening the file with another player that did not enforce the restrictions; memory curtaining would prevent the user from making an unrestricted copy of the file while it is playing; and secure output would prevent capturing what is sent to the sound system.
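
The play-count rule described above could be enforced by the attested player along these lines. The policy format, names, and limit are invented for illustration; a real player would also keep the play log in sealed storage so it could not be reset.

```python
# Hypothetical sketch of a player-enforced play-count policy.
import datetime

POLICY = {"max_plays_per_day": 3}
play_log: list[datetime.date] = []  # would itself live in sealed storage

def play(today: datetime.date) -> None:
    if play_log.count(today) >= POLICY["max_plays_per_day"]:
        raise PermissionError("Daily play limit reached; payment required.")
    play_log.append(today)
    # ...the player would unseal and decode the track here...

day = datetime.date(2006, 1, 1)
for _ in range(3):
    play(day)   # the first three plays succeed
# play(day)     # a fourth play on the same day raises PermissionError
```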

Once digital recordings are converted to analog signals, the (possibly degraded) signals could be recorded by conventional means, such as by connecting an audio recorder to the sound card's output instead of speakers, or by recording the speaker output with a microphone. Even trusted computing cannot defeat the analog hole.

Without remote attestation, this problem would not exist. The user could simply download the song with a player that did not enforce the DRM restrictions, or one that let the user convert the song to a normal "unrestricted" format such as MP3.

Users don't control their data

One commonly stated criticism of trusted computing is that sealed storage could prevent users from moving sealed files to a new computer. This limitation might exist through either poor software design or deliberate limitations placed by content creators. The migration section of the TPM specification requires that it be impossible to move certain kinds of files except to a computer with the identical make and model of security chip. If an old model of chip is no longer produced, it becomes impossible to move the data to a new machine at all; the data is forced to die along with the old computer.

Moreover, critics are concerned that TPM is technically capable of forcing spyware onto users, with, for example, music files enabled only on machines that attest to informing an artist or record company every time the song is played. In a similar vein, a news magazine could require that, to download its news articles, a user's machine attest to using a specific reader. The mandated reader software could then be programmed not to allow viewing of original news stories to which changes had been made on the magazine's website. Such "newest version" enforcement would allow the magazine to "rewrite history" by changing or deleting articles. Even if a user saved the original article on his or her computer, the software might refuse to display it once a change had been announced.

Loss of Internet anonymity

Because a TC-equipped computer is able to uniquely attest to its own identity, it will be possible for vendors and others who possess the ability to use the attestation feature to zero in on the identity of the user of TC-enabled software with a high degree of certainty.

Such a capability is contingent on the reasonable chance that the user at some time provides user-identifying information, whether voluntarily or indirectly. One common way that information can be obtained and linked is when a user registers a computer just after purchase. Another common way is when a user provides identifying information to the website of an affiliate of the vendor.

While proponents of TC point out that online purchases and credit transactions could potentially be more secure as a result of the remote attestation capability, this may cause the computer user to lose expectations of anonymity when using the Internet.

Critics point out that this could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistleblowing, political blogging and other areas where the public needs protection from retaliation through anonymity.

In response to privacy concerns, researchers developed direct anonymous attestation, which allows a client to perform attestation while limiting the amount of identifying information provided to the verifier.

Proposed owner override for TC

All of these problems arise because trusted computing protects programs against everything, even the owner. A simple solution is to let the owner of the computer override these protections. This is called owner override, and it is currently only outlined as a suggested fix.

Activating owner override would allow the computer to use the secure I/O path to make sure the owner is physically present, and to then let that owner bypass restrictions. Such an override would allow remote attestation to a user's specification, e.g., to create certificates that say Internet Explorer is running even if a different browser is used. Instead of preventing software change, remote attestation would indicate when the software has been changed without the owner's permission.
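
Conceptually, owner override changes the attestation step sketched earlier so that physical presence unlocks an owner-chosen answer. This is a sketch of a proposal only, not of anything in the TCG specifications, and all names are invented.

```python
# Conceptual sketch of owner override, reusing the attestation idea above:
# with the owner confirmed physically present (over the secure I/O path),
# the quote reports an owner-chosen value instead of the actual measurement.
import hashlib

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

attestation_key = Ed25519PrivateKey.generate()

def quote(actual_software: bytes, nonce: bytes, owner_present: bool = False,
          owner_claimed_software: bytes | None = None) -> bytes:
    # Owner present: attest to whatever the owner specifies - e.g. claim
    # Internet Explorer while actually running another browser.
    # Owner absent: attest honestly to the software actually running.
    software = (owner_claimed_software
                if owner_present and owner_claimed_software is not None
                else actual_software)
    return attestation_key.sign(hashlib.sha256(software + nonce).digest())

# The quote claims "internet explorer 6.0" even though Opera is running.
sig = quote(b"opera 8.0", b"challenge", owner_present=True,
            owner_claimed_software=b"internet explorer 6.0")
```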

Some Trusted Computing Group members have viewed owner override as a potential danger to the TC program. Owner override, they believe, defeats the trust in other computers since remote attestation is not enforced centrally. Owner override offers the security and enforcement benefits to a machine owner, but does not prevent another owner from waiving rules or restrictions on her own computer. Under this scenario, once data is sent to someone else's computer, whether it be a diary, a DRM music file, or a joint project, that other person controls what security, if any, their computer will enforce on their copy of those data.

One of the fundamental premises behind trusted computing is that the owner cannot be trusted[citation needed]. It is assumed that the user will—through negligence or willful intent—attempt to compromise their own system. For example, without such enforcement an IT administrator could not ensure that notebook computers are running a specified operating system.

The question of practicality

It has also been compellingly argued that many of the assumptions which underlie TC are impractical in the real world, to the extent that many users will find it pragmatically necessary to employ owner overrides on a regular basis, or simply to decline to use the features altogether, even if this puts them at odds with software vendors who may wish to insist upon their use.

Hard drives and other hardware components (including, presumably, the TC hardware itself) do fail from time to time, or are simply upgraded and replaced. And so, what then? A user might rightfully conclude that the mere possibility of being irrevocably cut off from access to his or her own information, or to years' worth of expensive work products, with no opportunity for recovery of that information, is unacceptable. Legal restrictions on the use and dissemination of information, or mandates that it be reliably stored for periods that may extend many years into the future, may also, it has been argued, preclude the practical application of TC technology in many of the ways now contemplated. The concept of basing ownership or usage restrictions upon the verifiable identity of a particular piece of computing hardware may be perceived by the consumer as inadequately answering the question, "what do I do when it breaks?"

Support

  • The Linux kernel has included trusted computing support since version 2.6.13, and there are several projects to implement trusted computing for Linux. In January 2005, members of Gentoo Linux's "crypto herd" announced[2] their intention to provide support for TC - in particular, support for the Trusted Platform Module. There is also a TCG-compliant software stack for Linux, TrouSerS, released under an open-source license.
  • Some limited form of trusted computing can be implemented on current versions of Microsoft Windows with third-party software.
  • Windows Vista will enable the use of a Trusted Platform Module as a cryptographic provider if one is present in the system, allowing for features such as BitLocker full-drive encryption and Secure Startup.[3]
