Trusted Computing platform, DRM coming to hard drives

San Jose (CA) – As part of a series of announcements made this week at the annual RSA security conference, the Trusted Computing Group announced it will publish a specification that extends the reach of the so-called Trusted Platform to the hard disk drive. With the TCG Storage Work Group’s implementation of Trusted Platform Module version 1.2 comes at least the technical possibility that a single, fixed industry standard could emerge for digital rights management at the storage device level.

Last October, the TCG announced its Mobile Work Group would be publishing a similar implementation of TPM for cell phones and handsets. This week’s announcement from the Storage group is similar in scope, but could have much farther-reaching consequences. With hard disk drives serving as the principal data storage devices not only for PCs, but also for DVRs such as TiVo, new home media center computers, and a multitude of small handheld devices including the iPod, the possibility exists for a new breed of storage devices with rights management built in. Such devices could, in effect, relieve operating systems and set-top box firmware of the responsibility for implementing DRM, at the very time when the consumer electronics industry remains at loggerheads over standards for adapting system firmware to evolving DRM specifications.

“In any content protection system, it’s that last little inch that’s always the sensitive area,” Michael Willett, senior director of research at Seagate Technologies, and co-chair of the TCG’s Storage Work Group, told TG Daily. Already, he argued, we know how to communicate content, such as streaming media, across a network from point to point, but those points have historically been processors. There’s a mechanism that exists in an HDD between the I/O processor and the read/write head, and unless that’s secured, that small distance becomes the weak link in the chain. “It’s that last inch, or half-inch, of movement and manipulation that’s always been the sensitive aspect of some of these control systems. Now that you’ve got the sensitive and secure computation right on the drive media itself, you maybe are closing that little half-inch.”

The objective of the Trusted Computing Platform is to specify at least one element of a computing system that cannot be changed by the outside world. In that element, TC architecture would embed a program capable of generating authentication code that exclusively identifies the system. This immutable code, to use the TC term, could then use the system identity as a key for encrypting all data communications between itself and any other device that identifies itself in the same manner. This way, no communication from the outside world could interfere with the interaction between two devices and successfully report itself as one or the other device. All communication over such a channel would be trusted, because the identity of its sources could always be ascertained and verified.
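
To make the idea concrete, here is a minimal sketch of that pattern. It is illustrative only, not the TCG protocol: the class names, the use of HMAC for the challenge/response, and the key-derivation step are assumptions chosen for brevity.

```python
import hashlib, hmac, os

# Illustrative sketch only (not the TCG specification): an immutable element
# holds an identity key it never reveals, proves its identity by answering a
# fresh challenge, and the verified identity then seeds the channel key.

class ImmutableElement:
    def __init__(self, device_id: bytes, identity_key: bytes):
        self.device_id = device_id
        self._identity_key = identity_key            # fixed at manufacture, never exported

    def respond(self, challenge: bytes) -> bytes:
        # Answer the challenge to prove possession of the identity key.
        return hmac.new(self._identity_key, challenge, hashlib.sha256).digest()

def verify_and_key_channel(known_key: bytes, element: ImmutableElement) -> bytes:
    """Challenge/response identity check, then derive a key for all further traffic."""
    challenge = os.urandom(32)
    expected = hmac.new(known_key, challenge, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, element.respond(challenge)):
        raise RuntimeError("identity check failed")
    # Everything sent on the channel is now keyed to the verified identity.
    return hashlib.sha256(element.device_id + challenge).digest()
```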

It is one of the most laudable ideas ever to emerge from the field of computing; but partly because of the way it has been implemented, and partly due to its own destined-to-be-Orwellian title, Trusted Computing has drawn considerable skepticism, much of it from very reputable sources. Much of the argument against TC architecture boils down to a notion that the Trusted channel of communication, due to its own impenetrability, creates a kind of back-alleyway within people’s own PCs where undetectable programs may lurk, placed there by any number of Powers That Be.

But this week’s development could actually change that worst-case landscape…perhaps, some would argue, for better or worse. Seagate’s Willett makes a very compelling case for distributing TPM 1.2 resources, with the possible effect of neutralizing some worst-case scenarios. Up to this point in the history of computer architecture, he said, the PC motherboard has been considered the most trustworthy device in the system, because it contains a degree of immutable hardware, and immutability is the essence of technological trust. But there’s another immutable element in the system, he points out, and that’s the class of product his company produces: “The drive has always had a full-blown processor,” Willett told TG Daily. “There’s a computer in there that has its own software, which is the firmware. It’s loaded at the factory, and traditionally, we don’t allow that to be changed in the field.” In other words, you can’t flash the ROMs of a hard drive through a network utility – at least not yet.

Furthermore, Willett argued, an HDD has its own internal memory, which cannot be addressed by the CPU of a PC. Its typical use, he explained, is for the drive to keep track of its own mapping and the locations of its bad sectors. “So one of the cornerstones of the architecture we’ve done in the Storage Workgroup [is], we have partitioned that hidden memory in what are called security partitions. For each security partition, you can define a set of functions that are all part of the architecture, like cryptographic functions, storage functions, administrative functions, that you can bundle [together].” From there, an API can be used to make that bundled set of cryptographic functions addressable from an operating system, but only through a level of indirection – that is, using the API as a go-between, so that it never breaches the immutable regions of the HDD’s firmware.
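
The partitioning scheme Willett describes can be pictured with a short, hypothetical sketch. The partition names, the function table, and the call entry point below are invented for illustration; only the general shape – bundled functions reached through a single level of indirection – follows his description.

```python
import hashlib, os

# Hypothetical illustration of the security-partition idea: each partition owns
# a slice of hidden, drive-internal memory, and the host reaches the bundled
# functions only through one indirection layer, never by direct addressing.

class SecurityPartition:
    def __init__(self, name: str):
        self.name = name
        self._hidden_store = {}                      # not addressable by the host CPU

class DriveSecurityAPI:
    """The single entry point an operating system driver would talk to."""
    def __init__(self):
        self._partitions = {"crypto": SecurityPartition("crypto"),
                            "storage": SecurityPartition("storage")}
        self._functions = {
            ("crypto", "hash"):   lambda data: hashlib.sha256(data).digest(),
            ("crypto", "random"): lambda n: os.urandom(n),
            ("storage", "put"):   self._put,
            ("storage", "get"):   self._get,
        }

    def _put(self, key, value):
        self._partitions["storage"]._hidden_store[key] = value

    def _get(self, key):
        return self._partitions["storage"]._hidden_store.get(key)

    def call(self, partition: str, function: str, *args):
        # The level of indirection: callers name a partition and a function,
        # never an address inside the firmware or the hidden memory itself.
        return self._functions[(partition, function)](*args)

# Example: an OS driver storing the bad-sector map and hashing a license blob.
api = DriveSecurityAPI()
api.call("storage", "put", "bad-sector-map", b"\x00\x17\x2a")
digest = api.call("crypto", "hash", b"license blob")
```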

As a result of this, Willett proposes, a hard drive controller has every right to be considered a root of trust in the TC scheme, just as the TPM module on the motherboard is. “The two basic characteristics of the TPM,” he said, “[are] the ability to do signing, and the ability to be non-changing. We’re mimicking those characteristics in the hard drive. We’ll have a root of trust [there], so certain parts of the hard drive will be immutable, non-changing, [including] certain aspects of the firmware…And the minimal implementable complement of security functions – like digital signing, random number generation, hashing, secure storage, those sorts of things – will be in the hard drive.”
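
That minimal complement of functions is easy to mock up. The sketch below is an assumption-laden stand-in for whatever interface the Storage Work Group actually specifies; it simply shows the four capabilities Willett lists living behind one object whose signing key never leaves it.

```python
import hashlib, hmac, os

# Hedged sketch of the minimal function set named above: signing, random number
# generation, hashing, and secure (sealed) storage. Class and method names are
# illustrative, not drawn from the TCG specification.

class DriveRootOfTrust:
    def __init__(self):
        self._signing_key = os.urandom(32)   # stands in for a key fixed at the factory
        self._sealed = {}                    # secure storage in hidden drive memory

    def sign(self, message: bytes) -> bytes:
        return hmac.new(self._signing_key, message, hashlib.sha256).digest()

    def random(self, n: int = 32) -> bytes:
        return os.urandom(n)

    def hash(self, data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def seal(self, name: str, secret: bytes) -> None:
        self._sealed[name] = secret          # never readable through normal block I/O

    def unseal(self, name: str) -> bytes:
        return self._sealed[name]
```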

Does Trusted Computing now provide security for the content providers or from them?

The architecture that Seagate’s Willett illustrated for us makes feasible the following scenario: a standard API for DRM functionality, implemented across the board by hard drive manufacturers, could be exposed to drivers in the operating system. This API could then replace a broad range of redundant, competing DRM schemes currently implemented in software, especially those used by iTunes, Windows Media, and licensed P2P services such as Peer Impact.

In that scenario, the computer – or, for that matter, the DV-R or media center – would contain two roots of trust, with functionality split between them, distributed to where they can be most effective. “The two roots of trust have the basic building blocks of authentication, identity, and communication,” Willett explained, “because you’re doing signing, challenge/response, seed generation, cryptography. So between the hardware root of trust in the drive and the hardware root of trust in the platform, you first would establish secure communications. So we define a secure messaging protocol between the two, and then you’re off and running. Once you have that level of authentication, then each command that’s issued has its own authentication.”
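
A toy version of that flow might look like the following. The pairing secret, the counter-based replay protection, and the HMAC tags are assumptions standing in for the real challenge/response and key-exchange machinery; only the overall shape – establish a session, then authenticate every command – comes from Willett’s description.

```python
import hashlib, hmac, os

def derive_session_key(pairing_key: bytes, nonce: bytes) -> bytes:
    # Both roots of trust compute the same key from the pairing secret and a fresh nonce.
    return hashlib.sha256(pairing_key + nonce).digest()

class AuthenticatedCommandChannel:
    """Once the session exists, every command carries its own authentication tag."""
    def __init__(self, session_key: bytes):
        self._key = session_key
        self._counter = 0                            # defeats replay of captured commands

    def send(self, command: bytes):
        self._counter += 1
        msg = self._counter.to_bytes(8, "big") + command
        tag = hmac.new(self._key, msg, hashlib.sha256).digest()
        return msg, tag                              # what actually crosses the bus

    def verify(self, msg: bytes, tag: bytes) -> bool:
        expected = hmac.new(self._key, msg, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)

# Usage: platform and drive derive the same session key, then commands are tagged.
nonce = os.urandom(16)
platform = AuthenticatedCommandChannel(derive_session_key(b"pairing-secret", nonce))
drive = AuthenticatedCommandChannel(derive_session_key(b"pairing-secret", nonce))
msg, tag = platform.send(b"READ_LICENSE sector=42")
assert drive.verify(msg, tag)
```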

From the TPM’s perspective, it may be as if no split existed between the devices at all, or perhaps as though the motherboard or STB firmware and the hard drive were the only two devices in the universe. Already, Willett said, media center PC manufacturers are implementing TPM platforms in their motherboards. As a result, he predicted, “you’re going to see an interplay between the storage media – our storage devices – and those platforms. Whether they be set-top boxes, or TiVo, or full-blown media servers, they’re all going to have this roots-of-trust concept, and this dialog, this split functionality between roots of trust.”

The most logical place for content protection schemes to reside, Willett’s argument continues, is the hard drive. “In the area of content protection, licenses and content [become] stored on a hard drive. So when you look at a full-blown content protection and licensing mechanism, and you have the freedom to use computation in the drive and in the platform, there’s a very natural split. Some of the cryptography, or some of the license manipulation, could go directly onto the hard drive, and not have to keep going back over to the platform, back and forth.”
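
As a rough illustration of that split, consider the sketch below, in which the license state lives entirely with the drive and the host only ever sees an allow-or-deny answer. The store, its simple play-count model, and the method names are hypothetical.

```python
# Hypothetical sketch of on-drive license handling: the decision is made where
# the license is stored, so license state never shuttles back and forth to the host.

class OnDriveLicenseStore:
    def __init__(self):
        self._licenses = {}                  # held in hidden drive memory

    def install(self, content_id: str, plays_allowed: int) -> None:
        self._licenses[content_id] = plays_allowed

    def authorize_play(self, content_id: str) -> bool:
        # The drive checks and updates the count itself; the host sees only yes/no.
        remaining = self._licenses.get(content_id, 0)
        if remaining <= 0:
            return False
        self._licenses[content_id] = remaining - 1
        return True

store = OnDriveLicenseStore()
store.install("movie-001", plays_allowed=3)
print(store.authorize_play("movie-001"))     # True; the count is decremented on the drive
```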

This could conceivably change not only where the DRM is placed, but who places it there, and who controls it after it’s there. Within this Trusted Platform could be the keys to the digital media kingdom; and now, even hard drive manufacturers are racing to be the ones to secure them. But to accomplish this, Willett knows, requires the cooperation of a body of historically non-cooperative parties: the content producers. “You understand how traditionally paranoid [they are] – from the content producers, to the distributors, to the content renderers – how the twain shall never meet, right? How vertically non-integrated they are?” Willett asked.

What would sway content producers to adopt HDD-driven DRM, argued Willett, is what has always swayed their opinion: cost reduction. Market forces will enter the picture, he said, along with increasing attempts by malicious users to destroy the chain of trust. “On the one hand, [content producers] are going to be looking for a stronger technology,” remarked Willett; “on the other, they’re going to be looking for cost savings. So content owners, distributors, box manufacturers, and renderers – the whole chain – are looking for cost savings and integration all the way along the whole lifecycle.”

All of a sudden, rather than TPM being the device that carries forth the ill will of the content empire, a technological evolution could actually serve to separate the two parties. Call it a “functionality split,” to coin a phrase, where the content providers may very well find themselves succumbing to a simpler technological solution that, as both the providers and the manufacturers may discover, becomes too expensive for them not to implement.