San Francisco (CA) – The Core buzz during IDF overshadowed several other important announcements – including the long-awaited release of the LaGrande security spec. The creator of the spec told TG Daily that LaGrande won't be hitting home computers in the near future – as it remains a developing platform that could raise privacy concerns.
If it wasn't directly related to the Core microarchitecture, it wasn't a big deal at IDF this week. The UMPC, touted back at the Fall IDF in 2005 as a major milestone for mobile computers, got about two minutes in Sean Maloney's keynote; LaGrande, yet another member of the "*T's," as Intel calls its platform technologies such as 64-bit extensions (EM64T), virtualization (VT) and active management (AMT), was worth about two sentences in Pat Gelsinger's keynote.
Perhaps LaGrande – or LT for short – was kept fairly quiet because security technology is rarely a sexy topic to discuss in public. But LT was clearly undersold, given its long development history and its implications for computer users.
LT first surfaced in 2002 as part of the short-lived security initiatives of the TCPA (later renamed the Trusted Computing Group, or "TCG") and Microsoft's controversial "Palladium" technology, which eventually evolved into the Next Generation Secure Computing Base ("NGSCB") – an effort that was shelved at least once but now appears to have become part of Windows Vista.
In its early stages, Intel's LT could be viewed as an approach to shift control of how software and content is used on a PC from the user to the owners of that software or content. In the past five years, that message has changed.
In short, LT is designed to protect applications and data from software and simple hardware attacks, according to Intel's David Grawrock, who is not only responsible for the development of LT, but is also the chairman of the TCG. He has filled more than 250 pages of documentation describing how LT works, but told us that the technology can be broken down into three layers in a simple model.
First, there are LT-enabled processors and chipsets as well as the trusted platform module (TPM), which acts as a "safe" for so-called Attestation Identity Keys (AIKs). Second, there is the Virtual Machine Monitor (VMM), the software-based keystone of LT, which acts as a mediator between instances requesting and delivering data. The VMM not only has full control of processor resources, physical memory, interrupts and communication between two parties, it also allows software processes to gain access only when the appropriate process is executing on the CPU, explained Grawrock. Third, there is a layer of independent "partitions" with different sets of data: While a general user partition may set the rules for running an operating system, there can be several other protected partitions, which, for example, determine which antivirus software is installed on a system and how it is used or updated. A user may not know which or how many of such partitions are installed on a corporate LT PC, while the system administrator may not know what the user partition's rules look like. Basically, an LT environment can be described as a system of exchanged trust.
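The three-layer model Grawrock describes can be sketched roughly as follows. This is a simplified illustration only – all class and field names here are our own, not Intel's actual interfaces:

```python
# Illustrative sketch of the LT layer model: TPM, VMM and partitions.
# Names and structures are assumptions for the example, not Intel's spec.

class TPM:
    """Hardware 'safe' holding Attestation Identity Keys (AIKs)."""
    def __init__(self, aiks):
        self.aiks = set(aiks)
        self.active = True          # a deactivated TPM disables attestation

class Partition:
    """An isolated set of data and rules, e.g. an antivirus policy."""
    def __init__(self, name, rules):
        self.name = name
        self.rules = rules          # e.g. {"antivirus": "v2", "updates": "auto"}

class VMM:
    """Software keystone: mediates between partitions, the CPU and the TPM."""
    def __init__(self, tpm, partitions):
        self.tpm = tpm
        self.partitions = {p.name: p for p in partitions}

    def query(self, partition_name, key):
        # Each partition only exposes its own rules; the VMM enforces isolation,
        # so one partition cannot read another partition's configuration.
        return self.partitions[partition_name].rules.get(key)
```

In this sketch, the "exchanged trust" shows up in the mediation: neither the user nor the administrator talks to a partition directly – every lookup goes through the VMM.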
So, how does LT work? A common example of when LT comes into play is when a user requests access to a corporate network. In such a case, the request will cause a protected partition to ask the VMM to check whether the PC is "trusted." The VMM will then send a request to the TPM and check whether certain conditions for network access are met; this involves the identification of the PC via AIKs, but also the verification of, for example, the existence of a certain antivirus configuration. If these conditions are not met – or if the TPM is deactivated – the requested access to the network will be denied.
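The access check just described can be sketched end-to-end as a single function. Again, this is a hypothetical illustration of the logic – the TPM record shape and policy values are assumptions, not Intel's actual protocol:

```python
# Illustrative sketch of the LT network-access check described above.
# The dict layout and policy values are assumptions for this example.

def check_network_access(tpm, required_aik, required_antivirus):
    """Grant access only if the TPM is active, the PC identifies itself
    with a known AIK, and the required antivirus configuration is present."""
    if not tpm.get("active", False):            # deactivated TPM -> deny
        return False
    if required_aik not in tpm.get("aiks", ()): # PC cannot identify itself
        return False
    return tpm.get("antivirus") == required_antivirus

tpm = {"active": True, "aiks": {"aik-42"}, "antivirus": "corp-av-2.0"}
check_network_access(tpm, "aik-42", "corp-av-2.0")                       # -> True
check_network_access({**tpm, "active": False}, "aik-42", "corp-av-2.0")  # -> False
```

Note that the deny path is the default: any missing condition, including a switched-off TPM, results in refused access, which matches the behavior the article describes.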
It is this very specific outside access to a PC that will prevent LT from making its way into home computers in the near future. A service – perhaps an antivirus service – could configure protected partitions with information the user isn't aware of. According to Grawrock, such scenarios will raise privacy concerns: While the consumer could always trust the service he is signing up for, it still interferes with his private space. Grawrock believes that a consumer version of LT will look very different from the LT announced during IDF. "LT is certainly years away from the consumer PC," he said.
If LT can control which data is accessed, then this of course also implies digital rights management (DRM) functionality. Grawrock conceded that the technology could be used to protect digital content, but stressed that LT was never developed for this purpose: "It is really designed to prevent software attacks." LT provides increased and more "convenient" security due to its hardware components and therefore could be used for DRM, but "LT does not provide 100% security. Nothing does," he said.
Even in the corporate world, where LT may provide a real security benefit for networks, he considers LT to be just a "model" for now. "We will need the feedback of IT departments that are running LT to improve the technology," he said. "We are going into security and we will be hitting privacy issues. But it's a journey and we will be getting better as we go along."
According to Grawrock, processors with LT capability will be available in the second half of this year. Current Intel roadmaps do not list LT as a feature of upcoming processors, but given the technology's target audience, we assume it will aim for the Averill and Averill Pro business platforms for client computers, which will include Pentium D 900, Pentium 4 600 and Conroe E4000/6000 processors as well as the Q963/Q965, 946 and 975X chipsets.
Intel’s LaGrande trusted platform steers away from DRM