A comparison of next-gen display interfaces
Indianapolis (IN) – The world of graphics technologies continues to change at a phenomenal rate. It is often difficult for the average consumer to keep up with all of the advancements. Buzzwords like DVI, HDMI, UDI and DisplayPort are all the rage. But what is behind those phrases, and how do they relate to one another? We have compiled the data into one comprehensive, easy-to-read overview.
If you have been following technology in recent months, then there is a good chance that you have been confronted with one of these display interface technologies, and a good chance that you have had no idea what these standards are about or how they compare with each other. A recent announcement of a DisplayPort display by Samsung prompted us to compare the standards feature by feature.
Let’s have a closer look.
Each of these buzzwords we hear is connected back to physical pieces of hardware. But it's not just about the hardware. There are standards behind those acronyms which basically correlate to a "means to an end." In this case that is: how do we get multimedia signals from point A to point B as fast as possible?
There are many mechanical qualities for interconnects which must be precisely defined in a standard. Wrapper names like DVI, HDMI, UDI and DisplayPort all have internal qualities which make them unique and distinct. The definitions of these specs are often hundreds of pages long. They include some things most of us would probably never consider, like the following:
Contact Resistance – How hard do the pins inside the connector physically make contact? How much electrical resistance is there in the connection?
Mating/Unmating force – How hard is it to insert/remove the interconnect?
Durability – After inserting and removing 100 times, how does it hold up? Not just the interconnect, but also the “male and female” components of the pins.
Thermal shock – Suppose you pull your monitor out of your car where it's been sitting all night in the icy cold. When you plug it in, will the interconnect work?
Cyclic Humidity – Over time as humidity levels rise/fall, what is the effect?
Vibration – If there is vibration from whatever source, will the interconnect disengage?
Mechanical Shock – Suppose the interconnect is dropped or dragged along the floor as the monitor was being carried from one room to another. Can it take it?
Electrostatic discharge – Can the device take a powerful jolt?
Other components are more along the lines of what we think of when we consider video interconnect standards. These are things like cables. How long can they be? How much bandwidth is supported? And from the purely end-user point of view, what does the standard do for my multimedia experience?
Each of the standards mentioned so far comprises a family of abilities. On the next page we'll look at some abilities which might be most important to you. You should also be able to see the evolution of progress made over time. DisplayPort, for example, is the newest technology. Naturally, it should be the most comprehensive. But is it? Let's look at the side-by-side comparison on the next page.
Read on the next page: Breakdown comparison chart of DVI, HDMI, UDI and DisplayPort
Breakdown Comparison Chart
| |DVI|HDMI|UDI|DisplayPort|
|---|---|---|---|---|
|Introduced|Apr 2, 1999|Dec 9, 2002|Jun 16, 2006|May 2006|
|Last Change|Apr 2, 1999|Nov 10, 2006|Jul 12, 2006|Mar 19, 2007|
|Impetus|Visual|Visual/Audio|Visual/Audio|High-speed, flexible wrapper for visual/audio + data|
|Controlling Authority|Digital Display Working Group|Digital Display Working Group|UDI Promoters|VESA|
|Type|Proprietary, free|Proprietary, fee-based|Proprietary, free|Open, free|
|Audio|No|8-channel, 192 kHz, 24-bit uncompressed|8-channel, 192 kHz, 24-bit uncompressed|8-channel, 192 kHz, 24-bit uncompressed|
|Data|No|Limited|Limited|1 MB/s dedicated + lane space|
|Security|40-bit HDCP|40-bit HDCP|40-bit HDCP|128-bit AES DPCP & 40-bit HDCP|
|Max bits/pixel|24 (48 allowed, but not officially defined)|48|36|48|
|Max Resolution|2560 x 1600|2560 x 1600|2560 x 1600|2560 x 1600|
|Min Resolution|640 x 480|640 x 480|640 x 480|Zero (video data is optional)|
|Max Refresh (Hz)|120|120|120|Variable, up to 120|
|Min Refresh (Hz)|60|50|60|Zero|
|Max Pixel Clock|340 MHz in dual-link mode (165 MHz single-link)|340 MHz in dual-link mode (165 MHz single-link)|At least 414 MHz|At least 450 MHz|
|Min Pixel Clock|25.175 MHz|25.175 MHz|25.175 MHz|Zero|
|Max Bandwidth|3.96 Gbps (10.2 Gbps in dual-link mode)|3.96 Gbps (10.2 Gbps in dual-link mode)|16 Gbps|10.8 Gbps|
|Rigid Clock Signal|Yes|Yes|Yes|No|
|Audio Included|No|IEC 61937, up to 6.144 Mbps|Indirectly (via HDMI)|IEC 60958, up to 6.144 Mbps|
|Signal Repeater Defined|No|Yes|Yes|Yes|
|Supporters|Intel, Compaq, Fujitsu, Hewlett Packard, IBM, NEC and Silicon Image|Hitachi, Matsushita, Philips, Silicon Image, Sony, Thomson, Toshiba|Apple, Intel, LG, National Semiconductor, Samsung, Silicon Image|Agilent, AMD, Apple, Dell, Hewlett Packard, Intel, Lenovo, Molex, NVIDIA, Philips, Samsung and several others; officially endorsed by VESA as the new standard|

Notes:
- Data: DisplayPort does not require video or audio data.
- Security: HDCP is a fee-based encryption protocol. DPCP, by Philips, is free.
- Max Resolution: higher custom resolutions may also be available.
- Min Resolution: only computer video modes are shown.
- Refresh: interconnects used for TV signals can clock as low as 24 Hz.
- All standards support video modes below 1080i.

It should also be noted that each specification is nearly identical to the others in theoretical maximum limits. The number of pins and the defined implementation are the only real limits. DisplayPort is the most flexible definition because of its packaging system for data: it speaks to the future needs of variable payloads, not just audio and video. The other standards could also be redefined to move data in this way, but currently are not.
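The chart's bandwidth figures are easy to sanity-check against the shared 2560 x 1600 maximum. Here is a back-of-the-envelope sketch (it deliberately ignores blanking intervals and line-code overhead, which consume real bandwidth on the wire, so actual link requirements are somewhat higher):

```python
def required_bandwidth_gbps(width, height, bpp, refresh_hz):
    """Raw pixel-data rate in Gbps, ignoring blanking and line-code overhead."""
    return width * height * bpp * refresh_hz / 1e9

# 2560 x 1600 at 60 Hz with 24-bit color -- the maximum every
# standard in the chart shares.
rate = required_bandwidth_gbps(2560, 1600, 24, 60)
print(f"{rate:.2f} Gbps")  # 5.90 Gbps
```

Roughly 5.9 Gbps of raw pixel data explains why single-link DVI (3.96 Gbps) tops out below this mode while dual-link DVI, UDI and DisplayPort clear it comfortably.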
Read on next page: Background - Display interface considerations and standards
So far we've learned there have been many different video modes evolving over time. Each of those video modes adhered to not only a visible standard on-screen, but also to an electrical one. The electrical standard dictated how those video signals got from the video card to the display.
In the past we've seen evolutions in interconnect technologies as well. We started with RF signal cables and 9-pin forms for MDA, Hercules and CGA. Later, the VGA brought us a 15-pin standard form used for 20+ years. But video demands and abilities are increasing almost exponentially. We're moving away from analog signals to pure digital ones. This means new interconnects, new standards, new proposed solutions. And each one listed in this article has people wanting it to be the one adopted.
This decade has seen many advances in graphics technology. The big 3D push at the end of the 1990s fueled an entire industry toward performance and new abilities. It all means more visual data to process in real time. To accommodate that explosive need, today we have several interconnect options: DVI, HDMI, UDI and DisplayPort. It's a veritable alphabet soup! Let's look at some of the basic qualities of these video interconnect standards to see why they're desirable.
Lossy or Lossless
Lossy and lossless are the two basic types of video format. Lossy compression takes advantage of the eye's ability to be tricked in certain ways. It removes some information from images (such as color, detail or contrast), hence the name: lossy. The result is visual data requiring less space to store, but which can be viewed without apparent or significant visual loss by the user. This is how formats like JPEG gain their high compression ratios. MPEG works similarly for moving pictures (as in MP4).
The other form is called lossless. Lossless images are always conveyed exactly. This often comes at the expense of a lot of unnecessary visual data the human eye can't really see. But, when you're dealing with video transmission at scores of frames per second (60-100), lossless images are desirable. The only inexpensive and reliable way to convey lossless images today for video is digitally. So any future standard must include a solid digital component.
Independent of display
One of the advantages of a video transmission standard is that if it's defined and employed properly, it doesn't really matter what's generating or receiving it. Each piece of equipment simply does its part, knowing that if the other piece of equipment is also doing its part then the system will work. This allows video cards to drive capture devices, projectors, LCD monitors, CRT monitors, etc. It's independent because it's based on the standard. This also reflects the importance of the standard's underlying reliability and ease of use. Multiple things will need to be driven in the future. This means we need simple, easy to use adapters and cabling.
Support of VESA standards
The Video Electronics Standards Association, or VESA, has created some basic communication protocols which convey a type of "image metadata." This data is transmitted back and forth between source and receiver. This is one way modern operating systems can automatically determine what monitor is connected to a machine. By following and utilizing these standard protocols, products gain wider acceptance. All modern interconnect standards follow VESA.
VESA provides the DDC (Display Data Channel), EDID (Extended Display Identification Data), VSIS (Video Signal Standard) and DMT (Monitor Timing Specifications). These are all used to convey information about what both the source and receiver are capable of doing and what they are currently doing.
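To give a flavor of what this metadata looks like on the wire: a base EDID block is 128 bytes, begins with a fixed 8-byte header, and carries a final checksum byte chosen so that all 128 bytes sum to zero modulo 256. A minimal validity check might look like this (a sketch; real parsers decode many more fields):

```python
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def is_valid_edid_block(block: bytes) -> bool:
    """A base EDID block is 128 bytes, starts with the fixed header,
    and all 128 bytes sum to zero modulo 256 (last byte is a checksum)."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

# Build a minimal dummy block: header, zeroed body, then a checksum
# byte that makes the total wrap to zero.
body = EDID_HEADER + bytes(119)
checksum = (-sum(body)) % 256
print(is_valid_edid_block(body + bytes([checksum])))  # True
```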
Plug and Play
The PnP model is really quite something. It's much more complex than most people realize. Thanks to VESA support, automatic queries are made when a newly connected display is identified. These queries instruct the video card to alter the data it generates (the video signals themselves). This alteration compensates for known limitations of whatever the display technology happens to be. Video card makers have often gone to great lengths to ensure their products provide the richest possible colors for the end-user. Were it not for these built-in PnP abilities, many of our monitors would look far worse than they do when we first plug them in. All modern interconnect standards support VESA, and therefore PnP.
Gamma Correction
Most gamers will view gamma correction as the ability to change brightness or contrast, allowing previously invisible game components to become visible. However, gamma correction in video technology is actually a science in and of itself.
Software developers use precise mathematical formulas to determine colors. This is extremely convenient because those formulas are perfectly linear in nature. A programmer sees a value of 5 as being exactly half the intensity of the value 10 (on a 1-10 scale). But video and display technologies do not relate so linearly to the physical hardware. The brightness a display produces follows a power law of the signal it receives (a "gamma" of roughly 2.2), so the signal must be pre-compensated to achieve the correct brightness on the end device. The video standard employs gamma solutions which compensate for this electrically. The interconnect architecture must allow for these signals at the fast pixel clock rates high-end users desire.
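As a small sketch of the pre-compensation, assuming the common gamma value of 2.2 (the exact exponent varies by standard and display):

```python
def gamma_encode(linear, gamma=2.2):
    """Convert a linear intensity (0.0-1.0) into the non-linear signal
    value actually sent toward the display."""
    return linear ** (1.0 / gamma)

def gamma_decode(signal, gamma=2.2):
    """The display applies the inverse: output brightness = signal ** gamma."""
    return signal ** gamma

# A programmer's "half intensity" of 0.5 must be sent as a much
# stronger signal (about 0.73) so the display emits half brightness.
print(round(gamma_encode(0.5), 2))  # 0.73
```

Encoding and decoding round-trip cleanly, which is exactly the transparency the standards aim for: the programmer's linear scale survives the non-linear hardware in between.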
Update: A commenter named Kirmeo wrote in with a more detailed explanation of the gamma correction process and the reasons it exists. He also recommended a book called "Transmission and Display of Pictorial Information" by D. E. Pearson (ISBN-10: 0470675551, ISBN-13: 978-0470675557, about $65) for anyone wanting the full low-down on this kind of technology (only available in print). He describes the gamma process as the reduction of noise in lower-strength signals (darker colors). By allocating more of the signal's available bandwidth to the lower portions of the spectrum, less noise winds up being visible in those signals. The gamma correction logic relates to this process, as well as to re-normalizing the signal at the receiver. In addition there are non-linear characteristics of physical displays; these are typically handled completely transparently by the display device itself. Also, the human eye does not perceive light intensity in a linear manner, and gamma accounts for that. All of these factors come into play when taking the programmer's ideal 1-10 scale and making it visibly appear as 1-10 in brightness, as was desired.
Fully Digital
There has been a huge push toward fully digital signals in recent years. They are the only true way to ensure lossless video conveyance. The future is definitely headed toward being completely digital. All standards developed in the last decade have included that forward-looking reality. And any interconnect standard which is chosen must support digital multimedia without question.
Analog Support
If digital is the new thing, then why support analog at all? Most of the modern specs we have today are primarily digital. But they also allow pass-through analog signals for backward compatibility. They do this to save us money. Many of us probably have analog monitors. We probably also have video cards capable of emitting full digital signals. However, because the huge monitor base out there does not have full digital abilities, analog is still used even on brand new cards. So, we plug in our dongles.
Because there is such a huge base of analog monitors out there, any interconnect standard must support analog signals. It will not be widely accepted otherwise.
Color Depth
It might be surprising to learn that interconnect standards contain color depth limitations. While both the video card and user might want to generate 48-bit color data, the physical copper wires can't always convey that much data over long distances. High color depths can be conveyed via any given interconnect, but doing so often requires additional interconnect pins. Those requirements must be written into the spec precisely. Every pin must be defined so that when we plug in our devices they just work. For us, it's completely transparent. But without the data in that spec, it wouldn't work at all.
Bandwidth
Different standards allow for different levels of bandwidth. The generally accepted industry base for computers is the traditional 640x480 at 60 Hz video mode. For TVs we have all kinds of standards depending on what you're after. The pixel clock rates for all conceivable frequencies must be accommodated by the spec. These are not things the designers can hope will work; they must know they will work. All standards in place today have rigorous testing procedures by industry experts. These ensure that everything which is spec'd out will work like it should.
Free or Open Standard
This is a growing concern in the industry. There are enough talented people working on video standards today that a uniform base is required. So the question becomes: is the specification in the public domain? Can any corporation just sign up and use it without paying royalties? I was surprised to learn that most popular forward-thinking interconnects today contain some components which are still not free. DisplayPort is the only standard which offers truly free use. The HDCP encryption protocol, used by DVI, HDMI and UDI and recently added to DisplayPort as an option, requires royalty payments. Philips' DPCP encryption protocol, currently used only by DisplayPort, is not only stronger but also free.
The push is definitely for open, free standards. The newest member of the club and the one recently accepted by VESA as the new standard, DisplayPort, addresses that fact throughout its entire design.
Read on the next page: Background - video standards
To understand why we are at a particular place, it's often very beneficial to look back at the steps which got us here. We didn't suddenly arrive at the point where 1080p was highly desirable for no reason. We learned over time that the human eye has a particular interest in seeing better quality images. And the tradeoff we have today is cost for quality. 1080p is a good, happy medium and a quality that's likely to remain near standard for at least several years.
To understand what has happened to get us here we must go back to the very beginning. Consider the baseline evolution of the processing computer. The earliest machines were not at all like the computers we use today. The early movies aside, displays back then were little more than indicator lights. They would signal things like: Was there an error? To find out the operator need only look for a flashing red light near a permanent label reading “failure” or “error”. For the original machine experts of that day, this was sufficient. They could do many jobs much faster with computers than without. And that was even without proper displays. But, the truth is we are visual creatures. And we wanted more.
MDA, Hercules and CGA
The first general-purpose display adapter standards which received wide acceptance for PCs came just after the original 8088 CPU was produced. They were the MDA, CGA and Hercules graphics adapters.
The MDA allowed text only, via a 9x14 pixel box character. It had 80 columns across and 25 lines per screen, and used 4 KB of display memory per page: one character byte and one attribute byte per cell. It supported 256 uniquely defined characters (the ASCII character set) and limited attributes such as invisible, underline, normal, bright, reverse display and blinking. This early text was seen primarily on green phosphor monitors.
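The 4 KB figure falls straight out of the layout: 80 x 25 cells, two bytes each. A small sketch (the function name is our own):

```python
COLUMNS, ROWS = 80, 25

def mda_offset(row, col):
    """Byte offset of a character cell in MDA text memory: each cell
    is two bytes, a character code followed by an attribute byte."""
    return (row * COLUMNS + col) * 2

# One full page: 80 x 25 cells x 2 bytes = 4000 bytes, hence "4 KB".
print(mda_offset(ROWS - 1, COLUMNS - 1) + 2)  # 4000
```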
The Hercules Graphics Card was a notable monochrome advancement. While still compatible with MDA text, it also allowed pictures. With 720 x 350 pixels per page, individual dots could be turned on and off as necessary, allowing simple graphics. It required 32 KB of memory per page, and allowed for two pages of display memory with a typical 64 KB setup. Each byte of display memory represented 8 on-screen pixels in a horizontal line. The Hercules card used an 8 KB stride for display memory, meaning the display image was not arranged internally as a contiguous set of bits. It also had a limiting factor in that bits were either on or off in graphics mode. There were no bright or dim attributes per bit, only a black screen with green dots arranged in some fashion. For brightness effects, dithering was commonly used. While it would certainly be undesirable compared to what we use today, at that time it was very powerful.
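The 8 KB stride means pixel rows are interleaved across four banks rather than stored contiguously. A sketch of the commonly documented addressing scheme (hedged: exact details varied, and the function names here are our own):

```python
BYTES_PER_ROW = 720 // 8  # 90 bytes: each byte holds 8 one-bit pixels

def hercules_offset(x, y):
    """Byte offset of pixel (x, y) in Hercules graphics memory.
    Rows are interleaved across four 8 KB banks (y modulo 4 picks
    the bank), which is why the image is not contiguous in memory."""
    return 0x2000 * (y % 4) + BYTES_PER_ROW * (y // 4) + x // 8

def hercules_mask(x):
    """Bit mask selecting pixel x within its byte (bit 7 is leftmost)."""
    return 0x80 >> (x % 8)

# Turning a pixel on would then be: memory[hercules_offset(x, y)] |= hercules_mask(x)
print(hex(hercules_offset(0, 1)))  # 0x2000 -- row 1 lives a full bank away from row 0
```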
The Color Graphics Adapter (CGA) came about the same time as the MDA. It allowed for the same basic abilities, but with color. A video mode of 160x200 with 16 basic colors was possible, comprised of the various red, green and blue combinations, along with a 320x200 mode with four colors. In text modes, each character on screen was tagged with an attribute byte indicating its foreground and background colors. This allowed for 8 different background colors, 16 foreground colors and either blinking or not blinking. The CGA adapters had a notable problem with the way they accessed memory during refresh cycles; as a result, special programming logic had to be used to keep what was called "snow" from appearing during direct screen writes. The CGA allowed for simple graphics abilities, and many early video games were created on the original 4.77 MHz 8088 microprocessor. CGA text looked identical to MDA text, except in color.
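The attribute byte described above is simple to pack, as commonly documented: four bits of foreground, three of background, one blink bit. A sketch (the function name is our own):

```python
def cga_attribute(foreground, background, blink=False):
    """Pack a CGA text-mode attribute byte: bits 0-3 foreground
    (16 colors), bits 4-6 background (8 colors), bit 7 blink."""
    assert 0 <= foreground <= 15 and 0 <= background <= 7
    return (0x80 if blink else 0x00) | (background << 4) | foreground

# Bright white (15) on blue (1), not blinking
print(hex(cga_attribute(15, 1)))  # 0x1f
```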
The Enhanced Graphics Adapter (EGA) came out shortly thereafter. It was really this card that began to take things to a new level. This was the first time images could be displayed which began to look real. There were 16-color modes available in both 640x200 and 320x200, along with a very powerful 640x350 mode with 16 colors (chosen from a palette of 64). Other EGA offshoots improved on the standard over time and gave us 640x400, 640x480 and even 720x540 video modes. These efforts revealed one thing clearly: people wanted better graphics.
Video Graphics Array (VGA)
Most of us would consider the VGA to be the true baseline. Introduced by IBM in 1987, this standard put graphics on the map. It set us on the path that brought us to where we are today. We're viewing this page over the Internet using technology which superseded the VGA.
When we install new versions of Windows, Linux or other graphical operating systems the default video mode used is often VGA. It's a 640x480 16 color display mode which is easy to program and resides within a single 64KB chunk of memory beginning at 0xa0000 on a PC. The VGA also introduced the highly enjoyable 320x200 256 color mode (13h) for games. This also fit in the same 64KB chunk and allowed for the popular early 2D games like Commander Keen.
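Part of why mode 13h was so popular with game programmers is its trivially simple memory layout: one byte per pixel, rows stored contiguously. A sketch (this just computes offsets; actually poking video memory at segment 0xA000 requires real-mode DOS):

```python
WIDTH, HEIGHT = 320, 200

def mode13h_offset(x, y):
    """In VGA mode 13h every pixel is one byte and rows are contiguous,
    so addressing from the start of segment 0xA000 is a simple
    linear calculation."""
    return y * WIDTH + x

# The whole frame is 320 x 200 = 64000 bytes -- it fits in one 64 KB segment.
print(mode13h_offset(WIDTH - 1, HEIGHT - 1) + 1)  # 64000
```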
The VGA standard provided a baseline which still needed to be extended. The technology was somewhat slow to catch up with desire. This was due primarily to cost. However, many different video card makers tried to get their standard out front.
Super VGA (SVGA)
The SVGA standard came in the late 1980s but wasn't widely adopted for a few years. It initially gave us the 800x600 16 color video mode but was later extended to allow the standards we see today: 1024x768, 1280x1024, etc. Based on the amount of installed memory in a video card, different levels of colors could be displayed. These were typically 256 colors. However, memory limitations often resulted in the highest video mode for a given card only supporting 16 colors. This was very common in the early 1990s.
It was also at this time that the VESA (Video Electronics Standards Association) standard came out. It was a double-edged sword. The benefit of VESA came from a BIOS standard which made the same binary code work with multiple SVGA adapters: one program could be written which allowed graphics on multiple cards. The downside was that VESA was painfully slow.
At that time VESA used a windowed "looking glass" read/write method into 64 KB banks of total video memory on an ISA bus. That bus operated at a maximum of 8 MHz. Using 16-bit reads and writes, that meant a limited amount of bandwidth for drawing. It also meant a lot of bank switching for higher video modes requiring much more than 64 KB. This made it all but unusable for any type of serious graphics work. To utilize their more advanced capabilities, most cards offered specialized, faster access methods, but using them required custom drivers. This created a lot of compatibility problems.
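The bank-switching cost can be sketched in a few lines (a toy model, assuming a 64 KB window and packed 8-bit pixels; the function name is our own):

```python
BANK_SIZE = 64 * 1024  # the 64 KB "looking glass" window

def vesa_bank_and_offset(x, y, width, bytes_per_pixel=1):
    """Which 64 KB bank holds pixel (x, y), and where inside that bank.
    Every time the bank number changes, the program must issue a
    (slow) bank-switch call before touching memory."""
    linear = (y * width + x) * bytes_per_pixel
    return linear // BANK_SIZE, linear % BANK_SIZE

# A 1024x768 256-color frame is 786432 bytes: twelve banks just to
# walk the screen from top to bottom.
print(vesa_bank_and_offset(0, 767, 1024))  # (11, 64512)
```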
XGA, SXGA, UXGA, QXGA and HXGA
After SVGA came the more modern standards we see today in PCs and CE devices. The XGA, SXGA, UXGA, QXGA and even HXGA are all the rage. These new logical standards allow for some truly phenomenal graphics resolutions. Some of the highest-end modes (like HXGA) define a maximum standard of 7680x4800 pixels using an 8:5 aspect ratio. That's 37 million pixels per image with varying color depths! If that video mode were processed using 32 bits per pixel the way single-output graphics cards do today, it would require a GPU clock in excess of 8.6 GHz with more than 144 MB per frame! Fortunately, those high-end standards today are only used for very high-end single-frame digital cameras.
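The per-frame arithmetic above can be checked in a few lines (a sketch; "MB" here means 10^6 bytes, and the GPU-clock figure is the article's own estimate, not computed here):

```python
def frame_stats(width, height, bits_per_pixel, refresh_hz=60):
    """Pixels per frame, bytes per frame and raw pixel rate for a video mode."""
    pixels = width * height
    frame_bytes = pixels * bits_per_pixel // 8
    pixel_rate_hz = pixels * refresh_hz
    return pixels, frame_bytes, pixel_rate_hz

# HXGA: 7680 x 4800 at 32 bits per pixel
pixels, frame_bytes, rate = frame_stats(7680, 4800, 32)
print(pixels)             # 36864000 -- roughly 37 million pixels
print(frame_bytes / 1e6)  # 147.456 -- well over 144 MB per frame
```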
The reality is still this: while these display standards are very nice and provide the user with high-resolution graphics, there still has to be an underlying mechanism which takes the data from point A to point B. And that's what this article is about. It's not enough to have the logical ability to do something. There must also be a physical interconnect architecture which allows it to happen. And the standards explained in this article demonstrate why a particular standard is more desirable than another.
The lesson here is that each of these video standards came about for a particular reason. They were true standards at the time and well thought out. We have now moved from computing machines that saw no TV-like displays at all, to machines which can literally take our breath away with stunning graphics.
And one thing about that trend is an absolute certainty: the video needs of tomorrow will be ever increasing. Higher resolutions, greater bit depths and refresh rates. The human eye is amazingly adept at distinguishing real-looking images from fake ones. The graphics hardware designers of tomorrow are working toward keeping our eyes fooled. They are addressing that fact not only with new products and software standards, but also with the physical interconnects being chosen.
We want to see full realism employed at all levels of our user experience. In our PDAs, cell phones, notebooks, desktop machines, even home entertainment systems. And none of our wants are immune to the goals of the hardware designers.
Read on the next page: Author's Opinion
We are seeing the computing world change today. Virtualization is key. As technologies move forward, it will no longer truly matter what medium or method is employed under the hood. We are approaching the point where all we need to know is that a key goes in the ignition and the thing starts. By operating common controls, whatever's under the hood can be wielded. In that regard, the key should be our data. The vehicle would be whatever it happened to be. And the common controls we're familiar with would be the gas pedal, shifter, turn signals and door handles. These would equate to the video modes, refresh rates, audio channels and so on.
Protocols and standards like DisplayPort are exactly what this industry needs. We need free, flexible, dynamic solutions in all aspects of compute design. We need to move into the area where the absolute maximum capabilities we envision are handled by the support infrastructure. Only then will we be able to move out of the world of the limitations of hard mechanics, and into the world of imagination.
In the early days of MDA, Hercules and CGA, standards were limited by many hardware constraints: memory, cost, technology availability and even R&D knowledge. But today, with so many brilliant minds working in these fields, that time is past us. We have essentially mastered this art. We have the controls at our disposal to provide stunning visual and audio experiences via a common interconnect standard. What is needed now is the ability to step back from that hardware perspective. We need to look at that machine and say "Here's what we have. Now, what can we do with it?"
I believe this is the most exciting time we've seen in semiconductor history. We stand at the absolute threshold of across-the-board virtualization. There are tremendous compute abilities being born today. We have communication platforms operating in the terabits-per-second range. These will be available to us, the end consumers, in just a few years. Our software is beginning to mature in its model. We have standard frameworks which have shown themselves to be desirable over time for both development and maintenance. And we have a generation growing up right now who are graduating from college having never known a world without computers.
When I think about where we came from, where we are now, and where it is we are going... I conclude that it is perhaps the most exciting time in man's history. The future is absolutely wide open for human potential and achievement. We will no longer be limited by the things which have taken us so many years to nail down: communication, compute abilities, and proper use of that technology. This "culmination of the thing" is the one component we have yet to master. And right now, our focus should be not only on the underlying technologies which will get us there, but on which technologies will best serve us when we are there.
I look forward to reading your comments below.
Updated: One final thought. A few commenters have asked for specifics about why a particular standard should be included or chosen over another. The truth is that any of the video interconnect architectures present today could handle any workload we would like to pipe through them. They could handle video, audio and data without any problem whatsoever. The differences come from the protocols, the internally defined specs indicating how the physical hardware can be used. This is why DisplayPort has such an advantage: it is an open standard which allows for nearly anything to be communicated.
One way to visualize DisplayPort's potential and promise is like this (if you'll forgive a "tube" reference). DisplayPort defines up to four "lanes" for data communication, each operating at a given speed. Think of the lanes as separate tubes. The DisplayPort protocol wraps data items into packages and sends them through the tubes. A package might be some video data, some audio channel data, or just regular data. Each package is inserted into a tube right next to the previous one. DisplayPort offers four lanes because with large payloads the packages get big and would quickly fill a single tube. DisplayPort sends individual packages from A to B; B receives them and, based on protocols defined in the spec, decodes the packages and acts accordingly.
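The tube analogy can be sketched in a few lines. This is purely illustrative: the real DisplayPort micro-packet framing, symbol encoding and lane timing are far more involved, and all names here are invented for the sketch.

```python
from collections import deque

def distribute_packets(packets, num_lanes=4):
    """Toy model of packetized transport: tagged payloads (video,
    audio or plain data) are dealt out across up to four lanes; the
    receiver reassembles them by tag, not by which lane they used."""
    lanes = [deque() for _ in range(num_lanes)]
    for i, (kind, payload) in enumerate(packets):
        lanes[i % num_lanes].append((kind, payload))
    return lanes

stream = [("video", b"\x01"), ("audio", b"\x02"),
          ("data", b"\x03"), ("video", b"\x04"), ("video", b"\x05")]
lanes = distribute_packets(stream)
print([len(lane) for lane in lanes])  # [2, 1, 1, 1]
```

The point of the model: because payloads are self-describing packages rather than fixed signal lines, the same four tubes can carry any mix of video, audio and data, which is exactly the flexibility argued for above.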
This is one of the biggest advantages DisplayPort offers over the other standards. The physical interconnects used by each of those standards (DVI, HDMI and UDI) could also accommodate this ability. However, they're not currently defined to do so.
So to answer many people's question: Which technology would give you the best bang for your buck? If you're looking to the future conveyance needs of more than just audio and visual data, then you really have only one choice: DisplayPort. This is also a main reason why VESA has just endorsed it as their new supported standard. I believe we'll be seeing all major manufacturers jumping on the DisplayPort train very quickly.