AV connectivity in 2011
A little background
I have recently been looking at getting a new LCD display - I wanted one similar to my current Dell P2310h (a great monitor) so that I could run them in dual screen. I did find one: the Asus PA238Q LED-backlit IPS display. This monitor cost much more than I had originally wanted to spend, but in terms of both appearance and connectivity it looked identical to my Dell, so it had to be this one. The monitor is important to me as it will be coming to university with me later this year (the desktop computer won't be, however - a laptop is fine for me there). After a few days of running a dual monitor setup, though, I got a bit sick of it, to be honest. There were wires everywhere and I suddenly felt that I no longer needed it.
The original purpose of my dual monitor system was to test the software I am developing. That wasn't the only reason behind purchasing a new LCD - my previous one was beginning to date quickly and I really needed an IPS screen for portrait mode. But alas, the setup, and the Dell P2310h which I loved, are no more.
The battle still rages on...
As many of you will know, my primary interest in computer science has always been, and probably always will be, connectivity. Since my earliest days in computing, interconnecting all sorts of hardware has fascinated me. I don't know why, but it does.
So today's article will not focus on the new LCD monitor I have just purchased, but on the way connectivity has gone. The Asus PA238Q is a case in point. It features the perfect marriage of four video connectors, two of which also carry audio (the monitor passes the sound through to speakers via an audio output jack). The four connections are VGA, DVI, HDMI and DP, listed in chronological order of release. I am going to look at all four of these connections in this article and discuss them in further detail, but first, you must understand a bit of history.
A graphics card featuring DVI, VGA and HDMI.
The purpose of VGA was originally to standardise display connectors under a unified, high quality and flexible connector. The connector was used so widely that dropping it when its successor came out became something that only high-end users could afford to do.
And then along came DVI
VGA, or Video Graphics Array, was a specification created by IBM for their PS/2 range, all the way back in 1987. It drives a monitor with a high quality analogue signal. So why do we still use it today, one might ask? The answer is something that I do not yet understand.
Legacy interfaces such as VGA, DVI and LVDS have not kept pace, and newer standards such as DisplayPort and HDMI clearly provide the best connectivity options moving forward. In our opinion, DisplayPort 1.2 is the future interface for PC monitors, along with HDMI 1.4a for TV connectivity.
This quote from Intel exemplifies my thoughts exactly. Back in the heyday of the VGA D-Sub connector, one would simply connect up a monitor and a set of speakers to have a multimedia computer. However, because VGA carries an analogue signal, the cabling is susceptible to interference.
Digital Visual Interface
DVI is a standard that enthusiasts refuse to let go of. I still prefer DVI over HDMI: the connector uses a screw-in design, which makes it robust enough not to fall out when I switch to portrait mode, while still providing the all-important digital signal of the HDMI connector. DVI supports most of the headline features of the newer AV connector known as HDMI, and in my opinion it still provides one of the best options for a computer driving a high definition display.
I would say that there is no worry if you only have HDMI or DisplayPort, but the latter has issues which we will come to soon. DVI still provides the majority of the features that HDMI provides, save the audio channel and CEC. Single-link DVI also lacks the bandwidth that HDMI has: a dual-link DVI connector raises the maximum display resolution at 60Hz from 1920 by 1200 to 2560 by 1600, which matches HDMI 1.3 at that refresh rate, though HDMI 1.4 can go further at lower refresh rates.
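A quick back-of-the-envelope check of those resolution limits: the pixel clock a mode needs is roughly active pixels times refresh rate plus blanking overhead, and DVI caps the TMDS clock at 165 MHz per link. This is a minimal sketch - the ~12% blanking factor is an approximation in the spirit of reduced-blanking timings, not a spec-exact figure:

```python
# Rough check of which DVI link type can drive a given mode.
# Assumes ~12% blanking overhead (approximation, in the spirit of
# CVT reduced blanking); real timings vary slightly.

SINGLE_LINK_MHZ = 165.0   # DVI single-link TMDS pixel clock limit
DUAL_LINK_MHZ = 330.0     # dual link doubles the available clock

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    """Approximate pixel clock needed for a mode, in MHz."""
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clock = pixel_clock_mhz(w, h, 60)
    link = ("single link" if clock <= SINGLE_LINK_MHZ
            else "dual link" if clock <= DUAL_LINK_MHZ
            else "beyond DVI")
    print(f"{w}x{h}@60 needs ~{clock:.0f} MHz -> {link}")
```

Running this shows 1920x1200@60 fitting inside a single link's 165 MHz, while 2560x1600@60 needs roughly 275 MHz, which only the dual link can supply - matching the limits quoted above.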
HDMI and HDCP
The AV market has changed recently. It moved from older standards such as composite video, SCART and component video to the new HDMI standard in the blink of an eye. It was a quick move, and now you are seeing less of the older connections and more of the new. HDMI stands for High Definition Multimedia Interface and has been with us since 2003, but it only really took hold of the market around 2007. My first LCD HDTV only had component and DVI - no HDMI. About a year and a half later I traded the old Samsung SyncMaster 940MW for the newer Samsung M87, as I really wanted a Full HD HDTV. I was also sucked in by HDMI, especially as version 1.3a supported CEC. The M87, by the way, featured a total of 3 HDMI connectors - sufficient for the time; now four is just enough for me.
HDMI is, in my opinion, great - it's compact, high resolution, it carries audio, it allows control of devices, and above all it is a single connector. It just doesn't fit into the computer market like DVI did. It makes LCD displays more expensive because of royalty fees, which DVI and DP don't have. This is ultimately what I am trying to get across with this article: HDMI is designed for the AV market. So for those of us who have an LCD with HDMI, the chances are we're not using it for a computer but for a games console or the like - I know I am. HDMI also lacks a latching mechanism like DisplayPort's - or at least I could not find one.
What does this mean for the average user? Well, that much is obvious: more expensive displays wherever HDMI is included.
What about DisplayPort?
As great as HDMI is, it costs more to implement. This doesn't mean that I won't buy an HDMI display - mine has an HDMI port, and it was something I needed for moving away from home: one display for everything, leaving my Samsung C750 HDTV at home.
However, DisplayPort is still around. Originally I thought that DisplayPort was going to be a professional-only port, as DVI originally was, but I was wrong. Very wrong. When AMD (ATI at the time) released Eyefinity, they stated that to run three, six or more monitors from one GPU, at least one of the monitors must connect via DisplayPort on their new graphics cards. So you must own a DisplayPort monitor for this to work - passive adapters don't work at all. This was big news for DisplayPort. Some claim that active (rather than passive) adapters do work, but official support for them is unclear. Perhaps the most interesting factor in DP is that it requires no conversion at the display. The signals that both HDMI and DVI carry must be converted by the display's electronics, whereas DisplayPort sends its data in a form the display can interpret directly to create an image - no conversion necessary. This means the board inside the display can be much smaller, which (apparently) leads to a much slimmer display - much like the LED displays we have right now.
However, DP has its problems. For those of us who want to watch the occasional Blu-ray here and there, most DP displays do not support HDCP. Great, that is. Also, most displays don't support audio over DP - although the occasional one (like the Asus PA238Q) does.
What this article is trying to do is create a simple comparison of the three digital connectors of the current computing age, and show how analogue signals are now all but obsolete.
Throughout the article, I have described the connectors with the aim of deciding which one I would use for my display. Looking through all the books and articles I could find, I still come to the conclusion that DVI is the best connector for my display and computer. Yes, HDMI is good for many things, but most of them a standard display won't utilise. With all this in mind, the choice ultimately lies with the individual, and mine still lies with DVI.
To summarise: VGA has no hard resolution limit, but pushing the resolution too high invites interference and badly degrades quality. DVI is a digital connection with no error correction, so if a transmitted bit arrives corrupted it is displayed as-is; its maximum resolution at 60Hz is 1920x1200 on a single link and 2560x1600 on a dual link. HDMI is much like DVI except that it carries audio as well as video at a higher bandwidth, and has extra features such as CEC. DisplayPort began as a business-class connector and is now gamer-oriented too, with the highest bandwidth available, but does not always support HDCP.
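To put some rough numbers on that summary, here is a small Python sketch comparing the uncompressed 24-bit data rate of a 2560x1600@60 mode against the commonly quoted effective capacities of each digital link. The capacity figures and the ~12% blanking overhead are approximations, not spec-exact values:

```python
# Uncompressed 24-bit video data rate vs. effective link capacity
# for the digital connectors discussed above. Capacities are the
# commonly quoted post-encoding figures; treat them as approximate.

LINKS_GBPS = {
    "DVI single link": 3.96,   # 165 MHz x 24 bpp
    "DVI dual link":   7.92,   # two TMDS links
    "HDMI 1.3":        8.16,   # 340 MHz TMDS, 24 bpp
    "DisplayPort 1.1": 8.64,   # 4 lanes x 2.7 Gbps, after 8b/10b
    "DisplayPort 1.2": 17.28,  # 4 lanes x 5.4 Gbps, after 8b/10b
}

def video_gbps(width, height, refresh_hz, bpp=24, blanking=1.12):
    """Approximate data rate for a mode in Gbps, including blanking."""
    return width * height * refresh_hz * bpp * blanking / 1e9

needed = video_gbps(2560, 1600, 60)
for name, capacity in LINKS_GBPS.items():
    verdict = "yes" if capacity >= needed else "no"
    print(f"{name}: {capacity:.2f} Gbps, drives 2560x1600@60? {verdict}")
```

The mode needs roughly 6.6 Gbps, so every link here manages it except single-link DVI - which lines up with the resolution limits given in the summary.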