Do quality HDMI cables make a difference?

After Ed’s post on whether digital is perfect, we received an impressive response from readers who voiced their own opinions about the subject. As requested by many readers, I’ve decided to put this theory into practice and explore the technology behind HDMI and the quality of HDMI cables.

DISCLAIMER: As any real engineer will understand, I cannot possibly fit the complete theory behind digital data transmission into a blog article. Some parts are oversimplified so that a layperson can grasp the technical concepts easily.

Can errors exist in HDMI?

The first question we need to answer in this quest for truth is this: can errors exist in HDMI data transmission? And if so, what is the likelihood of them occurring? To demonstrate this, we’ll need to take a step back and look at basic digital transmission theory. Much of this is explained in more detail in the digital lies post, but we’ll cover the essentials again.

Digital Transmission Errors

This is how most people think digital transmission works:

Signal transmitted:

1, 0, 1

Signal received is either:

1, 0, 1

or a random set of numbers (or nothing at all).

In reality, this is not true.

Firstly, signals aren’t literally transmitted as 1s and 0s. The 1s and 0s are the symbols we want to convey; they don’t travel down copper wire. Electrical current does, and so we represent the 1s and 0s with, say, voltage levels.

This is what a transmitted signal (and the corresponding bit it represents) looks like:


Digital Signal

Unfortunately, even the best cable in the world isn’t perfect. Chances are, there will be noise to some extent, and a good received signal may look like this:

A Good Received Signal

The received signal is in red, as compared with the original in grey. Notice how noise will have an effect on the signal received. (Interesting fact: we also note at this point that fundamentally everything is transmitted and received in analogue, because in the real world there is no such thing as digital. Digital is just a syntax of communication. Analogue is the medium.)

Signal Decision

In this case it’s fairly easy to decode the original signal. We simply sample the received signal in the middle of every bit period, at each of those vertical black lines, and compare the level against the halfway threshold: above means 1, below means 0. As we can see above, there would be no errors in this case.
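The sample-and-threshold decision described above can be sketched in a few lines of Python. This is a toy illustration, not the actual circuitry in an HDMI receiver: the waveform, noise level, and samples-per-bit figure are all assumptions chosen for clarity.

```python
import random

def decode(signal, samples_per_bit, threshold=0.5):
    """Sample the waveform at the centre of each bit period and
    decide 1 or 0 by comparing against the threshold."""
    bits = []
    for i in range(len(signal) // samples_per_bit):
        mid = i * samples_per_bit + samples_per_bit // 2
        bits.append(1 if signal[mid] > threshold else 0)
    return bits

sent = [1, 0, 1]
SPB = 10  # samples per bit period (an arbitrary choice)

# Idealised 0 V / 1 V waveform with mild random noise added on top.
waveform = [b + random.uniform(-0.3, 0.3) for b in sent for _ in range(SPB)]

print(decode(waveform, SPB))  # prints [1, 0, 1]: the noise never
                              # pushes a sample across the threshold
```

Because the noise here is bounded well inside the decision margin, every sample lands on the correct side of the threshold and the bits are recovered exactly, which is the sense in which digital transmission tolerates noise.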

Unfortunately, cables are far from perfect. As technology improves (HDTV, DVD, Blu-ray, etc.), more data is required per unit time, i.e. higher bandwidth. This means we need to send bits faster.

Notice how there is an instant jump from 0 to 1 in the transmitted signal, but a slight delay in that jump in the received signal. This is because cables don’t have infinite bandwidth. Now imagine the same bits being sent, but at 1000 times the speed of the above, over a cable that is slightly worse. What would the received bits look like?

Poor Signal

This may be similar to what’s received. The signal exhibits a great deal of noise and distortion, and if this were analogue it would be nearly unusable. But because the transmission scheme is digital, we can still decode the signal without error, using the same sampling and decision method as above.

But what happens if a stray burst of electromagnetic radiation hits the cable during the 3rd bit? Whoops, we just got an error.

Signal With Error

The received bits would now be decoded as 1, 0, 1, 1, 0, 1, 1.
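We can reproduce that failure with the same threshold decoder. The transmitted sequence below is an assumption (the article’s figure doesn’t state it); it is chosen so that a noise burst during the 3rd bit period yields the decoded sequence quoted above.

```python
# Assumed transmitted bits, with the 3rd bit a 0 that the burst will flip.
sent = [1, 0, 0, 1, 0, 1, 1]
SPB = 10           # samples per bit period
THRESHOLD = 0.5

# Clean 0 V / 1 V waveform.
waveform = [float(b) for b in sent for _ in range(SPB)]

# A burst of interference during the 3rd bit period pushes the
# level above the decision threshold.
for i in range(2 * SPB, 3 * SPB):
    waveform[i] += 0.8

received = [1 if waveform[i * SPB + SPB // 2] > THRESHOLD else 0
            for i in range(len(sent))]
print(received)  # prints [1, 0, 1, 1, 0, 1, 1] -- the 3rd bit is wrong
```

The decoder has no way of knowing the burst happened; it faithfully reports what it sampled, and the error propagates downstream unless some other mechanism catches it.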

EM radiation, power supplies, adjacent cables, even the Earth’s magnetic field all produce noise. And if a cable is poorly built, we will see even more errors at the receiver.

So now that we’ve established how errors can exist in digital transmission, let’s see what happens when they occur in HDMI.

Digital Errors in HDMI

Fortunately, there are mechanisms outside of the transmission itself that reduce the impact of errors. In fact, when a bit error does occur, you will most likely not notice it happening. The effects of bit errors can be significantly reduced through digital coding schemes, as well as parity bits and other error-checking mechanisms. I won’t repeat how these work in detail here (see the digital lies post), but essentially the receiver detects when a bit error occurs and substitutes a value that it guesses (usually closely) to be what the original probably was. These processes are called channel coding and interpolation.
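The simplest of these mechanisms, a parity bit, can be sketched as follows. HDMI’s actual coding is considerably more involved; this is just a minimal illustration of how a single flipped bit is caught.

```python
def add_parity(word):
    """Append an even-parity bit so the codeword always has an
    even number of 1s."""
    return word + [sum(word) % 2]

def parity_ok(codeword):
    """Even parity holds iff the total number of 1s is even."""
    return sum(codeword) % 2 == 0

data = [1, 0, 1, 1]
tx = add_parity(data)    # [1, 0, 1, 1, 1]
assert parity_ok(tx)     # leaves the sender intact

# One bit gets flipped in transit...
rx = tx[:]
rx[2] ^= 1

# ...and the parity check catches it. The receiver can then conceal
# the damage, e.g. by repeating the previous good sample.
print(parity_ok(rx))     # prints False: error detected
```

Once the error is flagged, the receiver doesn’t know *which* bit flipped; concealment (interpolation) is a guess, which is why the article says the substituted value is “usually close” rather than exact.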

“It’s digital, it’s either perfect or nothing at all”

This is, then, a blatant lie from those who do not understand the fundamentals of digital transmission.

Do HDMI cables exhibit errors?

So now that we’ve established that it could happen, let’s look at whether it really would in real life. Remember from the above: a few bit errors among the millions of bits transmitted per second would not be obvious to the observer (by obvious, I mean “mosaic”-style errors in the picture, clicks and pops in the sound, and so on).

To produce obvious errors in HDMI, the signal must be distorted to such an extent that multiple bits per word are received in error (rendering parity bits useless). In other words, it needs a significant bit error rate for an extended period of time. Does this happen in real life?
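Why do multiple errors per word render parity useless? A single parity bit only detects an odd number of flips, so two flipped bits in the same word cancel out. A quick check, using a toy 5-bit codeword (4 data bits plus an even-parity bit):

```python
def parity_ok(codeword):
    """Even parity holds iff the total number of 1s is even."""
    return sum(codeword) % 2 == 0

tx = [1, 0, 1, 1, 1]   # 4 data bits + even-parity bit; parity holds
assert parity_ok(tx)

rx = tx[:]
rx[0] ^= 1             # two bit errors land in the same word...
rx[3] ^= 1

print(parity_ok(rx))   # prints True: the double error goes undetected
```

The corrupted word sails through the parity check and gets used as-is, which is exactly the regime where the visible artefacts (blocks in the picture, pops in the sound) appear.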

Cable length

Cable length restrictions are a strong argument that HDMI can and probably does exhibit bit errors in real life. Most 2 m cables perform satisfactorily. However, extend these cables to 5 m or more and things start to go pear-shaped: obvious errors start occurring. Yes, these are the multi-bit-per-word errors that cause clicks and pops in the sound and little squares in the picture. Now I’m not suggesting this is proof, but from my experience in digital transmission, if an increase from 2 m to 5 m can introduce obvious errors, it’s a strong argument that smaller errors occur quite frequently.

Hell, even the HDMI parent organisation works to a standard of a 10^-9 BER (bit error rate), i.e. one error per 1 billion bits. At HDMI’s bit rate, that’s roughly one bit error every 6 seconds.
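The arithmetic is easy to check. The bit rate below (165 Mbit/s) is an assumption chosen to match the article’s “every 6 seconds” figure; actual HDMI link rates vary considerably by video mode and channel.

```python
ber = 1e-9        # quoted spec: one error per billion bits
bit_rate = 165e6  # assumed bits/s; real HDMI rates vary by mode

# At this rate, errors arrive every 1 / (BER * rate) seconds on average.
seconds_per_error = 1 / (ber * bit_rate)
print(round(seconds_per_error, 1))  # prints 6.1
```

The point isn’t the exact number; at any plausible rate in the hundreds of megabits or gigabits per second, a 10^-9 BER means errors every few seconds or faster, continuously, in a link that meets the spec.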

HDMI is a one way protocol

“If HDMI cables can make a difference, why is there no high end ethernet cable?”

Because HDMI is a one-way protocol. That means the data travels in one direction, and there is no response from the receiver, so the source has no way of resending data that arrives corrupted. TCP/IP works by breaking data into packets and resending any packet that is received corrupted. But to resend data, the sender must know the original was received corrupted, and for that to happen the receiver must be able to talk back to the sender. This can’t happen in HDMI.
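The difference the back-channel makes can be shown with a toy simulation. The channel model and corruption probability here are invented for illustration; real TCP uses checksums, sequence numbers, and timeouts rather than this simplified “resend on request” loop.

```python
import random

def lossy_send(word, p_corrupt, rng):
    """Deliver the word, corrupting it with probability p_corrupt."""
    return "????" if rng.random() < p_corrupt else word

rng = random.Random(42)
message = ["the", "quick", "brown", "fox"]

# One-way link (HDMI-style): whatever arrives corrupted stays corrupted.
one_way = [lossy_send(w, 0.5, rng) for w in message]

# Two-way link (TCP-style): the receiver reports corruption and the
# sender retransmits until the word arrives intact.
two_way = []
for w in message:
    got = lossy_send(w, 0.5, rng)
    while got == "????":          # receiver signals "resend"
        got = lossy_send(w, 0.5, rng)
    two_way.append(got)

print(one_way)   # may contain "????" entries
print(two_way)   # always matches the original message
```

Even over an appallingly bad channel, the two-way link eventually delivers everything intact, at the cost of latency. HDMI can’t pay that cost: video has to arrive in real time, so it relies on coding and concealment instead of retransmission.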

Manufacturing standards & material efficiency

The art of engineering is not to achieve perfection; it is, rather, to achieve efficiency. Approaching perfection requires an exponential increase in resources, so a good engineer simply gets as close to it as possible with what’s available.

A cable manufacturer’s goal is to sell as many cables as possible. To do this, they must offer the lowest price possible, and to get the lowest price, they save on as much material as they can. How much material can they save? What controls are in place to prevent this? Well, none, actually. There is no enforced worldwide standard of HDMI cable manufacture. Although HDMI is licensed, there is no real control mechanism over the quality of manufacture. Cable makers are fairly free to do whatever they like.

As a result, there may well be cables out there that don’t even meet the standard. There can be cables that introduce hundreds of errors every second while the consumer is none the wiser (HDMI’s error resilience partly masks this). But by no means is HDMI perfect.

Practical Considerations

The point of this article isn’t to say that a $40 cable is better than a $10 cable. It isn’t even to say that people would care about or notice the difference when small errors occur. It might very well be impossible to notice them in a side-by-side comparison.

The point I’m simply trying to make is that it is not impossible for there to be a difference, and that people shouldn’t believe whatever rubbish gets posted all over forums. Every day, people all over the internet talk about digital signals as though they’re experienced cable designers. I’m not a cable designer. I’m not even a real expert in HDMI technology. Hell, what I’ve just said might be complete rubbish as well. But why take any of it as gospel without at least doing some research? Google a few of the terms I’ve mentioned. Look them up on Wikipedia for 5 minutes.

Digital is far from perfect. Errors happen all the time, in every cable, in almost every instance. Should you spend $200 on a high end HDMI cable? Probably not. Once a digital cable reaches a certain quality (negligible error rates), it is nearly impossible to improve on it. However, I wouldn’t be at all surprised if a $5 cable made in China isn’t made to specification. And although it “works”, it doesn’t mean it’s error free.


There are people who may be happy with the cheapest cable on the net, much like there are people who are happy to listen to MP3s. For me, personally, I’d fork out the extra $20 to buy a reasonably good set of cables to hook up my TV, knowing that my $4000 TV is being put to full use. Sure, the $5 version might be just as good. But for the extra $20, I’ll think of it as insurance.

Follow up

The following is taken from the FAQ section of

“… It is not only the cable that factors into how long a cable can successfully carry an HDMI signal, the receiver chip inside the TV or projector also plays a major factor. Receiver chips that include a feature called “cable equalization” are able to compensate for weaker signals thereby extending the potential length of any cable that is used with that device.

With any long run of an HDMI cable, quality manufactured cables can play a significant role in successfully running HDMI over such longer distances.

As you can see, the performance of HDMI cables goes far beyond the simple “if it works, it works” statement.

… there may be instances where cables bearing the HDMI logo are available but have not been properly tested. … We recommend that consumers buy their cables from a reputable source and a company that is trusted.

I wouldn’t be surprised if many of the cheapies on eBay aren’t certified, or carry dodgy certification. Hey, 99% of them might work great. But there’s also a chance that a good number of them don’t.

  1. #1 by Alex on December 9, 2011 - 7:43 pm

    “HDMI is a one way protocol”

    Yes and no. As of HDMI 1.4 (2009), there is bidirectional communication (ethernet, audio return), but it’s not generally used for error correction of audio/video transmission.

  2. #2 by Michael Ironside on June 7, 2012 - 12:29 am

    This is a stupid article that is inconclusive. Why didn’t you simply line up a TV set and test the cheap and expensive HDMI cables and then assess the picture quality? The use of the words “might” and “may be” is annoying.

  3. #3 by dan_mason on July 5, 2012 - 9:39 am

    Michael, you’ve completely missed the point. Firstly, people talk about digital as if it’s error free when it’s not. I’m trying to explain what happens in the real world. Also, putting pictures side by side is hardly a scientific test. Yes, practically this is what’s important (what looks good to our eye), but again, that’s not the point of this article. I don’t need to test cheap and expensive cables. This has already been done. There are cheap cables out there which are bad to the point of pixelation. I’m addressing the more technical aspect of the problem that people THINK they understand. Of course it’s inconclusive. We’re not putting two cables against each other. I can’t speak for every cable manufacturer in the world about what their cables are like. “Might” and “may be” are appropriate words, compared to idiots out there who make wild claims about “if it’s digital, it CAN’T make an error”. Nonsense.
