Monster Cable HDMI Scam


KevinN206    46
The corrected signal is the original signal. There's no such thing as an approximation in digital error correction. Either the data is restored to its original form or it isn't.

In case that flies over your head: the error-correction data doesn't come from nowhere. The 8 bits being transmitted are padded with additional data for error correction; it doesn't appear out of thin air. Take the CD: on disc, a byte is actually 14 bits, 8 bits of data and 6 bits for error correction. On DVDs it's still 12 bits, plus a better ECC.

And on transports like HDMI, it's similar.

This isn't like the analogue world, where you can interpolate things out of nothing. A single broken bit in an MPEG stream means a broken macroblock or motion vector, doing more damage than a single pixel.

I think this explanation hits the nail on the head. There are two things we have to consider: error detection vs. error correction. Error detection does not allow the receiving device to correct errors (as in recreating the original signal), as its name implies. Things such as parity bits and checksums are error detection methods. They are usually used to tell the sending device to resend a packet when the checksums don't match on the receiving end. Error correction both detects and corrects errors (to a certain extent), but it adds more overhead.
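The detection-vs-correction distinction above can be made concrete with a toy sketch. This is not the coding HDMI or CDs actually use; it just contrasts a parity bit (which can only flag a single-bit error) with a Hamming(7,4) code (whose three extra bits locate a single flipped bit so it can be fixed):

```python
# Toy contrast of error *detection* (parity) vs error *correction*
# (Hamming(7,4)). Illustrative only, not any real link's actual coding.

def add_parity(bits):
    """Append an even-parity bit: detects any single-bit error, fixes nothing."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    return sum(word) % 2 == 0

def hamming74_encode(d):
    """Encode 4 data bits into 7; positions are p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(w):
    """Return the 4 data bits, correcting one flipped bit if present."""
    w = w[:]
    s1 = w[0] ^ w[2] ^ w[4] ^ w[6]
    s2 = w[1] ^ w[2] ^ w[5] ^ w[6]
    s3 = w[3] ^ w[4] ^ w[5] ^ w[6]
    pos = s1 * 1 + s2 * 2 + s3 * 4   # syndrome = 1-based position of the error
    if pos:
        w[pos - 1] ^= 1              # flip the bad bit back
    return [w[2], w[4], w[5], w[6]]

data = [1, 0, 1, 1]
sent = hamming74_encode(data)
corrupted = sent[:]
corrupted[3] ^= 1                     # one bit flips in transit
assert hamming74_decode(corrupted) == data   # correction recovers the original

word = add_parity(data)
word[1] ^= 1
assert not parity_ok(word)            # detection only: we know it's bad, not where
```

With parity the receiver can only ask for a resend; with the Hamming code it repairs the word locally, at the cost of nearly doubling the bits on the wire.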

Digital signals sent across a single wire (for the HDMI spec, I believe) are either 5 V (logical 1) or 0 V (logical 0). But the real world doesn't give us those perfect voltage levels when switching. In fact, the signal degrades due to intrinsic resistance, inductance, and capacitance within the wire (there are many models of a 'wire'). This becomes more pronounced as we increase the wire length and switching frequency.

Capacitance slows the signal's edges (i.e. it increases the rise time and fall time) such that the waveform is no longer a "square wave" (see squarewave.jpg). Increasing the rise and fall times creates problems for digital signals, because there is only a certain voltage range within which a signal is interpreted as a logical 1 or a logical 0, known as the noise margin (see noisemargin.gif). A lot of digital signal integrity depends on these voltage levels, wiring length and placement, and external interference.
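The rise-time point above can be sketched with a first-order RC model of the wire. All the numbers here are made up for illustration (real HDMI uses low-swing differential TMDS signaling, not 0/5 V levels, and the thresholds below are assumptions): the voltage at the far end of a step-driven RC line is V(t) = V_DD(1 − e^(−t/RC)), so more capacitance means a longer wait before the receiver sees a valid logical 1.

```python
import math

# Toy RC model of a cable driven with a 0 -> 5 V step.
# Illustrative numbers only, not real HDMI electrical parameters.
V_DD = 5.0
V_IH = 3.5   # assumed minimum voltage read as logical 1
V_IL = 1.5   # assumed maximum voltage read as logical 0

def v_out(t, R, C):
    """Receiver-end voltage t seconds after the step: V_DD * (1 - e^(-t/RC))."""
    return V_DD * (1.0 - math.exp(-t / (R * C)))

def time_to_reach(v_target, R, C):
    """Solve V(t) = v_target for t."""
    return -R * C * math.log(1.0 - v_target / V_DD)

short_cable = (50.0, 100e-12)    # 50 ohm, 100 pF
long_cable = (50.0, 1000e-12)    # same R, 10x the capacitance

t_short = time_to_reach(V_IH, *short_cable)
t_long = time_to_reach(V_IH, *long_cable)
assert t_long > t_short   # more capacitance -> slower rise -> less timing margin
```

At a high enough bit rate, the long cable's rise time eats the whole bit period and the receiver starts sampling edges that never reached the logic-high threshold.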

[Attached images: the square wave and noise margin figures referenced above, plus a third figure]

Joel    27
Dude, error correction is not based on guesswork; it is based on extra information added to the signal at the source, which the receiving end can use to fix errors and reconstruct the signal, up to a point.

Yes, and the error correction takes some processing.

Look, I'm not going to argue this; you guys will never believe me, and that's fine. I'm not talking about the average Joe's signal, and I'm not saying Monster is the way to go. I'm saying simply that cables can make a difference to the signal. Will you see it? Maybe not. Will you hear it? Maybe not. But don't think that all cables are the same.

I am Reid    45
Yes, and the error correction takes some processing.

Look, I'm not going to argue this; you guys will never believe me, and that's fine. I'm not talking about the average Joe's signal, and I'm not saying Monster is the way to go. I'm saying simply that cables can make a difference to the signal. Will you see it? Maybe not. Will you hear it? Maybe not. But don't think that all cables are the same.

They deliver the same result.

HawkMan    5,232
Have you ever seen a digital input report how many errors PER SECOND it's dealing with when playing a CD?

Yeah, those errors couldn't be because CD is a terrible medium. They're manufactured cheaply and usually already have errors on the disc; then they get scratched, the surface gets worn, and the data layer of CDs actually "grows old". None of that could cause errors when playing CDs, no... and that's not even taking into account the not-very-good CD players, which often can't spin the disc at the perfect center, or just have a crappy laser, or any number of other things.

At least if you're going to talk about errors per second, use something capable of creating a reliable source signal to start with.

And don't use the crappy TOSLINK cable from your previous example that changed audio quality when you moved the cable around (here's a hint: it's broken).

And I've already said that of course more expensive cables are of higher quality, but at normal cable lengths of less than 3 meters it won't matter; a non-broken ~$15 cable will do the job. There are situations where you need a higher-end HDMI cable, of course, but that doesn't mean Monster; it means a quality cable from a company with normal prices at the same quality as Monster, or maybe even better.

zivan56    22
Have you ever seen a digital input report how many errors PER SECOND it's dealing with when playing a CD?

A CD is optical media, so the signal is analog before it becomes digital: the laser that reads the pits and the CD surface itself are analog devices. Either way, CDs use FEC (forward error correction), which adds redundant data to correct these errors, or at least detect them (which allows a re-read before the error becomes noticeable).
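The FEC idea above can be sketched with the simplest possible code: a 3x repetition code with majority voting. This is far cruder than the cross-interleaved Reed-Solomon coding CDs actually use, but it shows the same principle, that redundant data lets the reader fix isolated errors without a re-read:

```python
from collections import Counter

# Minimal forward-error-correction sketch: send every bit three times and
# take a majority vote on read-back. Illustrative only; real CDs use a
# much stronger cross-interleaved Reed-Solomon code.

def fec_encode(bits):
    return [b for b in bits for _ in range(3)]

def fec_decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        votes = Counter(coded[i:i + 3])
        out.append(votes.most_common(1)[0][0])  # majority fixes 1 flip per triple
    return out

data = [1, 0, 0, 1]
coded = fec_encode(data)
coded[0] ^= 1    # a scratch flips one bit
coded[7] ^= 1    # another isolated flip elsewhere
assert fec_decode(coded) == data   # both errors corrected silently
```

The cost is obvious: three times the raw data for single-error protection per triple, which is why real media use cleverer codes with far less overhead.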

shakey_snake    1
I just believe what my eyes and ears tell me over whatever 'science' that anyone else lays down.

Then you are a fool.

hdood    145
Yes, and the error correction takes some processing.

Yes, but how is that relevant to anything? It's not something you'll be able to see or hear, and not something that will affect the quality of the sound or picture.

And HDMI doesn't even use error correction; it relies solely on a combination of hardware and signaling features designed to minimize the kind of problems Kevin is talking about.

Look, I'm not going to argue this; you guys will never believe me, and that's fine. I'm not talking about the average Joe's signal, and I'm not saying Monster is the way to go. I'm saying simply that cables can make a difference to the signal. Will you see it? Maybe not. Will you hear it? Maybe not. But don't think that all cables are the same.

Yes, a cable can. In fact, I've stated this many times now.

But to talk about HDMI specifically, the "difference" it can make is limited to introducing errors (either internally or from external interference). If there are none, the signal is identical regardless of the price of the cable, and a more expensive one can do nothing to improve the quality. In virtually all cases, cheap cables carry the signal just as well as expensive cables (though the expensive ones may well be more resilient to noise, have better build quality, and last a thousand years... or be a cheap cable in a garden hose), which makes all this bickering over theoreticals pointless and the expensive cables a rip-off.

+Troll    67

There are a couple points to make here -

1) It was wrong to purposely deceive the consumer into thinking both setups had the same variables.

2) As much as I despise Monster cables and the inflated prices, it doesn't mean they were the ones to hook the system up and it could have been some minimum wage paid high-school kid with no A/V knowledge, who happened to use the only cable available.

With that being said, anyone who purchases a Monster cable is either stupid, uneducated or simply has money to burn. At the price point of Monster cables, there are many better quality cables if you are simply looking for "the best." For similar quality, a monoprice cable is going to work just fine.

Copied from another post of mine here:

Scientific tests have been done with oscilloscopes at Monster's labs by independent testers, showing no difference between cheap "no-name" cables and the Monster ones unless the distances are extreme (50 ft lengths).

I could get into my long history with home theater setups or technical background, but others much more versed on the subject have proven this many times over. It is no different than the perceived quality some people have of Bose speakers. You are paying for a name and marketing more than the product. The great thing about digital signals is that they are just 1's and 0's. The signal either works or doesn't. The only difference is the composition of the cables and the quality of materials used. Cables of a thicker gauge are generally better and can carry more data. Having better connectors can help too.

People are more than welcome to purchase a Monster cable if the placebo effect makes them feel nice and tingly inside. Honest people who have done the research cannot recommend them at that price, however, as the money could be better spent elsewhere.

I'd post links if I wasn't on my BlackBerry now, but if you'd like to get more info, just do a bit of digging on AVSForum and you'll see loads of info. Be prepared for scathing remarks and much laughter if you attempt any recommendation like that however, as a forewarning.

While I'm not on my BlackBerry now, here are some links:

Gizmondo HDMI tests Part 1

Gizmondo HDMI tests Part 2

HDMI Cables - Monster vs Monoprice

CNET HDMI Cable Review

The digital argument works, but you also must take into consideration other variables such as length, wire gauge, the type of materials used (aluminum, copper, silver), and so on. If you want to compare Monoprice's top-of-the-line cable to a Monster cable, here is (I believe) their current top of the line (Monoprice). It isn't cheap, but with silver-plated copper it will be of the highest composition quality widely available. Even so, the composition of the cable's materials isn't necessarily as big an issue as how those materials are used. The thing which, more than anything else, affects cable quality is control over manufacturing tolerances. The tolerances that matter include consistent and accurate wire diameter and profile, consistent dielectric material properties, consistent and accurate dielectric dimensions, consistent wire spacing and twist tightness in pairs, consistent shield application, and control over things like twist rate (important for interpair skew), that sort of thing.

Yes, there are differences in cables, and an expensive cable may be better built than a cheap one. A cheap cable may fail more easily or have less (but not necessarily inadequate) shielding.

But the degradation you notice with a bad HDMI cable will be very obvious, such as sparkles or partial loss of picture. It won't be a subtle degradation that requires a side-by-side comparison to detect a slight difference in picture quality.

Thus we have the general statement that, if an HDMI cable gives a normal-looking picture with no obvious artifacts, it's working as well as it can. A more expensive cable or more shielding won't improve the picture.

If you want to buy a more expensive cable for better construction or aesthetic reasons, that's fine. But a blanket statement that a more expensive cable will give a better picture is (possibly) true only in the specific case where a cheap cable is giving a very noticeably bad picture.

Well, "digital is digital," as you indicate. What this means is not that cable quality doesn't matter at all, but that cable quality will not matter in a small, gradual way. Think of it this way. For any given cable, judged over a range of lengths, there will be a "good" zone where, up to a certain distance, the cable performs adequately for the application. Within this range will be very short cables where there is almost no loss or degradation of the original signal waveform, and longer cables where the original waveform is significantly degraded but has not degraded enough to cause the receiving circuit to have trouble interpreting it. If you watch a TV hooked up with the short cable, and then with the long cable, within this "good" zone, you'll see no difference, even though there's a large difference between the waveforms being delivered to the display.

The better the cable quality, the wider this "good" zone will be. Silver-plated wire, for example, by decreasing resistance up to about 5%, will reduce attenuation, and this means the waveform delivered, for any given cable length, will be just a bit stronger. The amplitude of the waveform is one factor in determining whether the receiving device will be able to properly reconstitute the signal. Accordingly, we would expect that if a cable with a solid copper conductor will go X feet, an equivalent cable with a silver-plated copper conductor might go just a tad farther before falling out of the "good" zone.

In other words, it matters, but it only matters at the margins, near what is called the "digital cliff." Just prior to the digital cliff, the signal is degraded, but still perfectly reconstituted by the display. At the digital cliff, as we either diminish cable quality or increase cable length, the onset of failure is quite abrupt. A cable that works well at 50 feet might produce a blizzard of sparkles at 60, and there might be no picture at 75. The object of cable design is to move this digital cliff threshold as far from our practical use distances as possible.

With most digital cable applications, the digital cliff really doesn't kick in until the cable is rather long; this is true, for example, for CAT5 Ethernet, for professional digital video (SDI), and the like. Unfortunately, the HDMI standard was very badly designed, and the digital cliff for many cable types falls within the range of practical distances in ordinary use, at least when we start to talk about home video distribution, front projection systems, and the like. It is rare to have trouble at short distances (say, 15 feet or less) but not unheard-of.

So... after a long digression, the short-form answer to your question: at 3 feet, any competently made cable will perform exactly as well as any other, in practical terms. Were we to take a bunch of these cables and run detailed eye-pattern tests with a signal generator and scope, we would see quality differences, but those differences would not affect picture quality.
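The "good zone" and "digital cliff" described above can be sketched numerically. Every number here is assumed for illustration (the loss per foot, the starting signal-to-noise ratio, and the simple Gaussian-noise model are made-up round figures, not HDMI measurements); the point is only the shape of the curve: a bit-error rate that is effectively zero across the whole good zone, then abrupt failure as attenuation closes the eye.

```python
import math

# Toy "digital cliff" model: bit-error rate vs cable length.
# Assumed illustrative numbers, not real HDMI link-budget values.

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber(length_ft, loss_db_per_ft=0.2, snr0_db=25.0):
    """Attenuation grows linearly (in dB) with length; BER follows Q(SNR)."""
    snr_db = snr0_db - loss_db_per_ft * length_ft
    amplitude_ratio = 10 ** (snr_db / 20)
    return q_func(amplitude_ratio)

for ft in (3, 30, 60, 90):
    print(f"{ft:3d} ft  BER ~ {ber(ft):.2e}")
```

A 3 ft and a 30 ft run both sit so deep in the good zone that their error rates are indistinguishable from zero, which is why they look identical on screen, while pushing past the cliff makes the error rate explode over a short span of extra length.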

Glowstick    3
Yes, and the error correction takes some processing.

It's built into the digital transceiver chipset/solution. It's free.

Joel    27
CD is optical media

So DVDs, HD DVDs, and BDs are what?

Then you are a fool.

If that were true, the TV with the best spec would always win. And it doesn't.

ATGC    40
If that were true, the TV with the best spec would always win. And it doesn't.

People have budgets. Sorry if not everyone is as rich as you, able to buy $100 HDMI cables because they "feel" and "look" better than inexpensive ones.

Joel    27
People have budgets. Sorry, if everyone isn't rich as you, who are able to buy 100$ HDMI cables because "it feels" and "looks" better than inexpensive ones.

Have you read any part of this thread? I'm not rich, I've said Monster are overpriced and I've said my setup doesn't merit testing a better cable. What I have said is that the thinking of "1 cable is as good as the other, there's no possible difference" is a falsehood.

roadgeek9    0

It's kind of like Samsung being the official HDTV of the NFL. I have a Toshiba HDTV, and the NFL games look perfect on it. The main issue there is the channel. For example, games on FOX and NFL Network tend to be more pixelated than the games on NBC, CBS, and ESPN.

Also, my father and I got this $9 generic HDMI cable, which works perfectly with our cable box.

ATGC    40
Have you read any part of this thread? I'm not rich, I've said Monster are overpriced and I've said my setup doesn't merit testing a better cable. What I have said is that the thinking of "1 cable is as good as the other, there's no possible difference" is a falsehood.

It is not a falsehood for digital cables.

Have a read here: http://boardsus.playstation.com/playstatio...=828972#M828972

TheTempestSonata    4

I honestly think that's more about Best Buy's integrity than Monster's

DaDude    46
People have budgets. Sorry if not everyone is as rich as you, able to buy $100 HDMI cables because they "feel" and "look" better than inexpensive ones.

This is just one of the handful of disrespectful posts I found on this thread. I'm shocked to pieces. I understand if people disrespect someone like me, but to talk like this to a Neowin Supervisor is just beyond ridiculous.

TheTempestSonata    4

It's a forum... If I was banned for stating my beliefs, oh well... it's a web forum.

Rohdekill    775
I honestly think that's more about Best Buy's integrity than Monster's

And here I thought I was the only one who fully read the initial post. This has nothing to do with Monster, and how the poster even came up with that concept is beyond me.

Best Buy is to blame, and the sales guys will ALWAYS point the customer at the product with the highest profit margin, especially add-ons, since those have the highest markup.

Lexcyn    88

http://www.youtube.com/watch?v=tDw2ZSDzlMw

I still think this should end the discussion. It was tested with professional audio/video equipment. Sure, the $12 cable may fail sooner, and the Monster may have a lifetime guarantee, but if you can just buy another cable for $12, would you rather spend $24 or $260?

Just as an aside, I own 3 HDMI cables, and they are connected to a 52" LCDTV. One for HD Cable, one for PS3, and one for XBOX360. One cable is a Belkin, the other two are no-name cables purchased from eBay. None have ever died, all have perfect images. To be safe, I've placed a ferrite EMI filter on all of these cables so there is no chance of any interference.

I honestly don't know why everyone is still arguing. Just buy what you want, it's up to you to research or fall for the trap.

redvamp128    321

People used to ask what the difference in battery cables was (I used to work in an auto parts store). The only difference with most of the products is the brand name, other than some having more shielding. In most cases, though, it doesn't affect picture quality; it just may affect other devices.

zivan56    22
So DVDs, HDDVDs and BD are what?

You are arguing that they are themselves "digital devices." They are in fact, by themselves, analog... and all use FEC, so the error rate is not an issue unless it is over a certain percentage. An HDMI cable would not make a difference in such a case, as it transports the video signal after it is converted to digital, unless the cable has a bad connector or wire (or the signal is degraded by distance).

Glowstick    3

Going by that, hard disks and RAM are also analog. You're kind of making a silly case here. The data is stored digitally on it, so it's a digital medium. The last I remember is that it's all about pits and bumps and not sine waves.

FiB3R    1,663
Any audiophile would laugh in your face if you paid $5000 for a Bose system.

Well considering that their top end consumer system only costs $3,999 I suppose that would be pretty funny.

But then Audiophiles would spend that amount on a cartridge.

any audiophile will laugh in your face if you have anything BOSE -

BOSE are for people who don't know any better - kinda like Monster Cables

No highs ? No lows ? must be BOSE

Yep.

But wow! I didn't know that BOSE was so universally regarded as being crap. I had a little search and it definitely seems to be the general consensus. I was under the impression that they made quality products.

But surely they're not as big a rip off as MONSTER?

+Troll    67
Well considering that their top end consumer system only costs $3,999 I suppose that would be pretty funny.

But then Audiophiles would spend that amount on a cartridge.

But wow! I didn't know that BOSE was so universally regarded as being crap. I had a little search and it definitely seems to be the general consensus. I was under the impression that they made quality products.

But surely they're not as big a rip off as MONSTER?

They are both about equal, depending on how you look at it. I can get a system for about half of that $4,000 that is leaps and bounds better in quality and sound. A bit off topic, but yes, it is the general consensus. The Bose argument is similar to building a PC versus buying a generic HP or whatever from Best Buy: you can piece it together and customize it for less, without the name-brand overhead.

DaDude    46

Monster Cables definitely have good quality. The only problem is price. If you can afford them, great. So why put people down for buying them?

