Do digital cables really make a difference?



dave2010
07-03-2010, 09:28
I am convinced that different analogue cables, where needed, do sound different. But digital cables? Isn't this just snake oil stuff? On these boards there are claims for different optical cables, and sometimes for using digital coax instead of optical, etc. Why should it make a difference?

I accept that if a cable is very poor then there can be problems - perhaps particularly with copper cables, due to interference. With optical cables, surely as long as the attenuation along the cable isn't significant there shouldn't really be any problem that I can see. Regarding the input/output circuits of the two different transmission modes, again I can't see why these should make a difference at all. If digital data is sent over an optical link, and also over a copper link, then if the bits are the same at the sender, surely they should be the same at the receiver - subject to any errors along the way. Generally I'd expect errors to be lower with optical cables, but the error rate should be low with both types.

Perhaps I don't know enough about the SPDIF protocols to be sure, though. Does it have error detection/correction? Does it ask for resends? No - not according to http://en.wikipedia.org/wiki/S/PDIF#Protocol_specifications, which says "S/PDIF lacks flow control and retry facilities". The real problem seems to be that the receiver cannot control the data rate: "Because the receiver cannot control the data rate, it instead has to avoid bit slip by synchronising its conversion with the source clock. This means that S/PDIF cannot fully decouple the final signal from influence by the analogue characteristics of the source or the interconnect, even though the digital audio data can normally be transmitted without loss." This could result in jitter, and possibly corruption of the analogue output as a result.

Biphase mark coding (BMC) is used for the data, and unless rapid and frequent changes to the encoding frequency occur, surely it should be possible to determine the clock rate well enough, and thus to maintain clock synchronisation at the receiver with good circuits.
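
To make that concrete, here's a rough sketch of BMC in Python (my own illustration, not taken from any spec): the line level inverts at the start of every bit cell, and inverts again mid-cell for a 1, so every cell boundary gives the receiver a clock edge to lock on to.

def bmc_encode(bits, level=0):
    """Return the line level for each half-cell of the BMC stream."""
    out = []
    for bit in bits:
        level ^= 1            # transition at the start of every bit cell
        out.append(level)
        if bit:
            level ^= 1        # extra mid-cell transition encodes a 1
        out.append(level)
    return out

print(bmc_encode([1, 0, 1, 1, 0]))   # -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]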

I still find it hard to accept that different types of cable could be significant in influencing the sound, and that different cables of reasonable quality should sound different.

If they do, why?

DSJR
07-03-2010, 10:14
IIRC, back in the days when Paul Miller was the new dog's doo-dahs, he tested some cables and two-box players and found that there could be reflections at RF frequencies in the cables. The fact that many DACs of the period weren't true 75 ohm connections (the Meridian 600 two-box player was awful in this respect) made things worse, I recall.

In a perfect world - and *domestic* digital technology is pretty well OK now, I believe - the differences in cables designed for the job *should* be minimal (he says, hopefully)....

The thing is, some of the older digital technology, like the ancient Philips chipsets so beloved of many, had all sorts of RF and ultrasonic muck coming down the cables' screens, so there wasn't *just* the digital signal one was dealing with.

The above is just an opinion and pragmatic memories, so should be regarded as such...;)

Themis
07-03-2010, 11:12
I still find it hard to accept that different types of cable could be significant in influencing the sound, and that different cables of reasonable quality should sound different.

If they do, why?
Because of this: http://en.wikipedia.org/wiki/Reflection_%28electrical%29

The Vinyl Adventure
07-03-2010, 13:14
I know this isn't about HDMI, but HDMI is also a digital cable, so I guess it is worth pointing out. When we first got the PS3 in as a Blu-ray player in the shop, it had been reviewed as being better quality than the BDP-S1 and BDP-S300 (I think they were the current Sony players at the time, but don't quote me on that) in terms of picture quality.
A new member of staff wired it up to a TV right next to one wired up to a dedicated player. We switched it on and watched in horror as the PS3 churned out a picture with green/purple colour shift all over the place. We scoffed and commented on how it was yet another thing What Hi-Fi had got wrong. A couple of days later I was behind the TV and noticed that it had been wired in with an HDMI cable I had brought back from a customer's house - one of the shitty ones Sky use. I told Jamie, the store manager, and he yanked it out and chucked it in the bin, replaced it with a shielded Vivanco one, and gone was the colour shift. On that occasion we had to withdraw our complaints about the PS3; it was in fact a better player than the current standalone... it had just been let down by a shit digital cable!
I'd love to hear an explanation for that one. Do these reflections affect HDMI?

dave2010
07-03-2010, 14:22
Themis

Reflections are an interesting possibility for the differences noted. My first thought, though, was that the timings wouldn't work out. I still think that the timings should be OK, but note that with 96 kHz 2-channel operation at 24 bits per sample, and using biphase mark coding, the period between transitions could be as small as (approximately) 100 ns. A reflection travelling down and back a cable 5 metres long should take about 50 ns. It is therefore possible that this could cause problems, though it would depend on quite a few factors. If the reflections are significant then these would perhaps just represent a background noise level which the detector would have to differentiate from the wanted signal. Another problem would be that the PLL detectors might have difficulty locking on to the data clock. I would rather expect this to be an all-or-nothing thing - but then this is not always the case, as we know from DAB radio, which can become marginal, leading to bubbling mud noise.
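
To put numbers on that (my arithmetic, counting only the audio bits and ignoring S/PDIF frame overhead, and assuming a signal speed of about two-thirds of c in the cable):

line_rate = 96_000 * 2 * 24 * 2      # 24/96 stereo, doubled by BMC: 9.216 MHz
min_transition_ns = 1e9 / line_rate  # ~108 ns between transitions

v = 2 / 3 * 3e8                      # assumed propagation speed in the cable (m/s)
round_trip_ns = (2 * 5 / v) * 1e9    # 5 m cable, down and back: ~50 ns

print(f"{min_transition_ns:.0f} ns between transitions, "
      f"{round_trip_ns:.0f} ns reflection round trip")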

If reflections are the problem, then presumably the first line of attack is to minimise them; they probably occur at the end points, though they can also occur within the cable. I've not seen it mentioned, but it occurs to me that having greater attenuation along the cable might also help, as it would reduce the level of reflected signals relative to wanted ones. This would seem counter-intuitive to anyone used to thinking in terms of less attenuation. Timing problems would seem to be more significant in longer cables, though the ratio of wanted to unwanted signals may be better because of the attenuation effects.

If it really is an issue, then for optical fibre systems, would some form of optical gel at the joints improve things? This would have to have a refractive index close to that of the cables being joined (or maybe a lens if at the transducer), and preferably the cables being joined should have the same refractive index.
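
For what it's worth, the fraction of light reflected at a butt joint at normal incidence follows the Fresnel formula R = ((n1 - n2) / (n1 + n2))^2, so the gain from an index-matching gel is easy to estimate (the indices below are typical textbook values, not measurements of any real cable):

def fresnel_r(n1, n2):
    # Fresnel reflectance at normal incidence between two media.
    return ((n1 - n2) / (n1 + n2)) ** 2

print(f"fibre to air gap: {fresnel_r(1.49, 1.00):.2%} reflected per surface")
print(f"fibre to gel:     {fresnel_r(1.49, 1.46):.2%} reflected per surface")
# -> roughly 3.9% per surface across an air gap, ~0.01% with gel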

It should be possible to test all this, to see what the error rates really are, by transmitting a known signal pattern across a link and seeing how often errors occur. The rule of thumb I've generally taken for optical links is that the BER is around or less than one bit in 10^8, which would perhaps imply a 1-bit error every 11 seconds in a 24/96 2-channel bit stream encoded using BMC. Depending on where the error occurred, this might not be very significant, and a single-bit glitch might well be inaudible anyway.
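
The 11 second figure falls out like this (again counting only the audio payload doubled by BMC; real S/PDIF framing adds overhead, which would make errors slightly more frequent):

ber = 1e-8                          # assumed bit error rate for an optical link
line_rate = 96_000 * 2 * 24 * 2     # 24/96 stereo, doubled by BMC (bits/s)
print(f"one bit error every {1 / (ber * line_rate):.1f} s")   # ~10.9 s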

DSJR
07-03-2010, 15:46
I personally found that using a very low-loss 75 ohm cable designed for linear transmission of UHF signals tends to minimise any differences. RS used to stock a range of good cables, and as long as the phono plugs were low capacitance (not all were), all seemed well.

dave2010
07-03-2010, 16:01
Hamish

Perfectly happy to allow HDMI in. It may be the way forward for audio - who knows? I don't know too much about HDMI cables. There could be more snake oil there too, but I suspect there are poor ones, such as the one you describe, which are just a waste of time. Allegedly one of the best only costs about £5.

At the sort of data rates most of these cables run at, the protocols used nearly all work without resends, I think, and I suspect that they don't use FEC either. For audio work it would certainly be possible to use protocols with error correction codes, and FEC in the receiving device, but I don't think that's what happens - I'd love to be corrected on this. In this respect they differ from USB, which does run with more complex protocols and, I think, allows resends. However, more complex protocols themselves cause problems in real-time work and multimedia. If the resend request cannot be handled in time, then either playback has to resort to interpolation or error concealment at the playback device, or sometimes playback may just lock up altogether, along with the communications activity across the link.
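
To illustrate the kind of interpolation fallback I mean, here's a trivial sketch (entirely my own illustration - real players use fancier concealment): a sample known to be bad is replaced by the average of its neighbours rather than stalling playback waiting for a resend.

def conceal(samples, bad_index):
    # Replace a known-bad sample with the mean of its neighbours.
    prev = samples[bad_index - 1] if bad_index > 0 else 0
    nxt = samples[bad_index + 1] if bad_index + 1 < len(samples) else 0
    samples[bad_index] = (prev + nxt) // 2
    return samples

print(conceal([100, 120, 9999, 160, 180], 2))   # -> [100, 120, 140, 160, 180]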

I've certainly seen this happen with a USB cable, so that indicates that errors can occur on copper links running at up to 480 Mbit/s (USB 2.0 high speed). HDMI works at up to around 300 MHz for HD distribution, so I'd say that errors can occur, and would be disruptive. You seem to have had practical experience of this. Whether with a better cable there would still be errors which could affect quality, I'm not sure.

I'm not at present sure about the physical construction of an HDMI cable. It may have groups of wires, and the reason your cable failed may have been because one of the groups (presumably affecting colour) did not function correctly. There are also moves towards active HDMI cables, which reduce the need for so much copper content - see RedMere.

DSJR
07-03-2010, 16:12
USB isn't best for audio, as I understand it, but computer data is treated far better than CD audio, which may explain why ripped and HDD-stored music can be better than a straight CD player.

Themis
07-03-2010, 17:40
Reflections are an interesting possibility for the differences noted. My first thought, though, was that the timings wouldn't work out. [...]
In fact, reflections are not a problem in themselves. I mean: reflections alone. It's their messing with the receiver that counts.
Things become better or worse depending on the transport's rise time (typically 25 ns). If the cable length is 3 feet, then the propagation time is about 6 nanoseconds. Once the transition arrives at the receiver, the reflection propagates back, and the emitter then reflects it back to the receiver (total ~12 nanoseconds). So, as seen at the receiver, 12 nanoseconds after the 25-nanosecond transition started, we have a reflection superimposed on the transition. This is right about the time that the receiver will try to sample the transition, at about 0 volts DC. This is a problem.
I think a better explanation (than mine, as I'm not a specialist) can be found here: http://www.positive-feedback.com/Issue14/spdif.htm
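
Putting those numbers into code (the 2 ns/ft propagation delay is my assumption; it's typical for coax):

rise_time_ns = 25.0                       # transport rise time, from above
one_way_ns = 3.0 * 2.0                    # 3 ft at ~2 ns/ft: ~6 ns
echo_delay_ns = 2 * one_way_ns            # back to the source and out again: ~12 ns
slice_point_ns = rise_time_ns / 2         # receiver slices near mid-transition

print(f"echo lands {echo_delay_ns:.0f} ns after the edge arrives; "
      f"the receiver samples at about {slice_point_ns:.1f} ns")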

dave2010
08-03-2010, 10:41
USB isn't best for audio, as I understand it, but computer data is treated far better than CD audio, which may explain why ripped and HDD-stored music can be better than a straight CD player.

I didn't know too much about this until a year or so ago. USB does have protocols to ensure the data is good. Many computer network protocols use something called stop and wait, in which the data is checked, and a request for a resend is sent if it's not good. There are variants in which more data can be sent before resend requests are honoured. Most networks now work with asynchronous data packets, and the error checking can be very thorough - much better than a simple parity check. For multimedia use it's also possible to include redundant data and do error correction at the receiver, which reduces the need for resends. This is particularly useful for multicast streaming.
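
For anyone unfamiliar, stop and wait in miniature looks something like this (a toy sketch with a simulated lossy link, not any real network stack):

import random

def lossy_send(frame, loss=0.3):
    # Simulated link: the ACK fails to come back 30% of the time.
    return None if random.random() < loss else "ACK"

def send_reliably(frames):
    # Stop and wait: send one frame, wait for the ACK, resend on loss.
    for frame in frames:
        while lossy_send(frame) != "ACK":
            print(f"frame {frame!r} lost, resending")
        print(f"frame {frame!r} acknowledged")

send_reliably(["hello", "world"])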

Because of the checking and the possibility of resends, timing can be messed up rather a lot if these protocols are used with a device which doesn't have enough input buffering.

The feeds into DACs, with the exception of any with USB inputs, are essentially synchronous data links. Any errors cannot be corrected, and no requests for resend can be handled - that would in any case slow down the process and put glitches in the sound.

How a particular DAC with USB input handles audio will depend on the drivers used, how the data is coded at the sender, and how the receiver at the DAC decodes the data. The data should be high-integrity with good drivers, but there will be a delay across the link if sufficient buffering is used to ensure that any errors can be corrected to a high degree. Several of the currently available DACs only support 16-bit audio via their USB input, and the sampling rate may be limited to 48 kHz or 96 kHz, which is good enough for most CDs, but not for higher-quality sources, such as some master-level recordings which are available in 24-bit/192 kHz formats.


ripped and HDD stored music can be better than a straight CD player

The data handling and communications can be better, but to get the sound there still needs to be an output device. This may often be via a DAC, and it should be possible to drive a DAC from a CD player to give very good results. In theory a computer should not be able to do any better, as the data is exactly the same. If you are referring to the analogue outputs of a CD player, then it is possible that a computer output could be better. Things sometimes fall down in the detail though, and jitter in the output from a CD or DVD transport can spoil the sound if not corrected. Some DACs can do this, and of course a better CD or DVD transport would avoid the problem.

Shifting from SPDIF to USB moves the problem. If the input to a DAC is limited to 16 bit, then this can affect quality. For a CD source ripped to computer files, there should be no loss of resolution, as the data starts out as 16 bit. There could be a loss of resolution with an HDCD, as this is 20-bit encoded data, and similar considerations apply to DVD-A, where 24-bit resolution may be available. Some compressed music files, such as mp3 and aac, may have an effective bit resolution greater than 16 bits, depending on how they were made, which is perhaps why they don't always sound so good if converted to CD. SACD is generally a pain because most players block the SPDIF output when playing SACDs, though it is possible to get an output via HDMI. It may be that some players can convert SACD output to LPCM, but usually they block the digital output thanks to the madness of a music industry more concerned about the possibility of music being ripped off by rogues than about legitimate listeners being able to hook up equipment and get it all working. A real own goal there for a format which seems likely to die anyway.

Regarding the possible limitations on sampling frequency for USB inputs to DACs, for normal use an upper limit of 48 kHz should not present problems, but it may be that better results could still be obtained with higher sampling rates. Some DVD-A discs recorded at 192/24 and ripped to computer files would have to be downsampled in order to work.

Whether the limitations of USB inputs to DACs at present are a real problem I'm not sure. The results with DACs such as the DACMagic and Beresford Caiman don't seem to be vastly inferior - though there is a lot of subjectivity there. I am however inclined to believe that at the moment a good SPDIF feed still beats USB, but future systems will probably improve on that with USB and HDMI inputs.

Haselsh1
08-03-2010, 20:51
Tizz funny this thread should come up at this point in time. This very evening I have been swapping between a 75 ohm coax and an optical cable into my new Caiman. I have been listening to Tina May and Cassandra Wilson, and the difference between the two types of interconnect is quite distinct. I prefer, by a long way, the 75 ohm coax, as it displays much better depth and space to the sound and has no dropouts, unlike the optical cable. My decision is now made... ditch the optical cable in my system.

Stratmangler
08-03-2010, 20:58
Shaun

I'm wondering if there is a problem with your Toslink connection, seeing as you suffer from dropouts.
I use a 5 m Toslink to connect the Sky+ box to the Caiman, and it works faultlessly.

Chris:)

Haselsh1
08-03-2010, 21:12
Strat, I suspect it's cheap cable syndrome, as I bought them both just to see if there was a difference. Now I have decided on the coax, I shall buy a very decent cable from that guy who's talked about a lot on here... ah, I remember, Mark Grant...! Nearly forgot.

Themis
08-03-2010, 21:17
MG coax -> perfect choice! ;)

Stratmangler
08-03-2010, 21:18
Are you talking about one of these (http://markgrantcables.co.uk/shop/index.php?main_page=product_info&cPath=40_3&products_id=3) or one of these (http://markgrantcables.co.uk/shop/index.php?main_page=product_info&cPath=40_3&products_id=183)?

They both utilise crimped connections.

Chris:)

Themis
08-03-2010, 21:20
I prefer Belden. ;) Didn't test the others.

Haselsh1
09-03-2010, 10:18
I was talking about the G1000HD cable at thirty-odd quid. Is the cheaper cable better...?

Stratmangler
09-03-2010, 11:05
I've not used either Shaun, so I can't comment.
Both cables utilise the same connectors and termination method, and the spec of the Belden cable is very good. From what I've read on the MGC website, the spec of the G1000HD is also very good.

I think that this comment is very telling - I lifted it from the MGC website.

The only way to know if you will like how this works in your system is to try it at home, you are welcome to return for refund within 30 days any cables you decide not to keep.

No G1000HD cables have ever been returned so far.


Chris:)

technobear
09-03-2010, 23:01
I've not used either Shaun, so I can't comment. [...] I think that this comment is very telling - I lifted it from the MGC website.

No G1000HD cables have ever been returned so far.

Chris:)

Is that still true? Didn't Jerry send some back? Or did he sell them on? :scratch:

Stratmangler
10-03-2010, 10:43
Is that still true? Didn't Jerry send some back? Or did he sell them on? :scratch:

I've just trawled through the G1000HD thread, and you're right, Jerry didn't like 'em enough to keep 'em.
However, they were not returned to Mark Grant - they were moved on to one of the members here.

Chris:)