HiFi DACs with USB vs Pro Audio DACs



swampy
08-04-2010, 18:12
Do any of the more popular audiophile hi-fi DACs with USB inputs, like the ones mentioned on this forum, come with properly written low-latency ASIO drivers?

It seems that there is quite a difference between the software for audiophile/hi-fi DACs (or the lack of it) and that for pro-audio DACs. I think nearly all hi-fi DACs state that Windows supports the DAC via USB natively, and no custom drivers are supplied. If you buy a good pro-audio DAC that includes a USB input among the usual others, it always comes with custom-written ASIO drivers, and the very best, like M-Audio and Tascam, have very well-written, stable drivers that offer ultra-low latency in the tens of milliseconds, not the several hundred milliseconds that native Windows drivers offer. Bad drivers, or a lack of drivers, would mean a failed product in the pro-audio world.

Anyone who has dabbled with a DAW, MIDI or PC audio knows that the drivers are absolutely critical; otherwise you get serious timing errors over USB and, at worst, pops and clicks in the audio stream. Yes, I know USB sucks and FireWire is the better external method.

I know there are some third-party USB ASIO drivers about, but when you do some digging these turn out to be poor and buggy, especially compared to drivers written specifically for a product.

Is this why USB is not considered an option for audiophile use? If you have a laptop, then besides streaming to an SB or PS3, what other direct connection methods are available?

I cannot get my head round the reason for the difference. If one day I go the PC audio route, I would like the option of a direct connection from, say, a laptop to a DAC, and then drivers become a problem. Maybe that is why I stick with CD and just burn any downloads.

dave2010
08-04-2010, 20:07
I don't think USB has to be bad, but it may depend on how it's used. If a device such as a DAC has enough buffering memory, then, assuming that buffering delays aren't a problem for users, clocking issues need not be important. It's possible that many DACs aren't designed that way. The fairly standard S/PDIF is basically a synchronous communications protocol for consumer use, based on the AES/EBU system. There is no option to resend data. Data is organised into blocks of 192 frames. See http://en.wikipedia.org/wiki/S/PDIF There will be fairly standard components for receiving S/PDIF data.

USB is much more complex. See http://en.wikipedia.org/wiki/USB It should be possible to use USB 2 or USB 3 (even bog-standard USB might do) to keep the buffers in a receiving device fairly full, and then a device such as a DAC could play back the data at a rate controlled by its own clock. This might require device-specific drivers and protocols, and that may be a problem for designers and vendors. Some designers may prefer to use standard components and keep away from software. In any case, putting processing capability in a device adds to complexity and cost, though some high-end devices are expensive enough that it might be worthwhile.
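
To make that buffering idea concrete, here is a minimal Python sketch (purely illustrative, not how any particular DAC is built): the PC pushes packets into a FIFO whenever the bus lets it, while the "DAC" drains the FIFO at a rate set only by its own local clock, so irregular delivery timing never reaches the output.

# A minimal sketch of the buffering idea above: the host delivers packets with
# uneven timing, the "DAC" consumes them on its own fixed clock.

import queue
import threading
import time

FIFO_FRAMES = 8                      # hypothetical buffer depth, in packets
fifo = queue.Queue(maxsize=FIFO_FRAMES)

def host_sender():
    """Simulates the PC: delivery timing is irregular (bus contention, OS jitter)."""
    for packet in range(32):
        fifo.put(packet)                      # blocks when the FIFO is full: natural flow control
        time.sleep(0.001 * (packet % 3))      # deliberately uneven timing

def dac_player():
    """Simulates the DAC: consumes packets on a fixed local clock, immune to host jitter."""
    period = 0.002                            # one packet every 2 ms, set by the DAC's own oscillator
    next_tick = time.monotonic()
    for _ in range(32):
        next_tick += period
        packet = fifo.get()                   # an underrun would block here (an audible dropout in real life)
        print(f"played packet {packet}, fill={fifo.qsize()}")
        time.sleep(max(0.0, next_tick - time.monotonic()))

threading.Thread(target=host_sender, daemon=True).start()
dac_player()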

At present my understanding is that S/PDIF is generally considered better, but there are now DACs which work rather well over USB, and it's quite possible that in a few years USB will be the preferred choice.

Vincent Kars
08-04-2010, 20:28
First of all, what is the use of low latency?
This is mostly a problem for musicians: if the delay is too big they start to make timing errors.
ASIO was originally developed by Steinberg for recording purposes, as Windows audio at the time had far too high a latency.
The second advantage is that it talks straight to the sound card, so no OS mixer or sample-rate converter (K-mixer!) kicks in.
Today, Windows WASAPI (Vista and higher) does the same.
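
For anyone wondering where the "tens versus hundreds of milliseconds" figures come from: latency is essentially buffer size divided by sample rate. A quick back-of-the-envelope sketch (the buffer sizes below are illustrative, not measurements of any driver):

import math  # not strictly needed, kept for clarity if you extend the sketch

def buffer_latency_ms(frames: int, sample_rate: int) -> float:
    """One-way latency contributed by a buffer of `frames` samples per channel."""
    return 1000.0 * frames / sample_rate

# A typical ASIO setting: 256 frames at 44.1 kHz is only a few milliseconds.
print(buffer_latency_ms(256, 44100))    # ~5.8 ms
# A large stock-driver buffer, say 8192 frames, lands in the hundreds of ms range.
print(buffer_latency_ms(8192, 44100))   # ~185.8 ms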

USB audio comes in three flavours:
• Synchronous mode is very prone to jitter.
• Adaptive mode is less prone to jitter and is used by most of today's USB audio devices.
• Asynchronous mode: the audio device does the clocking and tells the PC to regulate its speed. As the clock runs on the audio device, this can be implemented so that the DAC receives a signal with very low jitter.
Although asynchronous is considered best, the result depends very much on the quality of the clock on the DAC side (a toy comparison of adaptive and asynchronous clocking is sketched below). Opto-couplers are also used to shield the DAC from the noise on the USB bus.
http://thewelltemperedcomputer.com/KB/BitPerfectJitter.htm
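
To illustrate the clocking difference, here is a toy numerical sketch (not a model of any real receiver chip): in adaptive mode the DAC derives its sample clock from the average arrival rate of the packets, so some of the bus timing jitter leaks into the audio clock, whereas in asynchronous mode the DAC's own oscillator sets the pace and the arrival jitter is irrelevant.

import random

NOMINAL_PERIOD_MS = 1.0                 # USB full-speed frames nominally arrive every 1 ms
arrivals = [NOMINAL_PERIOD_MS + random.uniform(-0.05, 0.05) for _ in range(1000)]

# Adaptive: the output clock tracks a smoothed estimate of the arrival period,
# so part of the arrival jitter leaks into the audio clock.
estimate = NOMINAL_PERIOD_MS
adaptive_periods = []
for a in arrivals:
    estimate = 0.99 * estimate + 0.01 * a    # simple first-order tracking loop
    adaptive_periods.append(estimate)

# Asynchronous: the output clock is the DAC's own oscillator, completely flat.
async_periods = [NOMINAL_PERIOD_MS] * len(arrivals)

adaptive_spread = max(adaptive_periods) - min(adaptive_periods)
async_spread = max(async_periods) - min(async_periods)
print(f"adaptive clock wobble: {adaptive_spread*1000:.2f} us, "
      f"async clock wobble: {async_spread*1000:.2f} us")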

You might have a look at products like the Ayre QB9

swampy
08-04-2010, 21:17
First of all, what is the use of low latency?


Well, originally for sampling vocals/instruments into a PC/Mac DAW via an ADC, I guess... which can then be recorded and/or processed in real time by a VST in software before the resulting master track is sent back out through the same DAC box to the amp and speakers for monitoring. So it needed to be very fast, as you say, to stop timing problems, since it is a two-way process.

The reason for the original question was that I recently had a USB-based DAC on loan (not mine), and the connection to the PC via USB was very poor quality: scratchy, odd pops and clicks. It sounded like the typical timing problems you get in a DAW setup with bad or no drivers. Win7 picked the DAC up fine using 'native' drivers. The same DAC via coax from a CDP sounded vastly better. Night and day difference. I run Linux as my main OS but did not try it on that. I'm not sure how Linux copes with USB DACs, but Linux does have an API called JACK which is for low-latency audio.

I have just noticed that some of the new USB-to-coax devices like the HiFace supply custom Windows drivers on their website, so maybe manufacturers are catching on.

I don't know much about WASAPI.

Thanks for the info re USB modes.


Found this interesting info...

http://www.audioasylum.com/forums/pcaudio/messages/7719.html

and this DAC.
http://www.computeraudiophile.com/content/Wavelength-Audio-Proton-Asynchronous-USB-DAC-Review

So really we want a USB DAC with asynchronous mode?
Do those USB-to-RCA adaptors solve any of these issues with USB?

Vincent Kars
08-04-2010, 21:34
scratchy, odd pops and clicks.

This is a common complaint: shared bandwidth = limited bandwidth.
Unwittingly, you might have connected the USB DAC and the screen or the mouse to the same internal USB hub. Using the right port can make a difference: http://thewelltemperedcomputer.com/SW/Windows/Win7/USBDAC.htm

Other culprits are antivirus software polling the internet at high priority, etc.
Performance Monitor and DPC Latency Checker are valuable tools in these cases.

My knowledge of Linux is very limited, but you can get a DAC to work simply using ALSA: http://thewelltemperedcomputer.com/Linux/USBAudio.htm
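
For what it's worth, a minimal sketch of the "simply using ALSA" route (the card number is hypothetical; run aplay -l first to see what your USB DAC enumerates as):

import subprocess

# List the sound cards ALSA can see; a USB DAC shows up as its own card.
subprocess.run(["aplay", "-l"], check=True)

# Play a WAV straight to card 1, device 0, so no software mixer or resampler
# sits in the path. Replace "hw:1,0" and the filename with your own values.
subprocess.run(["aplay", "-D", "hw:1,0", "test_44100_16bit.wav"], check=True)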

swampy
08-04-2010, 22:21
Just found this about a pro audio DAC (E-MU).

"Drivers use reliable streams with retransmission. What this means is that similar to async USB the clock resides in the device and the PC is simply a data source slaved to that clock under the control of the driver. No worries about jitter."

So maybe those custom drivers are doing something else, not just providing ASIO? A bit beyond me. I think that is what I am trying to get at with the original question: are custom drivers doing something else to get round the limitations of USB DACs / audio?

dave2010
09-04-2010, 02:14
Well, originally for sampling vocals/instruments into a PC/Mac DAW via an ADC, I guess... which can then be recorded and/or processed in real time by a VST in software before the resulting master track is sent back out through the same DAC box to the amp and speakers for monitoring. So it needed to be very fast, as you say, to stop timing problems, since it is a two-way process.

The reason for the original question was that I recently had a USB-based DAC on loan (not mine), and the connection to the PC via USB was very poor quality: scratchy, odd pops and clicks. It sounded like the typical timing problems you get in a DAW setup with bad or no drivers. Win7 picked the DAC up fine using 'native' drivers. The same DAC via coax from a CDP sounded vastly better. Night and day difference. I run Linux as my main OS but did not try it on that. I'm not sure how Linux copes with USB DACs, but Linux does have an API called JACK which is for low-latency audio.

I have just noticed that some of the new USB-to-coax devices like the HiFace supply custom Windows drivers on their website, so maybe manufacturers are catching on.
I wasn't expecting anyone to really worry too much about latency. Most consumers and end users probably won't mind a little start-up delay, but the kind of applications you're considering may require latency to be reduced as much as possible. For example, if you were adding an extra track to a real live performance these issues could be really serious, though that would be unusual. For a recording, the extra track(s) could be offset relative to the other tracks, provided the delays are consistent. I don't know how often people want to do this sort of thing, though I have heard of a few cases.

One in particular is in performances of Aida at the Royal Opera House, where the offstage brass in one scene used to be played by a group of players in a room over a pub on the other side of the road. I think they listened to the action, and also watched on video monitors, and they managed to time their entries pretty much to perfection, as I recall. In truth, I saw one of those productions and thought the offstage band were great. It was only years later that I read that I was actually hearing a fairly hefty set of speakers replaying the sound of the live musicians. I think the pub has since been demolished, as the ROH upgrade took in quite a lot of the surrounding land.

Another example would be Saint-Saëns's Organ Symphony, in the event that it gets performed in a hall without an organ. This has been done quite a few times on recordings, though I've not heard of it being done live.

Re the cable issues with USB, I can well believe there are some. I have a Humax Freeview box, and it's supposed to be feasible to download video off it to a computer. It can be pretty unreliable, and the quality of the cable and the cable path can make a big difference. I think the problem with that device is the driver supplied by Humax, which doesn't handle errors well and often locks up. With USB it should be possible to obtain perfect data transfer by using resends, but that can be a problem in real time. It may be necessary to limit the number of resends, use error-correcting codes, or just tolerate (hopefully occasional) errors in the data for real-time transfer over USB.

Notwithstanding that, the fact that I got completely different results when transferring files from my PVR to my computer with different cables, and even by re-routing the cable slightly, indicates to me that errors do occur in transfers through USB cables, and without a strategy to handle them there will inevitably be quality reductions in any audio data transferred that way.

swampy
09-04-2010, 11:08
Is that why some people say USB cables make a difference? Because of the way most hi-fi USB DACs work in non-async mode, they are more prone to jitter and data transmission problems, so the USB cable would make a difference. Whereas in async mode, would the USB cable quality not matter as much?

So are the custom drivers supplied with pro-audio DACs doing something else regarding data transmission? ... whereas audiophile DACs don't generally come with custom drivers (maybe due to the extra development cost) and instead rely on native drivers and perhaps the new WASAPI layer in Vista/Win7.

Vincent Kars
09-04-2010, 11:19
Maybe this link is of use: http://www.tech-pro.net/intro_usb.html

swampy
10-04-2010, 12:32
OK, after more digging: there is much hype around asynchronous USB converters in the hi-fi world, as usual I guess.


It would seem that such converters are not rare in the pro audio sector; they just don't make a big fuss of it. The Tascam US-144 ($149) is asynchronous, as is the E-MU 0404 ($199), as just 2 examples; both are cheap, and the Tascam looks very well made.



http://www.tascam.com/products/us-144mkII.html


The "driver-less" converters (the ones that use built-in USB Audio drivers) are usually adaptive and require hardware programming to work asynchronously so most stick with the easier and cheaper methods. However in the pro-sector, where the manufacturer usually provides custom drivers, the asynchronous approach is far more common !!


So maybe a good quality pro-audio DAC is an option.

Vincent Kars
10-04-2010, 14:03
as just 2 examples

If you find more examples, let me know, because these are the only two brands in the pro sector I know of making asynchronous USB interfaces.

Themis
10-04-2010, 22:15
Is that why some people say USB cables make a difference? Because of the way most hi-fi USB DACs work in non-async mode, they are more prone to jitter and data transmission problems, so the USB cable would make a difference. Whereas in async mode, would the USB cable quality not matter as much?

I think there is a misunderstanding.

What most call "asynchronous" USB is, in fact, an isochronous transfer using an asynchronous endpoint, just as "adaptive" is an isochronous transfer using an adaptive endpoint.
In both cases (asynchronous or adaptive endpoints) an explicit sync mechanism (a 3-byte packet) is needed to maintain synchronisation during transfers.

The protocol stipulates that:

Asynchronous isochronous audio endpoints produce or consume data at a rate that is locked either to a clock external to the USB or to a free-running internal clock. These endpoints cannot be synchronized to a start of frame (SOF) or to any other clock in the USB domain.
and that:

Adaptive isochronous audio endpoints are able to source or sink data at any rate within their operating range. This implies that these endpoints must run an internal process that allows them to match their natural data rate to the data rate that is imposed at their interface.

In other words, the word "Asynchronous" in USB Audio has nothing to do with a buffer or any retry operation.
So, the transmission problems are identical.
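
For the curious, here is my reading of what that 3-byte sync packet carries on a full-speed asynchronous endpoint: the device reports how many samples per 1 ms frame it wants, as a 10.14 fixed-point number. Treat the details as my interpretation of the USB Audio 1.0 spec, not vendor documentation.

def encode_feedback(sample_rate_hz: float) -> bytes:
    """Encode samples-per-frame as a 10.14 fixed-point, 3-byte feedback value."""
    samples_per_frame = sample_rate_hz / 1000.0            # a full-speed frame is 1 ms
    fixed_point = round(samples_per_frame * (1 << 14))     # 10.14 format
    return fixed_point.to_bytes(3, byteorder="little")

def decode_feedback(raw: bytes) -> float:
    """Decode the 3-byte value back into samples per frame."""
    return int.from_bytes(raw, "little") / (1 << 14)

pkt = encode_feedback(44100)              # a DAC asking for 44.1 samples per frame
print(pkt.hex(), decode_feedback(pkt))    # -> approximately 44.1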

technobear
11-04-2010, 20:13
OK, after more digging: there is much hype around asynchronous USB converters in the hi-fi world, as usual I guess.


It would seem that such converters are not rare in the pro audio sector; they just don't make a big fuss of it. The Tascam US-144 ($149) is asynchronous, as is the E-MU 0404 ($199), as just 2 examples; both are cheap, and the Tascam looks very well made.

http://www.tascam.com/products/us-144mkII.html


Can you tell us where it is stated that this Tascam device uses asynchronous USB, as I can find no mention of it on the page you linked to or in the user manual for the device?

Unless it is explicitly stated somewhere, you can assume that this is an adaptive USB device. The driver it comes with is there to support other features that are not found on a generic USB sound card.

dave2010
11-04-2010, 21:42
I think there is a misunderstanding.

What most call "asynchronous" USB is, in fact, an isochronous transfer using an asynchronous endpoint, just as "adaptive" is an isochronous transfer using an adaptive endpoint.
In both cases (asynchronous or adaptive endpoints) an explicit sync mechanism (a 3-byte packet) is needed to maintain synchronisation during transfers.


In other words, the word "Asynchronous" in USB Audio has nothing to do with a buffer or any retry operation.
So, the transmission problems are identical.

I don't know all the details of the USB Audio 2 system, but I suspect that asynchronous does imply packet transmission, and the possibility of buffering with retries. This will give some latency. The retries would be necessary in order to cope with error situations. I have no idea whether the audio 2 spec. allows for error correction.

The playback rate at the receiver will be under the control of either a local clock or an external clock (at the sender). With a local clock there will be no jitter induced by the communications link, but handshaking between the sender and receiver will be required to keep the buffer sufficiently full and to avoid buffer overflow; with an external clock, the receiver will have to adjust its own clock rate to match.
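
A purely illustrative sketch of that handshaking idea (no real driver is claimed to work this way): the receiver drains its buffer at its own fixed rate and periodically asks the sender for a delivery rate nudged toward a target fill level.

PLAY_RATE = 100.0      # packets/s consumed by the DAC's local clock
TARGET_FILL = 50.0     # desired buffer occupancy, in packets (hypothetical)
PERIOD = 0.1           # seconds between feedback messages
GAIN = 2.0             # how aggressively the receiver corrects the fill level

buffer_fill = 20.0     # start with the buffer half-starved
requested_rate = PLAY_RATE

for step in range(15):
    # One control period: packets delivered at the last requested rate,
    # packets drained at the DAC's own rate.
    buffer_fill += (requested_rate - PLAY_RATE) * PERIOD

    # Feedback message: nominal rate plus a correction toward the target fill.
    requested_rate = PLAY_RATE + GAIN * (TARGET_FILL - buffer_fill)

    print(f"step {step:2d}: fill={buffer_fill:6.2f}  requested rate={requested_rate:7.2f}")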

Mac OS X systems use USB Audio 2 from 10.6 onwards. Windows systems seem more problematic. USB does allow for connection to real-time operating systems, which permit closer control over the data flows, but I think that neither Mac OS X nor Windows falls into this category, so what happens in practice is somewhat of a compromise. It seems unlikely that we'll all be using real-time OSs for this in the near future, though.

Most implementations will presumably use the standard audio capabilities, though there is nothing to stop an innovative implementer using other transfer modes and developing error detection, retry and error-correction approaches. This would, however, mean that each data transfer system could become device- or vendor-specific, which is something the standards approach was trying to avoid.

See more at http://www.beyondlogic.org/usbnutshell/usb4.htm and/or http://users.ece.utexas.edu/~valvano/EE345M/view08.pdf

Themis
12-04-2010, 13:29
I don't know all the details of the USB Audio 2 system, but I suspect that asynchronous does imply packet transmission, and the possibility of buffering with retries. This will give some latency. The retries would be necessary in order to cope with error situations. I have no idea whether the audio 2 spec. allows for error correction.
No. No buffering and no error recovery (retries) exist in the USB Audio device protocol.

USB Audio is an isochronous protocol. There's nothing "asynchronous" about it. When the sender's transmit rate is slaved to the client clock, it is called "asynchronous". When it can be slaved to any clock in the USB range, it is called "adaptive".

USB 2.0 has no particular changes compared to 1.0: the main difference is the bandwidth, but the protocol is still the same.
Details of V1 here: http://www.usb.org/developers/devclass_docs/audio10.pdf
and of V2 here: http://www.usb.org/developers/devclass_docs/Audio2.0_final.zip

Jitter is something more complex, as I see it. You have to understand that there's no such thing as a "handshake" in isochronous protocols. ;)
The only way of using a "handshake" (and, thus, error recovery and retries) is using a bulk transfer protocol (like the one used for transferring files, for instance). But in that case your device is no longer an "audio" one, and you are creating a totally proprietary protocol.
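
To make the "proprietary reliable stream over bulk transfers" idea concrete, here is a deliberately simplified Python simulation of sequence numbers, checksums and retransmission. It only illustrates the principle described above; it is not the E-MU driver or any other real implementation.

import random
import zlib

def send_packet(seq, payload):
    """Simulate an unreliable bulk pipe: roughly 20% of packets never arrive."""
    if random.random() < 0.2:
        return None
    return {"seq": seq, "crc": zlib.crc32(payload), "payload": payload}

def reliable_send(stream):
    """Stop-and-wait: keep retransmitting each packet until it arrives intact."""
    received = []
    for seq, payload in enumerate(stream):
        while True:
            pkt = send_packet(seq, payload)
            if pkt is not None and pkt["crc"] == zlib.crc32(pkt["payload"]):
                received.append(pkt["payload"])   # the 'device' acknowledges, the host moves on
                break
            # no acknowledgement (lost or corrupt), so the host sends it again
    return received

audio = [bytes([i]) * 4 for i in range(10)]       # ten dummy "audio" packets
assert reliable_send(audio) == audio
print("all packets delivered despite a lossy link")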

dave2010
12-04-2010, 19:17
Dimitri

I bow to your superior knowledge in this matter.
No. No buffering and no error recovery (retries) exist in the USB Audio device protocol.

USB Audio is an isochronous protocol. There's nothing "asynchronous" about it. When the sender's transmit rate is slaved to the client clock, it is called "asynchronous". When it can be slaved to any clock in the USB range, it is called "adaptive".

USB 2.0 has no particular changes compared to 1.0: the main difference is the bandwidth, but the protocol is still the same.
Details of V1 here: http://www.usb.org/developers/devclass_docs/audio10.pdf
and of V2 here: http://www.usb.org/developers/devclass_docs/Audio2.0_final.zip

Jitter is something more complex, as I see it. You have to understand that there's no such thing as a "handshake" in isochronous protocols. ;)
The only way of using a "handshake" (and, thus, error recovery and retries) is using a bulk transfer protocol (like the one used for transferring files, for instance). But in that case your device is no longer an "audio" one, and you are creating a totally proprietary protocol.

That would, however, imply to me that errors can occur, as I don't believe that USB cables are always that good, and the likelihood of errors must increase on long runs; presumably there's no error correction either.

I can see ways of making this work, but as you say, that'd be with bulk protocols, and would then be wholly proprietary.

So, how good is audio via USB, generally? I think we've all used it, but if we're worried about every dropped bit then this gives us something to chew on. Do receiving devices take any action to reduce possible problems? Maybe they don't have enough information to work on?

Do we really only have a choice between a few transmission systems, each with its own possible problems - S/PDIF (coax), Toslink (similar but optical) and USB? Maybe in the "real" world these are good enough for most purposes?

Vincent Kars
12-04-2010, 20:36
On the one hand, USB audio is pretty much like S/PDIF:
- unidirectional (no error checking)
- real time

In the case of S/PDIF the transmission rate is tied to the sample rate (OK, twice it, as it is biphase-mark coded), so any clock jitter at the sender will translate into input jitter.
The USB bus itself runs at a fixed speed, like any normal computer bus.
The transmission itself is therefore jitter free, as the rate of the bus is not used to generate the sample rate.
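
A small sketch of the biphase-mark coding mentioned above, just to show why the line toggles at twice the bit rate and how the clock ends up embedded in the signal: there is a mandatory transition at every bit-cell boundary, plus an extra mid-cell transition for a 1.

def biphase_mark_encode(bits, level=0):
    """Return two half-cells (line levels 0/1) per input bit."""
    out = []
    for bit in bits:
        level ^= 1              # mandatory transition at the start of every bit cell
        out.append(level)
        if bit:
            level ^= 1          # extra mid-cell transition encodes a 1
        out.append(level)
    return out

print(biphase_mark_encode([1, 0, 1, 1, 0]))
# -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0] starting from level 0: the receiver can
#    recover the clock from the guaranteed start-of-cell transitions.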

However, as it is real time, the DAC does have to sync itself to the speed of the incoming data (the average data rate; adaptive mode).

A true error means bit flipping: the receiver reads a zero instead of a one, or vice versa.
I think this is rare, and a bit = 6 dB, so it is in general clearly audible (pops and clicks).
In the case of USB this is most of the time not the cable but all kinds of activity on the PC side.

OK, let's forget these silly buses and go the Ethernet way.
TCP/IP is rock solid: bit-perfect transmission guaranteed. If not, a retry.
Guess what: streaming audio in general uses UDP, a lightweight protocol. If a packet is dropped, there is no retry! The assumption is that this is rare and, if it happens, no harm done (it is only audio).
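
A minimal sketch of that fire-and-forget behaviour (the address, port and packet size are made up for illustration): UDP datagrams go out with no acknowledgement and no retry, so a dropped packet is simply gone; at best the receiver can use the sequence number to notice the gap.

import socket
import struct

DEST = ("192.168.1.50", 5004)            # hypothetical streamer address and port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

for seq in range(100):
    payload = struct.pack("!I", seq) + bytes(1024)   # sequence number + dummy audio data
    sock.sendto(payload, DEST)           # fire and forget: no ACK, no retransmission
sock.close()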

Now that I have 'proven' that S/PDIF is horrible stone-age stuff, that USB is only partly better, that UDP has its limitations too, and that high-quality digital audio is therefore impossible, it is time for some practice.

I don't think any of it really matters. It is not the principle (S/PDIF, USB, UDP) we hear but the implementation. My experience is that all of them can deliver good sound quality when implemented right.
In practice, true transmission errors (anything happening to the signal on the cable) are rare.

BTW: Dimitri thanks for your contribution, it is the kind of technical insight I highly appreciate.

technobear
12-04-2010, 20:38
Do we really only have a choice between a few transmission systems, each with its own possible problems - S/PDIF (coax), Toslink (similar but optical) and USB? Maybe in the "real" world these are good enough for most purposes?

No, there is also TCP/IP, a protocol which guarantees bit-perfect transfer.

I don't think I'll be swapping my Transporter for a USB DAC just yet :)

Themis
12-04-2010, 20:46
Well, I didn't mean to be negative about USB audio. The protocol needed some time (several years, I think) to be finalised.

Yes, of course there is error detection, as in most protocols. There are extra bits used to identify altered bits (checksums and cyclic redundancy checks) and (hopefully) recover some lost ones. For USB it is called CRC-5-USB. (I can point to an article on the maths involved, if needed.)

I believe that the protocol (and the cables) are OK for common (audio) use. If there are some (very rare) errors, the altered packets are recovered provided not too many consecutive bits have been altered; otherwise they are discarded. Not a big deal if the error percentage is small, but let's bear in mind that audio artefacts can easily be detected by the listener.
In other words, if you detect no obvious artefacts while listening, then either the errors are very few or they have all been recovered.
Nevertheless, if a packet is discarded, it cannot be resent. What I mean is that USB is not a "perfect" protocol. But no audio protocol is "perfect", AFAIK: S/PDIF (or AES/EBU) is also a protocol which lacks resending of lost packets.

dave2010
12-04-2010, 21:35
Vincent

A true error means bit flipping: the receiver reads a zero instead of a one, or vice versa.
I think this is rare, and a bit = 6 dB, so it is in general clearly audible (pops and clicks).

That's not quite right. If the MSB is flipped, the impact is far greater than if the LSB is corrupted. I'd expect errors to be quite rare, and often in bits somewhere between the MSB and the LSB.
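
A quick worked example of that point for a 16-bit sample, treating the error as the size of the step caused by the flipped bit:

import math

def flip_error_db(bit_index, bits=16):
    """Level of the step caused by flipping one bit, relative to digital full scale."""
    step = 1 << bit_index
    full_scale = 1 << (bits - 1)
    return 20 * math.log10(step / full_scale)

print(f"LSB (bit 0)  flip: {flip_error_db(0):7.1f} dBFS")   # about -90 dB: inaudible
print(f"mid (bit 8)  flip: {flip_error_db(8):7.1f} dBFS")   # about -42 dB: perhaps a faint tick
print(f"MSB (bit 15) flip: {flip_error_db(15):7.1f} dBFS")  # 0 dBFS: a full-scale click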

Dimitri

CRCs are usually used to detect errors, not correct them. It's not clear to me how USB handles the situation when an error is detected. If there's no resend, then the usual options are to do some form of interpolation, or simply to repeat the previous packet.

Themis
13-04-2010, 14:50
CRCs are usually used to detect errors, not correct them. It's not clear to me how USB handles the situation when an error is detected. If there's no resend, then the usual options are to do some form of interpolation, or simply to repeat the previous packet.
http://www.cs.nmsu.edu/~pfeiffer/classes/573/notes/ecc.html
I hope it's not too complicated... :o

Otherwise, there is some (simpler) info here : http://en.wikipedia.org/wiki/Error_detection_and_correction

USB uses a CRC-5 type.
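
To illustrate the detect-but-not-correct point in code (using zlib.crc32 as a stand-in for USB's much shorter checksums; as far as I know USB uses a 5-bit CRC on token packets and a 16-bit CRC on data payloads):

import zlib

packet = bytes(range(32))                 # pretend this is one audio packet
good_crc = zlib.crc32(packet)

# Flip a single bit somewhere in the payload.
corrupted = bytearray(packet)
corrupted[7] ^= 0b0001_0000
bad_crc = zlib.crc32(bytes(corrupted))

print(good_crc == bad_crc)                # False: the error is detected...
# ...but nothing in the CRC says which bit flipped, so without a resend the
# receiver can only discard the packet, repeat the previous one, or interpolate.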

dave2010
17-04-2010, 02:00
http://www.cs.nmsu.edu/~pfeiffer/classes/573/notes/ecc.html
I hope it's not too complicated... :o

Otherwise, there is some (simpler) info here : http://en.wikipedia.org/wiki/Error_detection_and_correction

USB uses a CRC-5 type.

I think I'm sticking with what I wrote earlier. If you want to do forward error correction then a code such as a Reed-Solomon code is needed. I don't think the CRC-5 coding is going to work for correction, and I'm unaware of USB using any really complex codes to support FEC. It would be possible to use a k-out-of-n block erasure code, with the CRC used on each block to decide whether to accept it or not, but isn't that too complex for this particular application? If the error rate is not excessively high, it should be possible to implement FEC above the USB level, but that would probably require device-specific code. My contention is that USB does not do this.
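
As a toy illustration of that "FEC above the USB level" idea (real systems would use something stronger such as Reed-Solomon, and nothing suggests any USB DAC actually does this): send one XOR parity block with every group of data blocks, so any single block that goes missing or fails its CRC can be rebuilt without a resend.

from functools import reduce

def xor_blocks(a, b):
    """Byte-wise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(blocks):
    """Append one parity block that is the XOR of all data blocks."""
    return list(blocks) + [reduce(xor_blocks, blocks)]

def recover(group, missing_index):
    """Rebuild the single missing block by XOR-ing all the surviving blocks."""
    survivors = [b for i, b in enumerate(group) if i != missing_index]
    return reduce(xor_blocks, survivors)

data = [bytes([i] * 8) for i in range(1, 5)]      # four dummy audio blocks
group = add_parity(data)

lost = 2                                          # pretend block 2 failed its CRC
assert recover(group, lost) == data[lost]
print("lost block reconstructed without a resend")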