Unfortunately not as of yet no...
Bests, Mark
"We must believe in free will. We have no choice" Isaac Bashevis Singer
Another reason that digital cables can sound different is signal reflections across the SPDIF interface: impedance mismatches cause reflected copies of the signal to 'bounce' back and forth along the cable and can make DACs think that timing edges are arriving at different times. SPDIF cables should really be a minimum of 1.5m in length, and preferably 5m, to minimise the negative effects of signal reflections.
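The timing argument above can be sanity-checked with a rough back-of-envelope calculation. This sketch assumes a typical coax velocity factor of 0.66 and S/PDIF at 44.1 kHz (both assumptions, not figures from the post), and compares the round-trip time of a reflection against the shortest cell in the biphase-mark stream:

```python
# Back-of-envelope: when does a reflection arrive relative to an S/PDIF edge?
C = 299_792_458        # speed of light, m/s
VF = 0.66              # assumed velocity factor of a typical 75-ohm coax

def round_trip_ns(length_m):
    """Time for a reflection to travel to the far end and back, in ns."""
    return 2 * length_m / (VF * C) * 1e9

# S/PDIF at 44.1 kHz: 64 bits per frame, and biphase-mark coding doubles
# the edge rate, so the shortest cell is 1 / (44_100 * 128) seconds.
ui_ns = 1e9 / (44_100 * 128)

for length in (0.5, 1.5, 5.0):
    print(f"{length} m cable: reflection returns after "
          f"{round_trip_ns(length):.1f} ns (shortest cell = {ui_ns:.0f} ns)")
```

On these assumed numbers a 1.5m cable puts the first reflection roughly 15ns after the edge, and a 5m cable roughly 50ns after it, which is the usual rationale given for the longer-is-safer recommendation.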
If you want the ultimate SPDIF connection then you need 75 ohm BNC sockets and plugs. This is because it is physically impossible for RCA plugs to present a true 75 ohm impedance, which means reflections.
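The size of the reflection a mismatched connector causes can be quantified with the standard transmission-line reflection coefficient. A minimal sketch, using 50 and 37 ohms as illustrative RCA figures (values quoted elsewhere in this thread, not measurements):

```python
# Voltage reflection coefficient at an impedance discontinuity:
#   gamma = (Z_load - Z0) / (Z_load + Z0)
def reflection_coefficient(z_load, z0=75.0):
    return (z_load - z0) / (z_load + z0)

# 75 ohm = ideal BNC; 50 and 37 ohm = figures quoted for RCA plugs
for z in (75.0, 50.0, 37.0):
    g = reflection_coefficient(z)
    print(f"{z:.0f} ohm connector in a 75 ohm system: "
          f"{abs(g) * 100:.0f}% of the incident voltage is reflected")
```

A perfect 75 ohm connector reflects nothing, while a 50 ohm RCA reflects about 20% of the incident voltage at the discontinuity.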
Just my tuppenceworth, but my own opinion is that many differences between digital cables are NOT the result of reflections and jitter in the plugs. Many RCAs are said to be 50 Ohms, but in truth are between 37 and 50 Ohms. In the scheme of things, a slight deviation over a centimetre or so will have little if ANY bearing on jitter or reflections. Why? Because the buffers and clocks in the DAC are dealing with bits of data, and for every "bit" from the transport there are 8 other identical bits (loads of redundancy), so that the converted signal from the DAC is an almost perfectly reconstituted waveform.
As Mark has already pointed out, capacitance can cause problems, and with analogue this shows up in two areas: phase shift and (with a low receiving-amp impedance) HF roll-off. The phase differences in most signal cables are within 1 radian and therefore only affect the harmonic series high up in the register, with almost inaudible roll-off or timing changes, all of which pale into insignificance beside the phase shifts subsequently produced by your loudspeaker/room interaction, so it's never worth losing sleep over.
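The point about cable capacitance can be illustrated with a simple single-pole RC model: the source's output impedance driving the cable's capacitance. The figures here (100 ohm source, 100 pF/m, 1m run) are assumed typical values, not numbers from the post:

```python
import math

# Single-pole low-pass model: source output impedance driving cable capacitance.
R = 100.0            # ohms (assumed source output impedance)
C_PER_M = 100e-12    # farads per metre of cable (assumed)

def phase_rad(f_hz, length_m):
    """Phase lag of the RC low-pass at frequency f, in radians."""
    return math.atan(2 * math.pi * f_hz * R * C_PER_M * length_m)

def rolloff_db(f_hz, length_m):
    """Attenuation of the RC low-pass at frequency f, in dB (negative)."""
    w = 2 * math.pi * f_hz * R * C_PER_M * length_m
    return -10 * math.log10(1 + w * w)

print(f"Phase lag at 20 kHz over 1 m: {phase_rad(20_000, 1):.5f} rad")
print(f"Roll-off at 20 kHz over 1 m:  {rolloff_db(20_000, 1):.6f} dB")
```

On these assumptions the phase lag at 20 kHz is around a thousandth of a radian and the roll-off is microscopic, which is consistent with the claim that interconnect capacitance is well inside the "never worth losing sleep over" zone.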
For digital signal transfer, cable impedance is critical as the cable constitutes the majority of the signal path. Cheap, often poorly made cables (mostly from China, but not exclusively so) may not hold the spacing tolerance between signal conductor and shield needed to guarantee exactly 75 Ohms. Half a millimetre out of tolerance will throw the impedance off enough that much of the redundancy is used up, and the DAC may then have to interpolate missing data.

The 1.5m minimum does not apply to digital signals and is something of an urban myth. You can make them any length provided the cable is good quality and correctly specified. The increase in capacitance with length does not have the same effect on HF roll-off, as it is NOT an analogue signal. Many non-hifi 75 Ohm digital transfer cables in industry are tens or even hundreds of metres long; the longer they are, the more critical the cable quality becomes, for the reasons already alluded to.
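The geometry-tolerance point can be sketched with the textbook formula for the characteristic impedance of an ideal coax, Z0 = (138/sqrt(er)) * log10(D/d). The dielectric constant (2.25, solid polyethylene) and the 1 mm inner conductor are assumed for illustration; real cables and connectors are more complicated:

```python
import math

def coax_z0(D_mm, d_mm, er=2.25):
    """Characteristic impedance of an ideal coax; er = 2.25 assumes solid PE."""
    return 138.0 / math.sqrt(er) * math.log10(D_mm / d_mm)

d = 1.0  # inner conductor diameter in mm (assumed)
# Solve the formula backwards for the shield diameter giving exactly 75 ohms.
D = d * 10 ** (75.0 * math.sqrt(2.25) / 138.0)

print(f"Nominal geometry: D = {D:.2f} mm -> {coax_z0(D, d):.1f} ohms")
print(f"0.5 mm undersize: D = {D - 0.5:.2f} mm -> {coax_z0(D - 0.5, d):.1f} ohms")
```

On this idealised model a half-millimetre error in the shield diameter pulls the impedance a few ohms off 75, which gives a feel for why spacing tolerances matter, even if the audible consequences are the contested part of this thread.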