TDRing Digital Cables - Tributaries SCA-D

House de Kris

Loud-n-Deep
In another thread, comments were made about 'better' digital interconnects being closer to the ideal 75ohm characteristic impedance and that some 'audiophile' cables are not necessarily all that close. I kinda offered to measure cables for people, then measured a video cable that came packed in with a VCR. It measured 50ohms, and that got me real curious. So, at lunch today, I ran over to the local audio shop and picked up a couple "real" digital cables to see how close to ideal they are.

The Tributaries SCA-D (1 meter) package states in bold letters right on the front "True 75ohm Tip-to-Tip performance" "Silver-plated signal conductor" "Teflon dielectric" & "triple shielded." My price was $125, I don't know if this is typical street price or not.

TDR picture is below. The cable has a max value of 66ohms and a min of 63ohms. I think it would be a stretch to honestly call this a 75ohm cable. At least it is relatively consistent through the length of the cable. The minimum appears to be due to physical variance in the construction of the cable. Also, skin effect appears to be minimal.
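If you want to put a number on how bad 66ohms is in a 75ohm system, the reflection arithmetic is simple enough to sketch in a few lines of Python (the 75ohm value is the nominal SPDIF impedance, not anything measured here):

    import math

    Z0 = 75.0   # nominal SPDIF system impedance, ohms
    Zc = 66.0   # measured cable impedance, ohms

    # Reflection coefficient at the cable/system boundary
    rho = (Zc - Z0) / (Zc + Z0)
    return_loss_db = -20 * math.log10(abs(rho))

    print(f"rho = {rho:.3f}")                        # about -0.064
    print(f"return loss = {return_loss_db:.1f} dB")  # about 24 dB

Not catastrophic, but measurably short of "True 75ohm Tip-to-Tip."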

I'll leave it at that, since this thread is intended to only deal with the characteristic impedance of interconnects rather than sound.
 

Attachments

  • TributariesSCA-D.GIF
Measure other cables?

Hi House de Kris,

Would be curious if you could measure a couple cables from www.bluejeanscable.com ... an RCA-RCA with Belden 1694a as well as a BNC-BNC version of the same.

They use only Canare 75-ohm connectors and the 1694a is supposed to be great stuff. These Canares are the ones made to fit 1694a.

If you had the time and desire, I'd love to see you report on their L5CFB cable with Canare connectors, too.

Can you measure other characteristics, too?

Thanks,

Dave
 
I've got the time and desire to test more cables, but I actually don't feel like shopping for and buying more cables. Feel like sending me any samples?

As far as other tests, yes I think TDT (transmission) would be better for what we are interested in, rather than TDR (reflection). I will modify my test setup to do this measurement as well.
 
My one problem with measuring the characteristic impedance of a cable like that is that to get an accurate measurement, I believe you really need to use a significantly longer sample.

I have done some measurements of 50 Ohm coax in the past, and I discovered that the longer the length of cable I tested, the closer the measured impedance came to the stated characteristic impedance of the cable.
 
goldear said:
My one problem with measuring the characteristic impedance of a cable like that is that to get an accurate measurement, I believe you really need to use a significantly longer sample.

I have done some measurements of 50 Ohm coax in the past, and I discovered that the longer the length of cable I tested, the closer the measured impedance came to the stated characteristic impedance of the cable.
I'm curious, as I've read this elsewhere. But I've never seen or heard anyone say what a "correct" length is. So, just what length is "significantly longer" and correct or better? 6 feet? 20 feet? I believe House de Kris is testing cables that are about 1 m in length.
 
I've got a couple guys in my local audio group that SWEAR by having their Digital Coax at LEAST 8 ft long.
 
goldear said:
My one problem with measuring the characteristic impedance of a cable like that is that to get an accurate measurement, I believe you really need to use a significantly longer sample.

I have done some measurements of 50 Ohm coax in the past, and I discovered that the longer the length of cable I tested, the closer the measured impedance came to the stated characteristic impedance of the cable.

I am using the TDR method of measuring these cables. Your method (of requiring long lengths) sounds suspiciously like the SWR method. The TDR method uses a step generator with a fast edge (about 30ps risetime) and looks at all the reflected energy as seen from the sending source. As such, the longer the line, the less accuracy you get. For example, if you look at the picture in the MIT thread, it starts at 75ohms and rises to what appears to be 80ohms at the end of the line - in just 1 meter. In reality, the characteristic impedance doesn't really rise; the series loss from skin effect accumulates along the line, adding to the impedance the TDR appears to see and making it harder to read the true value.

The SWR method gives just an aggregate value for the whole line. With the SWR method, it is impossible to measure specific impedances at specific locations in the line. Thus, the SWR method cannot measure the impedance consistency along the line, or reveal whether there is any small physical damage to the cable.

If my assumption of your test method is incorrect and you are using TDR, then our experiences are not the same. With TDR, the longer the line, the harder it is to measure accurately.
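For anyone following along, the conversion the TDR does from reflected voltage to impedance is just a couple lines. The trace values below are invented placeholders for illustration, not data from my instrument:

    Z_REF = 50.0  # reference impedance of the TDR step source, ohms

    def z_from_rho(rho):
        """Impedance at one point, from the reflected/incident voltage ratio."""
        return Z_REF * (1 + rho) / (1 - rho)

    # Hypothetical rho samples along a cable (placeholders, not real data)
    trace = [0.00, 0.12, 0.14, 0.13, 0.12]
    for i, rho in enumerate(trace):
        print(f"sample {i}: {z_from_rho(rho):.1f} ohms")

This point-by-point view is exactly what SWR can't give you: SWR collapses the whole line into one aggregate number.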
 
DKak said:
I'm curious, as I've read this elsewhere. But I've never seen or heard anyone say what a "correct" length is. So, just what length is "significantly longer" and correct or better? 6 feet? 20 feet? I believe House de Kris is testing cables that are about 1 m in length.
To add on to my own thought here, my (naive?) assumption is that less wire is better. So in my mind, if you have two wires, both near-perfect 75-ohm transmission lines of sufficient bandwidth, the shorter cable should be better.

Please enlighten me if I am incorrect.
 
Great thread, and thanks for the efforts, Kris!

I'm still confused though. What, exactly, is the 75 ohm standard, and why does it matter that the cable conform to it closely?

Is the impedance supposed to be 75 ohms only after a certain length? Or is each cable supposed to be 75 ohms, no matter what length it is?
 
DKak said:
To add on to my own thought here, my (naive?) assumption is that less wire is better. So in my mind, if you have two wires, both near-perfect 75-ohm transmission lines of sufficient bandwidth, the shorter cable should be better.

Please enlighten me if I am incorrect.

In my opinion, you are correct. When dealing with a controlled impedance network (which we are), and if we assume it is properly designed and implemented, all energy from the source should be 100% absorbed by the destination. This would result in no reflections. This would remove all jitter due to reflections. Jitter could still exist, but it would be due to other mechanisms.

But, there is some Internet wisdom floating around out there that suggests all digital interconnects should be a minimum of 1.5 meters or somesuch (I don't recall the exact magic number). I read someone's write-up on this over at AA once and, without paying very close attention or checking the numbers myself, it appeared to be well reasoned. But it was based entirely on how to make sure reflections don't mess with edges. Remember, if done right there would be no reflections at all. In the real world though, it is purdy durn hard to get a 75ohm RCA connector. Some claim to make them, but I've never touched one. So it is a fair assumption to believe that it is indeed impossible to properly implement a quality digital interface. Still, I would think it would be difficult to predict the magic length to use for all cables, since propagation through the cable is greatly influenced by the dielectric used.
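If you want to play with the magic-length idea yourself, the round-trip timing is easy to sketch. The velocity factors below are typical handbook values and the lengths are just for illustration, so don't treat this as gospel:

    C = 3.0e8  # speed of light, m/s

    def round_trip_ns(length_m, vf):
        """Time for an edge to reach the far end and reflect back to the source."""
        return 2 * length_m / (vf * C) * 1e9

    # vf values are typical handbook numbers (assumptions, not measurements)
    for name, vf in [("solid teflon", 0.69), ("foam dielectric", 0.83)]:
        for length in (1.0, 1.5):
            print(f"{name}, {length} m: {round_trip_ns(length, vf):.1f} ns round trip")

Whether 10-15ns is "late enough" to miss the receiver's decision point depends entirely on the gear, which is exactly why one magic length for all cables and dielectrics seems dubious to me.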

After all that, I'm left with the standard audio axiom: use whatever works best for you.
 
Negotiableterms said:
Great thread, and thanks for the efforts, Kris!

I'm still confused though. What, exactly, is the 75 ohm standard, and why does it matter that the cable conform to it closely?

Is the impedance supposed to be 75 ohms only after a certain length? Or is each cable supposed to be 75 ohms, no matter what length it is?

The 75ohm characteristic impedance of a cable or PC trace is determined entirely by the physical makeup of the transmission line. Thus, it is 75ohms no matter how long.
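To see how the physical makeup sets the number, the standard coax formula is Z0 = (138 / sqrt(er)) * log10(D/d). The dimensions below are invented round numbers, not the SCA-D's actual geometry:

    import math

    def coax_z0(D, d, er):
        """D = dielectric (shield inner) diameter, d = center conductor diameter."""
        return 138.0 / math.sqrt(er) * math.log10(D / d)

    # Hypothetical teflon cable (er ~ 2.1), two diameter ratios
    print(f"{coax_z0(6.1, 1.0, 2.1):.1f} ohms")  # about 75 ohms
    print(f"{coax_z0(4.9, 1.0, 2.1):.1f} ohms")  # about 66 ohms

Note how a fairly modest change in the diameter ratio is all it takes to land at 66 instead of 75.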

You've asked a really good question about why it even matters. So far this thread has fixated on the merits of the digital interface alone, not how it plays into the audio experience. The reason we strive to make the digital interface as good as we can is to minimize any degrading effects it may have on the digital stream that passes through it. Since the digital audio interface carries a clock in addition to data, the idea is to minimize any errors in extracting this clock. How much this benefits you depends on the receiving equipment: some gear is very sensitive to incoming clock jitter, while other gear is pretty much immune to any interface clock jitter.

Just knowing that you've got a perfect 75ohm cable is no guarantee that the system as a whole will sound beautiful.
 
I was able to spend a few spare moments in the lab today and did some more digital audio cable measurements. This time around, I used a network analyzer to get the frequency domain viewpoint. About the only useful data I could collect from this is the S21 parameter, or forward transmission. Basically, all we get from that is the bandwidth, or frequency response, of the cable. If nothing else, this just gives another comparison metric.

My choice of network analyzers was between a 5Hz-200MHz or a 50MHz-20GHz model. Since the older 200MHz one didn't save pictures of its measurements, I opted for the super-slick fast one. Initially, my main concern was, "would 50MHz be low enough to measure these cables?" Thinking about the SPDIF data rate playing CDs (2.8MHz), it might be marginal. All depends on how square you want the edges in the pulse train. A good rule-of-thumb bare minimum bandwidth requirement to get nearly passable square waves is 5x the baseband. That, though, only gets us up to a 14MHz bandwidth requirement. But hey, for comparison purposes, let's just see what is passing through cables above 50MHz.
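Spelling out that rule-of-thumb arithmetic (the 2.8MHz figure is the CD-rate SPDIF number above, and 5x is just the rule of thumb):

    baseband_mhz = 2.8   # CD-rate SPDIF baseband, as above
    rule_of_thumb = 5    # keep roughly the 5th harmonic for square-ish edges

    min_bw_mhz = rule_of_thumb * baseband_mhz
    print(f"bare minimum bandwidth: {min_bw_mhz:.0f} MHz")  # 14 MHz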

Without further ado, the measurement picture is attached. I set the mid-screen reference to -3dB to make it easy to see the -3dB point. Also, I've got the scale set to 3dB/div and put markers (1) at the -3dB point, (2) at the -10dB point, and (3) well, I don't know what I was marking with number 3. The markers were set with smoothing on to find the real -3 and -10dB points; this picture is of raw data. Note that the span is only from 50MHz to 1GHz. If we define bandwidth as the -3dB point, then this would be called a 200MHz+ cable.
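If you want to pull the -3dB point out of your own S21 sweep, the search is trivial. The arrays below are invented placeholders picked to land near this cable's reading, not the analyzer's actual data:

    # Invented (frequency MHz, S21 dB) pairs for illustration only
    freqs_mhz = [50, 100, 150, 200, 250, 300]
    s21_db    = [-0.4, -1.1, -2.0, -3.0, -4.2, -5.5]

    bw_mhz = next(f for f, s in zip(freqs_mhz, s21_db) if s <= -3.0)
    print(f"-3dB bandwidth: about {bw_mhz} MHz")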

I'll post other cables some other day, I've got to go home now. Oh yeah, I found a better TDR picture than posted earlier in this thread and have added it here. It includes TDT measurements as well.
 

Attachments

  • TributariesSCA-D_FR_raw.JPG
  • TributariesSCA-D_2.GIF
House de Kris,

great work!

I just want to correct one minor thing:

Reflections in a 1m cable come back in around 6ns, so you have to cope with frequencies around 160 MHz, rather than 14 MHz.
So the choice of the fast network analyzer was correct!

The edges of a SPDIF signal should ideally be infinitely fast in order to avoid data-induced jitter. Doing my own measurements/auditions of different signals, I found that a rise time of 1 ns is not fast enough to avoid audible problems.

I built a digital interface with risetimes of 100ps, which seems to be enough for the DAC I used. Other, less jitter-sensitive DACs might perhaps accept slower rise times.

If a risetime of 100ps is desirable and implemented in reality, the influence of reflections might be negligible, because they come way too late with a 1m cable. This also depends on the delay/capture/rise time of the receiver circuit. I don't have any idea to date whether the reflections can directly influence data-induced jitter.
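For reference, the usual single-pole rule of thumb BW ≈ 0.35 / t_rise puts numbers on these rise times (0.35 is the standard constant, nothing measured here):

    # BW ~= 0.35 / t_rise for a single-pole response
    for tr_ns in (1.0, 0.1):
        bw_mhz = 0.35 / (tr_ns * 1e-9) / 1e6
        print(f"t_rise = {tr_ns:.1f} ns -> BW ~ {bw_mhz:.0f} MHz")  # 350 MHz, 3500 MHz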

To summarize my thoughts:
The frequency range from 150 MHz to 10 GHz seems to be the most interesting!
Show me one "audiophile" cable which works fine in this frequency range!
The only cables I know to be sufficient there are semi-rigid or rigid RF cables ...
 
Thanks for the comments, Unda Maris. To say that reflections in a 1m cable are about 6ns assumes a particular dielectric. The cable in this particular thread has a solid teflon dielectric, and has a forward transmission time of 4.9ns as shown in the second TDR picture. Thus, I'd assume a round-trip reflection to be in the 10ns time frame. Teflon foam dielectrics typically have a forward propagation time of about 3.6ns for 1m, pretty close to your 6ns number for a round-trip time.
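For anyone checking the numbers, the one-way delay follows straight from the dielectric constant (er = 2.1 is the handbook value for solid teflon, not something I measured):

    import math

    C = 3.0e8    # speed of light, m/s
    er = 2.1     # handbook dielectric constant of solid teflon (assumed)
    length_m = 1.0

    delay_ns = length_m * math.sqrt(er) / C * 1e9
    print(f"one-way delay: {delay_ns:.1f} ns")  # about 4.8ns, close to the measured 4.9ns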

Good info on the driver risetime. Although it has been high on my list, I still have not gotten around to measuring the risetime on a typical CD player.

I encourage you to look through the other cable measurement threads I started here. In particular, if you desire wide bandwidth, check out the Belden 1694A or the Canare L5CFB. As discussed in the other threads, the biggest issue I ran into testing cables was the lack of a 75ohm BNC/RCA adapter. Just going through one of these would reduce the bandwidth to about 500MHz.
 
Just putting in my $0.02. I used to be the test lab supervisor of the R&D lab of the largest CATV connector manufacturer in the world. I have no affiliation with the cable manufacturers, but our results were that, hands down, the best cable was Comm/Scope with a hex crimped feed-thru connector. I can't name the connector, because I DO have an affiliation with the connector company at this time. Either the RG-59 or the RG-6 were fantastic cables, pick your amount of shielding, and if you go for more than 10 or 20 feet, use the RG-6, although for shorter runs, the RG-59 is usually a better electrical match to most off-the-shelf F-81 style connectors. The "boutique" cables couldn't hold a candle to this cable. Also, Times, Trilogy, and Belden made FINE cables which were in the same league as Comm/Scope.

You will find that the interconnect on the unit you're hooking the cable up to plays a significant role in the signal clarity.

FWIW.

rooster.
 