
Coming closer to agreement, maybe


<< Please read the sentence carefully. I never said "no degradation", I said "the noise floor of the recording is limited by the noise floor of the source" - that means the noise floor of the recording will never be lower than -48dB. In actuality, it will be slightly higher than -48dB (but not significantly), because, as you pointed out, the A/D process itself will add some additional noise. >>

Good, we agree. Now, just to restate my original point: the "additional noise" will be smaller with 24 bits than with 16 bits (assuming the converter is capable of better-than-16-bit performance). Therefore, the 24-bit version will be measurably superior regardless of the quality of the source.

In this exaggerated example, the difference between adding -48dB noise to -48dB noise (i.e. sampling an 8-bit source at 8 bits) and adding -120dB noise to -48dB noise (i.e. sampling an 8-bit source with a 24-bit converter that has a 120dB SNR) is significant.
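To put rough numbers on that, here is a quick Python sketch using the illustrative figures above (plus roughly 6 dB per bit for an ideal 16-bit floor, so about -96dB) - not measurements of any real converter. Uncorrelated noise floors add as powers, so -48dB plus -48dB comes out about 3dB worse, while -48dB plus -120dB is essentially still -48dB.

    import math

    def combine_noise_floors(a_db, b_db):
        # Add two uncorrelated noise floors (dB re full scale) as powers
        return 10 * math.log10(10 ** (a_db / 10) + 10 ** (b_db / 10))

    source = -48.0                                # 8-bit-quality source
    print(combine_noise_floors(source, -48.0))    # ~ -45.0 dB (8-bit converter: floor rises ~3 dB)
    print(combine_noise_floors(source, -96.0))    # ~ -48.0 dB (ideal 16-bit: negligible rise)
    print(combine_noise_floors(source, -120.0))   # ~ -48.0 dB (24-bit, 120 dB SNR: negligible rise)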

<< But if you look at the majority of A/D converters out there (by reading their data sheets), you will find that they don't dither. The reason is simple - it costs more to implement dithering, and most of the time it's not beneficial. >>

I'm beginning to suspect that when you say "converter", you mean "converter IC". I'm not surprised that the ICs don't dither. When I talk about converters dithering, I mean complete, finished pro-grade products, not chips.

<< And rather than dither improving accuracy, dither is actually reducing accuracy (unless noise shaping is also implemented). >>

Dither removes distortion, which is very beneficial: it decorrelates the quantization error from the signal, trading low-level distortion for a small amount of benign noise. (Once again, we're talking 16 bits. I agree you don't need synthetic dither at 24 bits, because the converter will self-dither.) It doesn't matter whether or not noise shaping is involved.
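To show what I mean by "removes distortion", here's a sketch I put together (not a description of any particular converter): quantize a low-level sine to 16 bits with and without TPDF dither and compare the spectra. Undithered, the error is correlated with the signal and shows up as discrete distortion spurs; dithered, the same error energy becomes a smooth noise floor.

    import numpy as np

    fs, n = 48000, 1 << 16
    t = np.arange(n) / fs
    tone = 10 ** (-80 / 20) * np.sin(2 * np.pi * 997 * t)   # -80 dBFS sine at 997 Hz
    lsb = 2.0 / (1 << 16)                                    # one 16-bit step, full scale +/-1

    def quantize16(x, dither=False):
        if dither:
            # TPDF dither: two uniform sources summed, +/-1 LSB peak, added before rounding
            x = x + (np.random.uniform(-0.5, 0.5, len(x)) +
                     np.random.uniform(-0.5, 0.5, len(x))) * lsb
        return np.round(x / lsb) * lsb

    def spectrum_db(x):
        return 20 * np.log10(np.abs(np.fft.rfft(x * np.hanning(len(x)))) / len(x) + 1e-20)

    plain = spectrum_db(quantize16(tone))
    dithered = spectrum_db(quantize16(tone, dither=True))
    # plain: harmonically related spurs poke up above the floor
    # dithered: just the tone plus a featureless noise floor, a few dB higher on average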

<< You seem to think decimating directly to 16-bits somehow makes the converter "less accurate" - it doesn't, in fact, it's the best way of generating the most accurate 16-bit output. dithering will *reduce* S/N ratio - it adds noise, remember? >>

Forget about dithering for a moment. If a converter has (say) 24 bits of internal precision, why not use all of it, even for 16-bit output? It can only help, and it costs essentially nothing. So I'd be surprised if converter ICs worked the way you suppose.
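The step I have in mind is trivial - here's a minimal sketch of my own (made-up names, not anybody's actual IC design): carry the internal result at full precision, then round to 16-bit words only at the very end, optionally adding TPDF dither scaled to the 16-bit step just before the rounding. The incremental cost is roughly one addition per sample.

    import numpy as np

    def output_16bit(internal, dither=True):
        # internal: high-precision samples (e.g. 24-bit or float), full scale +/-1.0
        lsb16 = 1.0 / 32768.0
        x = np.asarray(internal, dtype=np.float64)
        if dither:
            # TPDF dither at the 16-bit level, applied only at the final word-length reduction
            x = x + (np.random.uniform(-0.5, 0.5, x.shape) +
                     np.random.uniform(-0.5, 0.5, x.shape)) * lsb16
        return np.clip(np.round(x / lsb16), -32768, 32767).astype(np.int16)

Truncating the internal word early, by contrast, throws away resolution that was free to keep.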


