| Re .1
Note 1067 doesn't accurately explain the "advantages" or workings
of 18 bit D/A converters at all. The problem that they are supposed
to address is the increase in noise levels as the signal level drops.
One unique aspect of digital recording is that the distortion level
is highest when the signal is lowest. If, for example, the least
significant bit represents a signal of 1 volt, then every level
between .5 volt and 1.5 volt gets recorded as that same 1 volt step.
The error can therefore be as large as half a volt, a 50% error on a
1 volt signal, while the same half volt is negligible next to a loud
signal. As more bits are used to define the signal, the steps get
smaller and the actual level can be approximated more accurately.
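To put a rough number on it, here is a little Python sketch of my
own (it assumes a 1 volt full scale and an ideal 16 bit quantizer,
not any particular player's circuit) showing that the worst-case
rounding error stays the same size while the signal shrinks:

    # Rough illustration of why quantization hurts quiet signals most.
    # A 16 bit quantizer rounds every sample to the nearest step; the
    # error is up to half a step, which is tiny next to a loud signal
    # but large next to a quiet one.
    import math

    FULL_SCALE = 1.0                 # assumed full scale of 1 volt
    STEP = FULL_SCALE / (2 ** 15)    # one step of a signed 16 bit quantizer

    def quantize(volts):
        """Round a voltage to the nearest 16 bit step."""
        return round(volts / STEP) * STEP

    for amplitude in (1.0, 0.01, 0.0001):      # loud, quiet, very quiet
        worst = 0.0
        for i in range(1000):                  # one cycle of a sine wave
            v = amplitude * math.sin(2 * math.pi * i / 1000)
            worst = max(worst, abs(quantize(v) - v))
        print(f"amplitude {amplitude:g} V: worst error {worst:.2e} V "
              f"({100 * worst / amplitude:.3f}% of the signal)")

At full scale the error works out to a few thousandths of a percent;
at the quietest level shown, the very same error is on the order of
ten percent of the signal.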
In the "quasi" 18 bit machines, when the signal is quiet enough that
the two most significant bits are unused, the sample is shifted up
two places and a random "dither" value is placed in the two least
significant bits that the shift opens up. The dither obviously isn't
the information that was originally there (that was never recorded),
but the level of the distortion is still reduced because of the
random nature of the "dither".
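The digital half of that might look something like the Python
fragment below. This is only my own sketch; the 14 bit threshold
and the way the dither value is generated are assumptions, not the
actual circuitry.

    import random

    def shift_and_dither(sample):
        """Boost a quiet 16 bit sample by two bit positions, with dither.

        'sample' must fit in 14 bits (the top two of the 16 unused).
        The shift multiplies it by 4, and a random 0..3 value fills
        the two least significant bits the shift leaves empty.
        """
        assert -2 ** 13 <= sample < 2 ** 13
        return (sample << 2) + random.randint(0, 3)

    print(shift_and_dither(100))    # prints 400, 401, 402 or 403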
The next part might not be exact, but it describes the principle.
In order to properly match the signal levels of the loud and quiet
passages, the quiet passages are dynamically reduced in level after
they leave the D/A converter, by the same factor the shift raised
them, which returns them to the proper level. No reduction is
applied to the louder passages, since they were not increased by the
shifting in the beginning.
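Modelled very crudely in Python (an ideal converter, a fixed factor
of 4, and function names that are entirely my own invention), the
round trip looks like this:

    import random

    FULL_SCALE = 1.0                 # assume 1 volt full scale
    LSB = FULL_SCALE / (2 ** 15)     # one step of the 16 bit converter

    def dac(code):
        """Ideal D/A: digital code in, proportional voltage out."""
        return code * LSB

    def play_quiet(sample):
        """Quiet path: shifted and dithered in, padded down on the way out."""
        boosted = (sample << 2) + random.randint(0, 3)   # four times too big
        return dac(boosted) / 4                          # analog pad restores the level

    def play_loud(sample):
        """Normal path: no shift going in, so no reduction coming out."""
        return dac(sample)

    quiet_sample = 1234               # small enough that the top two bits are unused
    print(play_quiet(quiet_sample))   # roughly 0.0377 volts
    print(play_loud(quiet_sample))    # roughly 0.0377 volts, the same level

The quiet sample lands at essentially the same voltage either way,
which is the level matching described above; anything the converter
itself adds on the quiet path (which this toy model leaves out) gets
knocked down by that same factor of 4, and that is where the extra
resolution comes from.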
I hope that this has been clear enough to explain the principles
involved in getting 18 bits from a D/A when there are only 16 bits
on the disc.
|
There is a lengthy discussion of this technique in a note covering
Yamaha CD players in the AUDIO notesfile. Whenever you have a
technical question about the hardware, you might look there first.
Most every topic you can imagine has been covered in detail by
experts.
P.S. The 18 bit technique has nothing to do with oversampling.
Also, I'm sure most folks know that there are only 16 bits available
on the CD itself. The 18 bits are built by the shifting mentioned
earlier.
Regards, Jim
|