reallystate
Registered
Thread Starter
- Joined
- Sep 23, 2025
- Posts
- 8
With identical full-scale (FS) sine Vrms values entered in the generator and the RTA, a stepped measurement of the source does not differ significantly when "Generator" is changed to "Input" in the lower right corner of the Distortion tab.
However, this is not the case with some IMD tests. The SMPTE, DIN, and AES17 MD plots (all of which use a 4:1 tone ratio) show the sweep reaching the clipping limit (-1.7 dBFS) when "Generator" is selected, but after switching to "Input", REW displays the test as if it started at a lower signal level and ended at -14 dBFS. The AES17 DFD plot shifts 3.2 dB to the left after switching, the TDFD plots (Bass, akl, Phonto) shift 3 dB to the right, and switching has no effect at all on the CCIF plot.
At the same time, "Input" shows a noise floor value that is closer to the results of other tests.
Any idea what the reason could be?
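For what it's worth, the specific numbers above are consistent with a difference in level reference rather than an actual level change. A 4:1 two-tone whose summed peak just reaches full scale has a combined RMS of about -1.7 dBFS (the clipping limit quoted above), while the smaller of the two tones on its own sits at about -14 dBFS. This is only arithmetic on an assumed peak-normalized two-tone, not confirmed REW behavior; a minimal sketch:

```python
import math

def rms_dbfs(amplitudes):
    """RMS level (dBFS) of a sum of sines with the given peak
    amplitudes, referenced to a full-scale sine (RMS = 1/sqrt(2))."""
    rms = math.sqrt(sum(a * a / 2 for a in amplitudes))
    return 20 * math.log10(rms / (1 / math.sqrt(2)))

# 4:1 two-tone (SMPTE/DIN/AES17 MD style), peak-normalized: 0.8 + 0.2 = 1.0
print(round(rms_dbfs([0.8, 0.2]), 1))  # -1.7  (combined RMS)
print(round(rms_dbfs([0.2]), 1))       # -14.0 (smaller tone alone)

# 1:1 two-tone (CCIF style), peak-normalized: 0.5 + 0.5 = 1.0
print(round(rms_dbfs([0.5, 0.5]), 1))  # -3.0
```

So one possible reading is that "Generator" and "Input" are referencing different quantities of the same stimulus (e.g. combined level vs. a single component), which would also explain why the shift depends on the tone ratio of each test.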