IMD Measurements: "Generator" vs "Input" Level Discrepancy

reallystate · Registered · Thread Starter · Joined Sep 23, 2025 · Posts: 13
With identical FS sine Vrms values in the generator and RTA, the stepped measurement of the source does not differ significantly if you change "Generator" to "Input" in the lower right corner of the Distortion tab.

However, this is not the case with some IMD tests. The SMPTE, DIN, and AES17 MD plots (all of which use a 4:1 signal ratio) show that the test reached the clipping limit (-1.7 dBFS) when "Generator" is selected. But if you switch to "Input," REW displays the test as if it started at a lower signal level and ended at -14 dBFS. The AES17 DFD shifts to the left by 3.2 dB when switched, and the TDFD (Bass, AKL, Phono) shifts to the right by 3 dB. Meanwhile, switching has no effect at all on the CCIF.

At the same time, "Input" shows a noise floor value that is closer to the results of other tests.

Any idea what the reason could be?
 
These are measurements of the headphone DAC/Amp
SMPTE Generator.jpg
SMPTE Input.jpg
DIN Generator.jpg
DIN Input.jpg
AES17 DFD Generator.jpg
AES17 DFD Input.jpg
TDFD Bass Generator.jpg
TDFD Bass Input.jpg
THD vs level Noise Floor.jpg
 
Which REW version? The V5.40 beta builds got this fix in May:

Fixed: When stepped IMD measurements are plotted against the input level (rather than the generator level) the IMD reference tone level was used for the X axis instead of the total input rms level
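The size of the shifts described above is consistent with that fix. A quick Python check of the arithmetic (the tone ratios come from the test definitions mentioned in this thread; the comparison to REW's internals is my inference, not its actual code):

```python
import math

def db(x):
    """Convert a linear amplitude ratio to dB."""
    return 20 * math.log10(x)

# SMPTE/DIN stimulus: f1 at amplitude 1.0, f2 at 1/4 (4:1 ratio, f2 ~12 dB below f1).
f1, f2 = 1.0, 0.25
total_rms = math.sqrt(f1**2 + f2**2) / math.sqrt(2)  # rms of the two-tone sum
ref_tone_rms = f2 / math.sqrt(2)                     # rms of the reference tone (f2)

shift_smpte = db(total_rms / ref_tone_rms)
print(f"SMPTE/DIN: reference tone sits {shift_smpte:.1f} dB below total rms")
# ~12.3 dB, matching the gap between the -1.7 dBFS and -14 dBFS endpoints

# AES17 DFD: two equal-amplitude tones; a single tone vs. the total rms
f1, f2 = 1.0, 1.0
shift_dfd = db(math.sqrt(f1**2 + f2**2) / f1)
print(f"AES17 DFD: single tone sits {shift_dfd:.1f} dB below total rms")
# ~3.0 dB, close to the observed 3.2 dB shift
```

So plotting against the reference tone level instead of the total input rms moves the 4:1 tests by about 12.3 dB and the equal-tone tests by about 3 dB, which is what the plots showed before the fix.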
 
I'm using version 5.31.3 on Linux.

When I measure IMD on Windows (using 5.40 beta 100), the problem disappears. Thank you. In the new version, both modes show a result identical to the Generator mode in the old version. This means that when measuring SMPTE, the TD+N and even the Noise Floor level are noticeably higher than when performing other tests (THD, CCIF, Multitone). Is this how it's supposed to be?

When I open an old measurement in the beta version, the problem persists. It seems I'll have to redo the old measurements.

I don't use the beta version on Linux because it's buggy. No matter which specific impulse response exported from REW I open in the program, the phase looks identical:
Phase wrapped.jpg
Phase unwrapped.jpg

Also, in this Linux version, if you open any impulse response, its frequency response should be at a level just below 0 dBFS, or 117 dB SPL, exactly as the old version shows. But beta 101 on Linux shows it at around -98 dBFS or 19 dB SPL.

These problems do not exist in the beta version for Windows.
 
I don't use the beta version on Linux because it's buggy.
The code is the same on all platforms, so differences are likely due to local settings, such as whether you have chosen to display phase wrapped or unwrapped, whether the impulse response is plotted normalised, and the input device SPL calibration.
 
When I click the info button, information about the previous file appears every time. To display information about the current file, I need to scroll to it using the slider on the side.
 
The Info panel shows information for the current individual measurement selection, the one with the blue bar on an overlay graph like All SPL. You can change that by double-clicking on the measurement you want selected, using Alt+up arrow/Alt+down arrow to step through the list, or selecting one of the first 10 measurements with Alt+number.
 
I think what you see in the video above is a bug. The information window shouldn't change depending on the slider's position.
 
Is it normal that 1:4 IMD measurements (SMPTE, DIN) show a much higher noise level (+12.5 dB) compared to other measurements (THD, Multitone, CCIF, etc.) of the same device?
No, the noise floor measurements are very similar for me using the dBr axis and measuring using the current V5.40 build, using either generator or input as reference.
 
This looks like a bug. Noise (part of THD+N and TD+N) is much higher on SMPTE. The problem repeats when measuring any device under any load.
 
There are two things to consider.

Firstly, the reference for IMD distortion figures. That varies depending on the test signal, for DIN and SMPTE IMD it is the level of f2, which is 12 dB lower than f1 and slightly lower again than the total stimulus rms level. The "fundamental" in the distortion graph for those is the level of f2, so the normalised (dBr) view is showing levels relative to that, otherwise the displayed IMD figures would be wrong. For CCIF the reference level is f1+f2+IMD, which is pretty much f1+f2 or the total stimulus rms. Consequently the noise floor is being normalised to different levels in each case.
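The effect of those different references can be made concrete with a small sketch. Assuming a hypothetical -110 dBFS noise floor and the -1.7 dBFS stimulus level from the plots earlier in the thread (both numbers are illustrative), the same absolute noise reads very differently in dBr depending on the reference:

```python
noise_dbfs = -110.0  # hypothetical absolute noise floor, identical in every test

total_rms_dbfs = -1.7             # stimulus level taken from the plots
# SMPTE/DIN: reference is f2, 12 dB below f1 and ~12.3 dB below total rms
ref_smpte = total_rms_dbfs - 12.3
# CCIF-style: reference is f1+f2, essentially the total stimulus rms
ref_ccif = total_rms_dbfs

print(f"noise in dBr (SMPTE ref): {noise_dbfs - ref_smpte:.1f}")  # -96.0
print(f"noise in dBr (CCIF ref):  {noise_dbfs - ref_ccif:.1f}")   # -108.3
# Same absolute noise, but the SMPTE dBr figure reads ~12.3 dB higher.
```

That gap is in the same ballpark as the +12.5 dB difference reported above.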

Secondly, what noise floor figure should be used? REW captures the spectrum of the noise. For stepped level THD measurements it uses the maximum noise level within an octave span centred on the stimulus frequency. For IMD it uses the noise around f1, for MT it uses the noise around 1 kHz. However, since f1 varies between IMD measurement types, different parts of the noise spectrum are being used in each case. I'm open to suggestions for alternative choices.
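A toy illustration of why the choice of span matters (the function name and data here are illustrative, not REW's implementation): SMPTE's f1 is 60 Hz, so its noise window can pick up mains hum that a 1 kHz window used for multitone never sees.

```python
import numpy as np

def noise_in_octave(freqs, noise_spectrum_db, centre_hz):
    """Maximum noise level within a one-octave span centred on centre_hz,
    i.e. from centre/sqrt(2) to centre*sqrt(2)."""
    lo, hi = centre_hz / np.sqrt(2), centre_hz * np.sqrt(2)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(noise_spectrum_db[mask].max())

# Toy spectrum: flat -110 dB floor with a hum component at 60 Hz
freqs = np.arange(10, 20000, 5.0)
spectrum = np.full_like(freqs, -110.0)
spectrum[freqs == 60.0] = -85.0

print(noise_in_octave(freqs, spectrum, 60))    # window around 60 Hz catches the hum
print(noise_in_octave(freqs, spectrum, 1000))  # window around 1 kHz sees the flat floor
```

Under that assumption, two tests of the same device can legitimately report noise floors that differ by tens of dB purely because of where their window sits.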
 