Hi all, first post here. I have a setup with a 100 ohm 0.1% resistor, and after calibration I go to measure my test lead resistance. I consistently get a flat-line value of 3.2 ohms, which doesn't make much sense. I really want to use REW for calculating T/S parameters. Does anyone have any insight into this?