Determining the signal-to-noise, signal-to-distortion, or signal-to-noise-and-distortion ratio of the amplifiers and/or mixers in future 5G transceiver channels is one of the big challenges for 5G-related measurements. Accurate measurements can only be obtained if the components are operated under realistic conditions (realistic signals, power supplies, and impedance levels).
These realistic measurement conditions require complex signal generators and digital acquisition channels to be used instead of the classical spectrum analyzers and noise-and-gain analyzers used in state-of-the-art measurements. The drawback of these more advanced devices is that their dynamic range is considerably lower, resulting in a dominant quantization noise contribution that masks the noise and distortion contributions one needs to measure. This problem can be partly avoided by switching the gain of the measurement channel, but this in turn results in a variable quantization noise level that jeopardizes proper measurement of the signal-to-disturbance ratio during a power or impedance sweep.
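The gain-dependence of the quantization noise floor can be made concrete with a small simulation. The sketch below (illustrative only; the 12-bit resolution, signal level, and range settings are assumptions, not specifications of the actual measurement hardware) quantizes a weak test tone with a uniform mid-rise quantizer at two range settings and compares the measured error power against the classical Δ²/12 model: re-ranging the channel shifts the quantization noise floor by the square of the gain change.

```python
import numpy as np

def quantize(x, full_scale, n_bits):
    """Uniform mid-rise quantizer: clip to the range, round to the grid."""
    delta = 2 * full_scale / 2 ** n_bits          # quantization step
    xq = np.clip(x, -full_scale, full_scale - delta)
    return delta * np.floor(xq / delta) + delta / 2

n = 100_000
# Weak test tone, far below full scale (as seen after a lossy component chain)
x = 0.05 * np.sin(2 * np.pi * 0.01 * np.arange(n))

for fs in (1.0, 0.1):                             # two range (gain) settings
    delta = 2 * fs / 2 ** 12                      # assumed 12-bit channel
    err = quantize(x, fs, 12) - x
    print(f"range ±{fs:>4}: error power {np.mean(err**2):.2e}, "
          f"Δ²/12 model {delta ** 2 / 12:.2e}")
```

Switching the range from ±1.0 to ±0.1 (a 20 dB gain increase) lowers the quantization noise power by a factor of 100, which is exactly the variable-noise-floor effect that complicates a power or impedance sweep.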
You will tackle this challenging problem using advanced experiment design and noise modeling techniques that allow the quantization noise and distortion contributions to be separated. This new type of ‘calibration’ is expected to improve the measurement resolution of modern vector signal analyzers by an order of magnitude and will open new opportunities for these advanced measurement devices.
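The project does not fix the separation technique, but one standard ingredient of such experiment design is periodic excitation: over repeated periods of the same signal, distortion repeats identically while noise is fresh each period, so the sample mean isolates signal plus distortion and the sample variance estimates the noise. A minimal sketch of that idea (the cubic nonlinearity and noise levels are hypothetical placeholders for a device under test, not the project's actual method):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 1024, 64                          # samples per period, number of periods
u = np.sin(2 * np.pi * 5 * np.arange(N) / N)   # one period of the excitation

# Hypothetical device under test: weak cubic nonlinearity (distortion source)
y_clean = u + 0.01 * u ** 3

# P repeated periods, each corrupted by fresh additive noise (power 4e-4)
Y = y_clean + 0.02 * rng.standard_normal((P, N))

y_mean = Y.mean(axis=0)                  # noise averages down; distortion survives
noise_var = Y.var(axis=0, ddof=1).mean() # per-sample noise variance estimate

distortion = y_mean - u                  # residual after removing the known input
print(f"estimated noise variance: {noise_var:.2e}")
print(f"estimated distortion power: {np.mean(distortion ** 2):.2e}")
```

Averaging over P periods suppresses the noise in the mean by a factor P while leaving the deterministic distortion intact, so the two disturbance mechanisms can be quantified separately rather than lumped into a single ratio.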
Theory: 20%, Simulation: 30%, Measurements: 50%