[ibis-macro] concern on Rx_Noise

  • From: <fangyi_rao@xxxxxxxxxxx>
  • To: <ibis-macro@xxxxxxxxxxxxx>
  • Date: Mon, 20 May 2013 18:02:36 -0600

Experts;

While thinking about how to handle Rx_Noise in a redriver, I came to realize 
that Rx_Noise's behavior in the downstream channel can't be defined. As 
described in BIRD 123, Rx_Noise is the RMS of a Gaussian voltage noise, and its 
unit is Volt. That information is inadequate to determine how the noise is 
transferred when passing through a filter, because to compute the filtered 
noise one needs to know the input noise Power Spectral Density (PSD), in 
Volt^2/Hz (equivalently, the noise density in Volt/sqrt(Hz)), not just the RMS.
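
To illustrate the point numerically (just a rough Python sketch of my own, not 
anything from the spec; the particular filters below are arbitrary choices), 
two noise inputs with the same RMS but different spectral content come out of 
the same filter with different RMS:

# Rough sketch only: two inputs with the same RMS but different spectra.
# "Input B" is white noise pre-colored by an arbitrary 4-tap average and
# rescaled back to the same RMS; all filter choices here are illustrative.
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
n = 1_000_000
rx_noise = 0.01                              # Rx_Noise RMS in volts

def rms(x):
    return np.sqrt(np.mean(x**2))

# Input A: white Gaussian noise with RMS = rx_noise
a = rng.normal(0.0, rx_noise, n)

# Input B: colored Gaussian noise, rescaled to the same RMS
b = lfilter([0.25] * 4, [1.0], rng.normal(0.0, 1.0, n))
b *= rx_noise / rms(b)

# Same receiver-side low-pass filter (8-tap moving average) applied to both
h = np.ones(8) / 8
print("input RMS   A, B:", rms(a), rms(b))             # both ~ 0.01
print("output RMS  A, B:", rms(lfilter(h, [1.0], a)),
      rms(lfilter(h, [1.0], b)))                       # clearly different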

Take a simple example and consider a low-pass filter that performs a time 
average of the signal over a window of 1 UI. Assume N time points are used per 
UI. It's well known that the RMS of the average of N independent noise samples is

sqrt(N) * (1/N) * Rx_Noise = Rx_Noise / sqrt(N)

where 1/N is the weight of each sample in the average. The filter output noise 
therefore depends on the number of time points per UI, which is purely a 
simulation setting, so the result is ambiguous.
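
A quick Monte Carlo check of this example (again just an illustrative Python 
sketch; the sample counts are arbitrary) shows the per-UI average tracking 
Rx_Noise / sqrt(N), i.e. changing with the samples-per-UI setting:

# Rough Monte Carlo sketch of the 1-UI averaging example; the per-UI sample
# counts below are arbitrary simulation settings, which is exactly the problem.
import numpy as np

rng = np.random.default_rng(1)
rx_noise = 0.01                  # Rx_Noise RMS in volts
n_ui = 50_000                    # number of UIs simulated

for n_per_ui in (8, 32, 128):
    samples = rng.normal(0.0, rx_noise, (n_ui, n_per_ui))
    ui_avg = samples.mean(axis=1)            # output of the 1-UI averager
    print(n_per_ui,
          np.sqrt(np.mean(ui_avg**2)),       # measured output RMS
          rx_noise / np.sqrt(n_per_ui))      # Rx_Noise / sqrt(N)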

Am I missing something?

Fangyi
