Vladimir,

Ignore bits at the Tx is slightly different than ignore bits at the Rx; both the Tx and the Rx may have their own Ignore_Bits requirement.

Scott is absolutely correct that one needs to include the delay of the channel before one can analyze the output of the Rx. Adding in the correction for the channel delay, I would say that the "simulation failed" in the sense that the .ami file is telling you that all should be OK after N2. Since, in your example, there are no clock ticks until after this time (corrected for channel delay), there is something wrong: very possibly a bug in the DLL, a bug in the .ami file, or a channel and stimulus that cause the model to fail.

I do not know how to design a CDR, and I do not know how to write a CDR simulator, but I do understand a little about clocks. As I understand it, an Rx has a clock generator in it that generates a clock that is close to, but not necessarily exactly, the frequency of the Tx clock generator. The difference between these frequencies is normally reported in parts per million. I also understand that clocks always run, and that a CDR adjusts the phase of the clock. So I can imagine a model not "locking" onto the data, and therefore the decision point not being in the center of the eye, but I would still expect to see clock ticks at roughly the bit-time period. I cannot imagine designing a CDR, or writing a CDR simulator, that is incapable of returning clock ticks at what the receiver thinks the clock period is, but I am just an eight hundred pound gorilla in the room.

Walter

Walter Katz
303.449-2308
Mobile 720.333-1107
wkatz@xxxxxxxxxx
www.sisoft.com
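[A minimal sketch of the clocking behavior Walter describes, assuming a simple bang-bang loop. The period, ppm offset, loop gain, and the use of the ideal Tx grid as a stand-in phase reference are all illustrative, not from the spec or from this thread. The point is that a tick comes out every (approximate) bit time whether or not the loop has locked; only the tick phase is adjusted.]

/* Toy bang-bang CDR sketch (illustrative only, not spec behavior):
 * the recovered clock always ticks at the receiver's local period
 * estimate, and the loop merely nudges the tick phase toward the
 * data edges.  Locked or not, a tick is emitted every (approximate)
 * bit time, which is the behavior Walter expects from clock_times. */
#include <stdio.h>

#define NTICKS 16

int main(void)
{
    double tx_period = 100.0e-12;                   /* Tx bit time, s     */
    double rx_period = tx_period * (1.0 + 200e-6);  /* +200 ppm Rx clock  */
    double phase     = 0.0;                         /* phase correction   */
    double gain      = 0.05;                        /* bang-bang gain     */
    double clock_times[NTICKS];

    for (int i = 0; i < NTICKS; i++) {
        /* Tick is emitted unconditionally at the local period ...        */
        double tick = i * rx_period + phase;
        /* ... and the phase detector compares it with the nearest ideal
         * Tx grid point (a stand-in here for a real edge detector that
         * would look at the received waveform).                          */
        double target = (double)((long)(tick / tx_period + 0.5)) * tx_period;
        phase += gain * (target - tick);            /* early/late nudge   */
        clock_times[i] = tick;
        printf("tick %2d at %9.3f ps, delta %8.3f ps\n",
               i, tick * 1e12,
               (i ? (clock_times[i] - clock_times[i - 1]) : 0.0) * 1e12);
    }
    return 0;
}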
-----Original Message-----
From: ibis-macro-bounce@xxxxxxxxxxxxx [mailto:ibis-macro-bounce@xxxxxxxxxxxxx] On Behalf Of Scott McMorrow
Sent: Thursday, April 15, 2010 2:52 PM
To: vladimir_dmitriev-zdorov@xxxxxxxxxx
Cc: ibis-macro@xxxxxxxxxxxxx
Subject: [ibis-macro] Re: Ignore bits & clock times

Vladimir,

Well, the first question is really: what is a "bit"? Is a bit a complete symbol of length Bit_Time, where the first bit is:

a) defined with respect to time = 0
b) defined with respect to the initial Tx differential crossing
c) defined with respect to the initial Rx differential crossing
d) defined with respect to the initial clock_times value
e) defined with respect to the EDA Tx stimulus
f) defined with respect to the arrival of the initial received bit at the Rx

Only cases e and f are unambiguous. But case e does not make sense, because at model development time the receiver model has no idea how long the channel delay might be. In that case, f is the correct answer. However, the bit stream seen at the receiver is delayed by the transmitter and the channel, in which case how do you know where the first received bit starts, since it may not include any transitions?

The solution is for the EDA software to characterize the latency of the transmitter and channel, and add this to the number of transmit bit times required before processing the receiver waveforms. If you accept this, then it does not matter that clock_times starts after the necessary Ignore_Bits delay: it would simply mean that even after the correct delay period, the CDR has still not output a valid sample pulse. This is a bit error, and it can happen when the received link amplitude is reduced to the point where the CDR does not lock. This is quite common in fiber and long copper cable interconnects.

Scott

Dmitriev-Zdorov, Vladimir wrote:

Ignore_Bits: "This value tells the EDA platform how many bits of the AMI_Getwave output should be ignored."

If clock times start from the value N1 * Bit_interval and Ignore_Bits equals N2, what is the specified behavior in the following cases?

1. N1 < N2 (evident)
2. N1 > N2 (it appears that the position of the first defined sampling point forces more than N2 bits to be excluded from processing)

Does the spec need to detail this?

--
Scott McMorrow
Teraspeed Consulting Group LLC
121 North River Drive
Narragansett, RI 02882
(401) 284-1827 Business
(401) 284-1840 Fax
http://www.teraspeed.com

Teraspeed® is the registered service mark of Teraspeed Consulting Group LLC
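[To make the N1/N2 cases concrete: a minimal sketch of the delay-corrected check discussed above, in the spirit of Scott's latency characterization and Walter's "all should be OK after N2". All names and numbers here (bit_time, ignore_bits, channel_latency, first_tick) are illustrative; none of this is spec API.]

/* Sketch of the delay-correction bookkeeping discussed in this thread
 * (illustrative names and values, not spec API): the EDA tool measures
 * the Tx-plus-channel latency, adds it to the Ignore_Bits interval,
 * and then checks whether the model's first clock tick (N1 * bit_time)
 * lands within that window.  A first tick well beyond it is the
 * N1 > N2 case: a model bug, or a CDR that never locked.             */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double bit_time        = 100.0e-12; /* UI, seconds                    */
    long   ignore_bits     = 1000;      /* N2, from the .ami file         */
    double channel_latency = 3.7e-9;    /* characterized Tx+channel delay */
    double first_tick      = 1.45e-7;   /* first clock_times entry, N1*UI */

    /* Earliest time by which clock ticks must be valid: Ignore_Bits
     * bit times plus the characterized latency, rounded up to a UI.   */
    double valid_after = ceil((double)ignore_bits + channel_latency / bit_time)
                         * bit_time;

    if (first_tick <= valid_after)
        printf("OK: ticks begin within the Ignore_Bits window\n");
    else
        printf("Suspect: first tick is %.1f UI past the end of the "
               "Ignore_Bits window (N1 > N2) -- model bug, or CDR "
               "not locked?\n",
               (first_tick - valid_after) / bit_time);
    return 0;
}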