
RE: Tval test circuits



> Can anyone explain how the test circuits/loads for the Tval measurement
> were derived?
 
The test load for the 5V environment is the usual 50pF lumped load that has
been around since the TTL days.  It is a simple test circuit, and a
fair-to-poor approximation of several inches of trace and a few IC input
pins.

The 0pF load is used for minimum delays because capacitive loads tend to
add delay, so 0pF typically yields the smallest minimum Tval (the worst
case, or best case, depending on how you look at it).
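
To put rough numbers on that, here is a minimal sketch (Python) using a
simple RC charging model; the 25 ohm driver output resistance is an
assumed value for illustration, not anything from the spec:

    import math

    def rc_delay_50pct(r_out_ohms, c_load_farads):
        # Time for an RC-charged node to reach 50% of its final value:
        # t = R * C * ln(2).
        return r_out_ohms * c_load_farads * math.log(2)

    R_OUT = 25.0  # ohms, assumed driver output resistance

    for c_pf in (0.0, 10.0, 50.0):
        t = rc_delay_50pct(R_OUT, c_pf * 1e-12)
        print(f"{c_pf:5.1f} pF load -> ~{t * 1e9:.2f} ns added at the 50% crossing")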

A problem with such lumped loads is that real loads are transmission
lines.  When the driver's risetime was longer than the round-trip delay
of the line, the lumped load wasn't a bad approximation.  As risetimes
got faster, or lines longer, this approximation broke down.
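
As a sketch of that rule of thumb (the propagation delay is an assumed
typical FR-4 value, not from the spec):

    PROP_DELAY_PER_INCH = 0.18e-9  # s/inch, typical FR-4 (assumption)

    def lumped_ok(risetime_s, trace_len_inches):
        # The lumped approximation is reasonable while the risetime
        # exceeds the line's round-trip delay.
        round_trip_s = 2 * trace_len_inches * PROP_DELAY_PER_INCH
        return risetime_s > round_trip_s

    for tr_ns, length in ((5.0, 4.0), (1.0, 4.0), (1.0, 12.0)):
        verdict = ("lumped OK" if lumped_ok(tr_ns * 1e-9, length)
                   else "treat as a transmission line")
        print(f"{tr_ns} ns risetime, {length} inch trace: {verdict}")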

At the other extreme, a trace looks to the driver more like a resistance
(equal to the line's characteristic impedance, Zo) returned to the line's
previous voltage.  When switching from low to high, a 70 ohm trace would
look like a 70 ohm resistor to ground; and when switching from high (+5V)
to low, the trace would look like a 70 ohm resistor to +5V.  That is, the
magnitude of the output current is the same in both directions.  This
approximation holds for one round-trip delay of the trace, until the
signal bounces off the far end and returns to the driver.
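
Putting numbers on the 70 ohm example (a sketch assuming an ideal driver
that steps all the way between the rails; a real driver's output
impedance forms a divider with Zo, so the launched current is lower):

    Z0 = 70.0   # ohms, characteristic impedance from the example
    VDD = 5.0   # volts

    # During the first round trip the trace looks like Zo returned to its
    # previous voltage, so either edge sees the same current magnitude.
    i_rise = abs(VDD - 0.0) / Z0   # low-to-high: sourcing into 70 ohms at 0V
    i_fall = abs(0.0 - VDD) / Z0   # high-to-low: sinking from 70 ohms at +5V

    print(f"rising edge:  sources ~{i_rise * 1e3:.0f} mA")
    print(f"falling edge: sinks   ~{i_fall * 1e3:.0f} mA")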

So, when the PCI architects were adding 3.3V signaling to the PCI spec, they
used newer test loads that better measure the performance of the driver in
the first couple of nanoseconds as it switches.  The 25 ohm load represents
two 50 ohm (= minimum impedance) traces in parallel, which is what a driver
"sees" when it is positioned somewhere in the middle of the bus.

10pF is added in parallel with it, probably because test equipment always
has some stray capacitance that can never be made zero.
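
A quick check of that arithmetic, plus the stray capacitance:

    def parallel(*resistances):
        return 1.0 / sum(1.0 / r for r in resistances)

    z_load = parallel(50.0, 50.0)
    print(f"two 50 ohm traces in parallel -> {z_load:.0f} ohms")
    # With the 10pF of assumed tester strays across it, that 25 ohm
    # resistance is the 3.3V Tval test load described above.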

The minimum case (or slew rate) test load is a light load; again there's
that 10pF to account for the test equipment, in parallel with 500 ohms to
Vdd/2 for good measure (maybe it helps by forcing the driver to source or
sink some minimum amount of current).
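
If that guess is right, the minimum current the 500 ohm pull demands is
small; a quick sketch for 3.3V signaling:

    VDD = 3.3
    V_MID = VDD / 2   # the 500 ohm resistor is returned to Vdd/2
    R_TERM = 500.0

    i_source = (VDD - V_MID) / R_TERM   # output at Vdd: driver sources
    i_sink = (V_MID - 0.0) / R_TERM     # output at 0V: driver sinks

    print(f"at Voh = {VDD} V: driver sources {i_source * 1e3:.2f} mA")
    print(f"at Vol = 0 V:   driver sinks   {i_sink * 1e3:.2f} mA")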

Why only for 3.3V parts and not 5V parts too?  Probably because of momentum:
most of the IC industry was building 5V ICs then and using lumped loads in
their testers.  5V switching was on its way out, and faster IC processes
would migrate to lower switching voltages, so they left the 5V test load
alone and introduced the new load only for 3.3V signaling.