
Re: 3.3V on a 5V bus




Let's say I'm designing a board to plug into a 5V PCI bus, but I have some
part that really *needs* 3.3 V.

Obviously, the spec requires me to put a regulator on my board to generate
3.3V from the 5V rails.

Am I supposed to connect "my" 3.3V back out to the 3.3V pins at the edge
of my card? (So that the system can use the host computer's 3.3V regulator,
if one *is* present, which is bound to have better EMI shielding, better
efficiency, better cooling, etc.)

This would be harmless if my regulator and a possible host regulator were
the only 3.3V regulators in the entire system, but I hear that current
sharing between multiple regulators on the same rail is difficult to get
right.

>>Date: Mon, 11 Nov 1996 09:55:17 +0100 (MET)
>>From: "Jens-Peter K. Jensen" <dantecmt/adm/432jpj@dantecmt.dk>
...
>> for a 5V bus with 4 PCI slots, what maximum current
>> should I expect to be drawn from the 3.3V

>Date: Mon, 11 Nov 96 08:16:08 EST
>From: Andy Ingraham <ingraham@wrksys.ENET.dec.com>
...
>This is a sticky question.
...
>Furthermore, section 4.3.4.1 in the PCI spec says that PCI expansion
>cards for use in the 5V bus, must generate their own 3.3V from one of
>the other supplies, due to the fact that the 3.3V supply might not be
>active.
>
>Therefore, in theory at least, the maximum 3.3V current you can expect
>to see, is zero.  Any card for a 5V bus that draws current from the
>3.3V supply, might be said to be technically non-compliant.
>
>The absolute maximum current would have been 7.6A per card, for cards
>at the 25W limit that drew all of it from the 3.3V supply.
>
>Regardless, you need to bypass the 3.3V pins near the connectors.
>
>Regards,
>Andy Ingraham
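The 7.6A figure Andy quotes follows directly from the PCI 25W per-card
limit. A minimal sketch of the arithmetic (the 4-slot total is my own
extrapolation from Jens-Peter's question, not a number from the spec):

```python
# Worst-case 3.3V current for a PCI card at the 25 W power limit,
# assuming it drew ALL of its power from the 3.3V rail.
card_power_w = 25.0     # PCI per-card power limit (watts)
rail_v = 3.3            # 3.3V supply rail

max_current_per_card = card_power_w / rail_v
print(f"per card: {max_current_per_card:.1f} A")    # about 7.6 A

# Hypothetical 4-slot system, every slot at the worst case:
slots = 4
print(f"4 slots:  {slots * max_current_per_card:.1f} A")
```

In practice, as noted above, a spec-compliant 5V-bus card generates its
own 3.3V and draws zero from that rail, so these are upper bounds, not
expected loads.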

<a href="mailto:d.cary@ieee.org">David Cary</a>
<a href="http://www.rdrop.com/~cary/">Future Technology</a>.
