RE: How to power a PCI card from either 3.3V OR 5V..
The same reason it was stupid not to require a 3.3V supply on PMC
slots in the first place: 3.3 Volt linear regulators dump a lot of
heat. For example, if you need 5A @ 3.3V (16.5W, well under the
25W max), a linear regulator dropping 5V down to 3.3V dissipates
(5.0 - 3.3) * 5 = 8.5W, bringing the total to 25W, right at the
limit. If the 3.3V supply can be used instead of regulating the
5V down, that is a tremendous thermal savings, greatly increasing
the operating range and reliability. Putting a switch on a PMC
board to use 3.3V when it is available makes sense. Yes, it is a
hassle and can have sequencing, etc. problems. It is probably
easier just to use a switching regulator, which still wastes some
heat, takes up board space, and makes noise. But given that one
cannot rely on a 3.3V supply on PMC slots, the 3.3V pins are
pretty much worthless (except for AC bypassing ;-)
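The arithmetic above can be checked with a quick sketch (a rough illustration only; the voltages, the 5A load, and the 25W per-slot figure are taken from the example in the text):

```python
# Linear-regulator dissipation for the 5V -> 3.3V case discussed above.
V_IN = 5.0          # input rail being regulated down (V)
V_OUT = 3.3         # regulated output rail (V)
I_LOAD = 5.0        # load current drawn by the card (A)
PMC_LIMIT_W = 25.0  # per-slot power budget cited above (W)

p_load = V_OUT * I_LOAD                # power actually delivered to the load
p_regulator = (V_IN - V_OUT) * I_LOAD  # heat dumped by a linear regulator
p_total = p_load + p_regulator         # total drawn from the 5V rail

print(f"load: {p_load:.1f} W, regulator heat: {p_regulator:.1f} W, "
      f"total: {p_total:.1f} W (budget {PMC_LIMIT_W:.0f} W)")
```

Running the card directly from a 3.3V rail would eliminate the 8.5W of regulator heat entirely, which is the thermal argument being made.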
Ivor
At 10:19 AM 3/28/00 -0500, you wrote:
>> On the board there is going to be a 5V to 3.3V regulator, but if the
>>host is providing 3.3V power in the system, then I want to turn off the
>>output of the 5V-3.3V regulator and power the board from the hosts 3.3V
>>power.
>
>Why not just use your on-board regulator all the time?
>
>There is some advantage to using a common supply if your card will be in the
>3.3V signaling environment, since both output and input threshold levels
>track the supply voltage in that environment. However, not in the 5V
>signaling environment.
>
>A common supply also avoids power sequencing problems, but if your card is
>going to sense and switch supply voltages, it may have the same problems
>anyway.
>
>Regards,
>Andy