
Why doesn't the chipset need PERR#?



Hi, experts,
We're using a chipset and I find that there is no PERR# pin on its PCI interface. In our system, the only PCI slot's PERR# pin is left unconnected (but with a pull-up resistor).
I read in the PCI spec that a chipset used on a mainboard is excluded from the requirement to implement the PERR# signal. According to the PCI spec and Tom Shanley's "PCI System Architecture", the chipset can certainly learn whether a parity error occurs when it performs a read. But when it initiates a write, how can the chipset find out that a parity error was detected by the target?
The author of "PCI System Architecture" explains this by assuming there are no PCI slots in the system and that no PCI device generates PERR#. Obviously, that is just an assumption! Can anybody tell me how a chipset learns that another device has detected a parity error?
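For reference, here is a minimal sketch (not from the original mail, just an illustration of the standard PCI 2.x reporting path I am asking about): when a target detects bad parity on a write it asserts PERR# and sets bit 15 (Detected Parity Error) of its own Status register, while the initiating master, if Parity Error Response is enabled in its Command register, samples PERR# and sets bit 8 (Master Data Parity Error) of its Status register. The status value below is made up, and reading a real one from configuration space is platform-specific (e.g. setpci or lspci -vvv on Linux).

/* Sketch: decode the two parity-related bits of a PCI Status register. */
#include <stdint.h>
#include <stdio.h>

#define PCI_STATUS_MASTER_DATA_PARITY  (1u << 8)   /* master observed PERR# (or asserted it on a read) */
#define PCI_STATUS_DETECTED_PARITY     (1u << 15)  /* device itself detected bad parity */

static void decode_status(uint16_t status)
{
    if (status & PCI_STATUS_DETECTED_PARITY)
        printf("  device itself detected a parity error\n");
    if (status & PCI_STATUS_MASTER_DATA_PARITY)
        printf("  as a master, device recorded a data parity error via PERR#\n");
    if (!(status & (PCI_STATUS_DETECTED_PARITY | PCI_STATUS_MASTER_DATA_PARITY)))
        printf("  no parity errors recorded\n");
}

int main(void)
{
    /* Example value only; a real value would come from config space offset 0x06. */
    uint16_t status = 0x8100;

    printf("Status = 0x%04x\n", status);
    decode_status(status);
    return 0;
}

A chipset with no PERR# pin can never observe the target's assertion, which is exactly why I don't see how it can ever set its own Master Data Parity Error bit for writes it initiates.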
I'm confused about this, and any light shed on it will be appreciated!
Thanks in advance.

Liu Yunfei