??? 10/07/10 09:44
#178966 - Resistors not good way to reduce voltage
Responding to: ???'s previous message
A series resistor works well when a 3.3V chip has 5V-compliant inputs. The series resistor limits the current surge when the signal toggles, and at the same time it reduces problems if the 3.3V chip (such as a processor) reverses the direction of the signal and actively drives against the 5V side. Also note that some chips marked as having 5V-compliant inputs require a series resistor anyway.
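To see why the series resistor helps, here is a small sketch of the worst-case arithmetic when a 5V output ends up driving into a 3.3V pin whose clamp (body) diode conducts. All the component values (1 kOhm resistor, 0.6 V diode drop) are assumed for illustration, not taken from any particular datasheet:

```python
# Hypothetical worst case: a 5 V output drives through a series
# resistor into a 3.3 V pin, and the pin's clamp/body diode conducts.
# The resistor is what bounds the fault current.

V_DRIVE = 5.0      # 5 V driver output (assumed)
V_RAIL = 3.3       # 3.3 V supply rail
V_DIODE = 0.6      # assumed forward drop of the clamp diode
R_SERIES = 1000.0  # assumed 1 kOhm series resistor

def clamp_current(v_drive, v_rail, v_diode, r_series):
    """Current pushed through the clamp diode, in amperes.

    Everything above (v_rail + v_diode) is dropped across the
    series resistor, so I = (V_drive - V_rail - V_diode) / R.
    """
    return max(0.0, (v_drive - v_rail - v_diode) / r_series)

i = clamp_current(V_DRIVE, V_RAIL, V_DIODE, R_SERIES)
print(f"{i * 1000:.1f} mA")  # (5 - 3.3 - 0.6) / 1000 -> 1.1 mA
```

With 1 kOhm the fault current stays near 1 mA, which most clamp diodes tolerate; with no resistor at all, only the driver's output impedance limits the current.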
A series resistor alone isn't enough if the 3.3V input is not 5V-compliant. You may get away with a series resistor and the body diodes clamping the voltage, but out-of-spec is still out-of-spec. And reducing the voltage with two resistors as a voltage divider means current flows continuously whenever the signal is held high. So in the end, resistors aren't really a good way to adapt the voltage.
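The divider's continuous current draw is easy to quantify. A quick sketch with assumed example values (1.8k over 3.3k, chosen only because they land near 3.3 V from a 5 V input):

```python
# Hypothetical voltage divider pulling a 5 V signal down toward 3.3 V.
# R1 is the upper (series) resistor, R2 the lower one to ground.
# Values are illustrative, not a recommendation.

R1 = 1800.0  # ohms (assumed)
R2 = 3300.0  # ohms (assumed)
V_IN = 5.0   # the 5 V logic-high level

def divider_out(v_in, r1, r2):
    """Unloaded divider output voltage."""
    return v_in * r2 / (r1 + r2)

def static_current(v_in, r1, r2):
    """Current that flows through the divider the whole time
    the signal is held high, in amperes."""
    return v_in / (r1 + r2)

print(f"{divider_out(V_IN, R1, R2):.2f} V")             # about 3.24 V
print(f"{static_current(V_IN, R1, R2) * 1000:.2f} mA")  # about 0.98 mA
```

Roughly a milliamp is wasted for every high signal, and scaling the resistors up to reduce that current makes the divider slower (the RC formed with the input capacitance grows), which is the trade-off that makes dividers a poor fit for fast logic lines.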