Wednesday, November 16, 2011

Why does a voltmeter have to have a high resistance to measure an accurate value for an EM field?

This is in a circuit with a high-resistance voltmeter being used to measure the EM field of the cell. Please help!|||Electromagnetic fields are usually measured via the very small currents that are generated (induced) inside a metallic conductor (like a copper wire) when the conductor moves inside the field. These currents are very small in magnitude (unless you move the conductor at a very high speed, like turbines in power plants move the generator's armature), and since V=R/i, measuring a small current 'i' will show a very large voltage value on the voltmeter (larger than it can show on a small dial). So, for an accurate measurement, the scale (the range of values seen on a voltmeter) is reduced by reducing the V value, for which we increase the R value, which is the high resistance you are referring to.





Hope this helps!|||First of all, V = I × R, not R/I. I have no idea what harshsheath is talking about, nor does he.





Let's make this simple. What Wesley is really asking about is voltage, so the question is really: why do voltmeters need a high resistance value to accurately measure a voltage?



|||The answer is that the resistance of the voltmeter is placed in parallel with the resistance of whatever the voltmeter is measuring. If the meter's resistance were small, the reading would be well under the true voltage of what is being measured. The higher the resistance of the meter, the more accurate the reading.
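To put rough numbers on that parallel-loading argument, here is a short Python sketch. The values are made up for illustration (a 10 V source feeding two equal 10 kΩ resistors as a divider, so the true voltage across the lower resistor is 5 V; meter resistances of 10 kΩ versus 10 MΩ are assumed):

```python
def measured_voltage(v_source, r_top, r_under_test, r_meter):
    # The voltmeter's resistance appears in parallel with the resistor
    # under test, lowering the effective resistance of that branch.
    r_parallel = r_under_test * r_meter / (r_under_test + r_meter)
    # The divider then splits the source voltage across r_top and r_parallel.
    return v_source * r_parallel / (r_top + r_parallel)

true_v = 5.0
low_res_meter = measured_voltage(10.0, 10e3, 10e3, 10e3)   # 10 kΩ meter
high_res_meter = measured_voltage(10.0, 10e3, 10e3, 10e6)  # 10 MΩ meter
print(low_res_meter, high_res_meter)
```

With the 10 kΩ meter the reading drops to about 3.33 V, well under the true 5 V, while the 10 MΩ meter reads within a few millivolts of the true value.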



|||There is always what is referred to as an internal resistance in a battery. If the voltmeter had a low resistance, there would be a much higher current flow through the circuit, the internal resistance of the battery would itself have a voltage drop across it, and that would reduce the reading of the voltage drop across the voltmeter: which is to say, the voltmeter would yield too low a reading.|||Consider that electrical voltage is similar to water pressure, and a voltmeter is used to measure a drop in electrical pressure between two points in a circuit without contributing significantly to the current flow causing the pressure drop. You would not want a lot of current bypassing the circuit resistor you are trying to evaluate (similar to allowing a lot of water to flow past a valve in a pipe). Therefore, the voltmeter has a high resistance, but a calibrated meter very sensitive to the small current it has to draw. That is the opposite of an ammeter, in which you want all of the current (electrons) to flow through the meter with minimal voltage drop.|||In order to make a measurement, a voltmeter needs to draw some current from the voltage source it is trying to measure. If a voltmeter requires too much sample current, the act of making the measurement will affect the very voltage we are trying to measure, giving us an inaccurate reading.





A voltmeter that draws too much current causes a voltage drop to appear 'across' the cell's internal resistance, so that less voltage is available at the terminals where it is being measured.
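As a rough illustration of that effect (the numbers here are assumed, not from the question: a 1.5 V cell with 0.5 Ω internal resistance), the cell's internal resistance and the meter's resistance form a simple voltage divider:

```python
def terminal_voltage(emf, r_internal, r_meter):
    # The cell's internal resistance and the meter's resistance form a
    # series loop; the meter reads the voltage across itself, so the
    # internal resistance "steals" a share of the EMF.
    return emf * r_meter / (r_internal + r_meter)

emf, r_int = 1.5, 0.5                         # assumed cell EMF (V) and internal resistance (Ω)
low = terminal_voltage(emf, r_int, 100.0)     # 100 Ω meter: reads noticeably low
high = terminal_voltage(emf, r_int, 10e6)     # 10 MΩ meter: reads essentially the full EMF
print(low, high)
```

The 100 Ω meter reads about 1.4925 V, while the 10 MΩ meter reads 1.5 V to within a fraction of a microvolt.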





Therefore, the best voltmeters are those that draw as little current as possible in order to make a measurement, and these are the high-resistance or high-impedance types (normally with a high-impedance FET input stage).
