Sensitivity of measurement is a measure of the change in instrument output that occurs when the input quantity, i.e., the quantity being measured, changes. Fig. 11.5 shows the output readings of an instrument plotted against the measured quantity of a certain variable, say, current in a circuit. Sensitivity, by definition, is the gradient or slope of the straight line drawn as in Fig. 11.5. The higher the slope, the higher the sensitivity. Sensitivity is high when there is a large deflection of the instrument pointer for a small value of the quantity being measured. For example, if the pointer deflects by 10° for an input voltage of 1 V, the sensitivity is 10°/V.
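The slope interpretation of sensitivity can be verified numerically. The short sketch below, with assumed (hypothetical) readings chosen to match the 10°/V example above, computes sensitivity as the change in output divided by the change in input.

```python
# Hypothetical calibration readings: input voltage vs. pointer deflection
input_voltage_V = [0.0, 1.0, 2.0, 3.0]      # quantity being measured (assumed values)
deflection_deg  = [0.0, 10.0, 20.0, 30.0]   # corresponding pointer deflection (assumed values)

# Sensitivity = gradient of the output-versus-input line
sensitivity = (deflection_deg[-1] - deflection_deg[0]) / (input_voltage_V[-1] - input_voltage_V[0])
print(f"Sensitivity = {sensitivity} deg/V")  # -> 10.0 deg/V, as in the example
```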
Resolution: While taking a measurement, if the input is slowly increased from a certain value, it may be found that the output does not change until a certain increment of the input is exceeded. This smallest increment of the input quantity is called the resolution. Hence, we may define resolution as the smallest change in the input quantity that can be detected by an instrument with certainty. For example, let us assume that a voltmeter has a uniform scale with 10 divisions and a full-scale deflection of 100 V, as shown in Fig. 11.6. If each division on the scale is further divided into 10 parts, the smallest change that can be read on the instrument is 1 V. The resolution of the instrument is therefore 1 V.
Figure 11.6 Illustration of the resolution of an instrument
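The resolution arithmetic in the example above can be restated as a small sketch. The values below are taken directly from the example (100 V full scale, 10 divisions, each split into 10 parts).

```python
full_scale_V    = 100.0  # full-scale reading of the instrument, in volts
major_divisions = 10     # divisions on the uniform scale
sub_divisions   = 10     # parts into which each division is further divided

# Resolution = full-scale value divided by the total number of readable steps
resolution_V = full_scale_V / (major_divisions * sub_divisions)
print(f"Resolution = {resolution_V} V")  # -> 1.0 V
```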

