Measurement accuracy of GVDA digital multimeters

Apr 07, 2022


Measurement accuracy of a smart multimeter

Accuracy refers to the maximum allowable error under specified operating conditions. In other words, accuracy indicates how close the DMM's measurement is to the actual value of the signal being measured.

For DMMs, accuracy is usually expressed as a percentage of the reading. For example, an accuracy of ±1 percent of reading means that when the digital multimeter displays 100.0 V, the actual voltage may be anywhere between 99.0 V and 101.0 V.

A detailed specification may add a number of counts to the basic accuracy. This figure is the number of counts by which the rightmost digit of the display may vary; one count equals one step of that digit. In the previous example, the accuracy might be stated as ±(1 percent + 2 counts). Therefore, if the DMM reads 100.0 V, the actual voltage will be between 98.8 V and 101.2 V.
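The worst-case band for a ±(percent + counts) specification can be computed directly. The sketch below is illustrative, not a manufacturer formula; the function name and the 0.1 V resolution (inferred from the 100.0 V display format) are assumptions:

```python
def dmm_bounds(reading, pct_of_reading, counts, resolution):
    """Worst-case bounds for a spec of +/-(pct_of_reading % + counts).

    One 'count' equals one step of the rightmost displayed digit,
    i.e. the resolution of the selected range.
    """
    error = reading * pct_of_reading / 100.0 + counts * resolution
    return reading - error, reading + error

# The article's example: 100.0 V displayed, +/-(1 % + 2 counts) spec,
# 0.1 V resolution on the assumed range
low, high = dmm_bounds(100.0, 1.0, 2, 0.1)
print(f"{low:.1f} V to {high:.1f} V")  # 98.8 V to 101.2 V
```

The count term matters most near the bottom of a range, where it can dominate the percent-of-reading term.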

The accuracy of an analog meter is specified in terms of full-scale error, not the displayed reading. The typical accuracy of an analog meter is ±2 percent or ±3 percent of full scale. The typical basic accuracy of a DMM is between ±(0.7 percent + 1 count) and ±(0.1 percent + 1 count) of reading, or even better.



