
Mechanical - Metrology and Measurements - Concepts of Measurement

Errors In Measurements

   Posted On :  23.09.2016 12:12 pm

It is never possible to measure the true value of a dimension; there is always some error. The error in measurement is the difference between the measured value and the true value of the measured dimension.







Error in measurement = Measured value - True value


The error in measurement may be expressed or evaluated either as an absolute error or as a relative error.


Absolute Error



True absolute error:


It is the algebraic difference between the result of measurement and the conventional true value of the quantity measured.


Apparent absolute error:


If a series of measurements is made, the algebraic difference between any one result of the series and the arithmetic mean of the series is known as the apparent absolute error.


Relative Error:


It is the quotient of the absolute error and the value of comparison used for the calculation of that absolute error. This value of comparison may be the true value, the conventional true value, or the arithmetic mean of a series of measurements. The accuracy of measurement, and hence the error, depends on many factors, such as:


- Calibration standard
- Workpiece
- Instrument
- Person
- Environment, etc.
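The definitions above can be sketched in a few lines of code. The 25 mm conventional true value and the series of readings below are assumed values for illustration only:

```python
# Hypothetical series of readings of a nominally 25 mm dimension.
true_value = 25.000  # conventional true value, mm
readings = [25.002, 24.998, 25.004, 25.001, 24.999]  # measured values, mm

mean = sum(readings) / len(readings)  # arithmetic mean of the series

# True absolute error: result of measurement minus the conventional true value.
true_abs_errors = [r - true_value for r in readings]

# Apparent absolute error: result of measurement minus the arithmetic mean.
apparent_abs_errors = [r - mean for r in readings]

# Relative error: absolute error divided by the value of comparison
# (here the conventional true value).
relative_errors = [(r - true_value) / true_value for r in readings]

print(f"arithmetic mean          = {mean:.4f} mm")
print(f"true absolute errors     = {[round(e, 4) for e in true_abs_errors]}")
print(f"apparent absolute errors = {[round(e, 4) for e in apparent_abs_errors]}")
print(f"relative errors          = {[round(e, 7) for e in relative_errors]}")
```

Note that the apparent absolute errors always sum to zero, since they are measured from the mean of the series itself.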



Types of Errors


1. Systematic Error


These errors include calibration errors, errors due to variation in atmospheric conditions, variation in contact pressure, etc. If properly analysed, these errors can be determined and reduced or even eliminated; hence they are also called controllable errors. All systematic errors except personal error can be controlled in magnitude and sense.

These errors result from a procedure that, although irregular, is consistent in action. They are repetitive in nature and of constant, similar form.



2. Random Error


These errors are caused by variations in the position of the setting standard and the workpiece, by displacement of the lever joints of the instrument, and by backlash and friction. The specific cause, magnitude, and sense of these errors cannot be determined from knowledge of the measuring system or the conditions of measurement. Because they are inconsistent from reading to reading, they are called random errors.



3. Environmental Error


These errors are caused by the effect of the surrounding temperature, pressure, and humidity on the measuring instrument. External factors such as nuclear radiation, vibration, and magnetic fields also lead to errors. Temperature plays an important role where high precision is required. For example, slip gauges handled with bare hands may acquire human body temperature while the workpiece remains at 20°C; a 300 mm length can then be in error by 5 microns, which is a considerable error. To avoid errors of this kind, metrology laboratories and standards rooms worldwide are maintained at 20°C.
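The slip-gauge temperature error follows the linear-expansion relation ΔL = α · L · ΔT. A minimal sketch, assuming a coefficient of about 11.5 × 10⁻⁶ per °C for gauge steel and a handling temperature rise of 1.5°C (both values are illustrative assumptions):

```python
# Thermal error sketch: change in length = alpha * L * (T - 20).
ALPHA_STEEL = 11.5e-6   # /°C, assumed coefficient of linear expansion for gauge steel
REFERENCE_TEMP = 20.0   # °C, standard metrology reference temperature

def thermal_error_mm(length_mm: float, actual_temp_c: float) -> float:
    """Length error, in mm, from measuring away from the 20°C reference."""
    return ALPHA_STEEL * length_mm * (actual_temp_c - REFERENCE_TEMP)

# A 300 mm slip-gauge stack warmed ~1.5°C by handling:
error_mm = thermal_error_mm(300.0, 21.5)
print(f"error = {error_mm * 1000:.1f} microns")  # roughly 5 microns
```

Even a degree or two of handling warmth is enough to produce an error of the order quoted above, which is why gauges are handled with tongs or gloves and allowed to soak to room temperature before use.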





Calibration

It is essential to calibrate an instrument in order to maintain its accuracy. When the measuring and sensing systems are separate, it is difficult to calibrate the system as a whole, so the error-producing properties of each component must be taken into account. Calibration is usually carried out by making adjustments such that the instrument reads zero for zero measured input, and reads a measured dimension at its closest accurate value. It is important that the calibration of any measuring system be performed under environmental conditions close to those under which the actual measurements will be taken.

Calibration is the process of checking the dimensions and tolerances of a gauge, or the accuracy of a measuring instrument, by comparing it with an instrument or gauge that has been certified as a standard of known accuracy. Calibration is repeated over a period of time decided by the usage of the instrument or by the materials of the parts from which it is made. The dimensions and tolerances of the instrument or gauge are checked to determine whether it can continue in use after calibration, or whether it has worn or deteriorated beyond the limit value; if so, it is scrapped. A gauge or instrument that is frequently used requires more maintenance and more frequent calibration. Calibration is done before use and again afterwards to verify that the instrument is within the tolerance limit. Certification is given by comparing the instrument or gauge with a reference standard whose calibration is traceable to an accepted National standard.
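The tolerance check at the heart of this process can be sketched as a comparison of instrument readings against a certified reference value. The function name, the 50 mm standard, the readings, and the 2 micron tolerance below are all illustrative assumptions, not a standard procedure:

```python
def is_within_tolerance(instrument_reading: float,
                        reference_value: float,
                        tolerance: float) -> bool:
    """True if the reading deviates from the certified reference value
    by no more than the permitted tolerance."""
    return abs(instrument_reading - reference_value) <= tolerance

# Checking a gauge against an assumed 50.000 mm certified standard,
# with an assumed permissible error of 2 microns (0.002 mm):
readings = [50.001, 49.9985, 50.004]
for reading in readings:
    ok = is_within_tolerance(reading, 50.000, tolerance=0.002)
    verdict = "pass" if ok else "FAIL - recalibrate or scrap"
    print(f"reading {reading:.4f} mm -> {verdict}")
```

A reading outside the limit flags the gauge for adjustment or, if it has worn beyond recovery, for scrapping, as described above.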

