The following discussion provides definitions, explanations, limitations and practical examples of metrology terminology as it relates to DeFelsko Coating Thickness Measurement gages. Resources used to develop this document primarily include technical articles and standards published by international organizations such as SSPC, ISO, ANSI and ASTM. The intent is to develop a common platform of reference for DeFelsko documentation including literature, manuals, technical articles, correspondence and web materials.
In Type 1 pull-off (PosiTest or PosiPen) gages, a permanent magnet is brought into direct contact with the coated surface. The force necessary to pull the magnet from the surface is measured and interpreted as the coating thickness value on a scale or display on the gage. The magnetic force holding the magnet to the surface varies inversely as a non-linear function of the distance between magnet and steel, i.e., the thickness of the dry coating. Less force is required to remove the magnet from a thick coating.
A Type 2 electronic gage (PosiTector) uses electronic circuitry to convert a reference signal into coating thickness. Electronic ferrous gages operate on two different magnetic principles. Some use a permanent magnet that, when brought near steel, increases the magnetic flux density at the pole face of the magnet. Coating thickness is determined by measuring this change in flux density, which varies inversely with the distance between the magnet and the steel substrate. Hall elements and magnetoresistive elements positioned at the pole face are the most common means of measuring this change in magnetic flux density. However, the response of these elements is temperature dependent, so temperature compensation is required.
Other ferrous electronic gages operate on the principle of electromagnetic induction. A coil containing a soft iron rod is energized with an AC current thereby producing a changing magnetic field at the probe. As with a permanent magnet, the magnetic flux density within the rod increases when the probe is brought near the steel substrate. This change is detected by a second coil. The output of the second coil is related to the coating thickness. These gages also need temperature compensation due to the temperature dependence of the coil parameters.
Characterization is the process by which an instrument is taught to convert the signals received from its probe into actual coating thickness measurements. The result of the characterization process is a calibration curve that is built into the instrument. Depending on its complexity, the curve may also include compensation for other influences such as ambient temperature.
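The idea of a built-in calibration curve can be sketched as a lookup that maps a raw probe signal to a thickness value, constructed from readings taken on standards of known thickness. The following is a minimal illustration only; the signal/thickness pairs are invented, and actual instruments use proprietary curves with additional compensation terms.

```python
from bisect import bisect_left

# Hypothetical (signal, thickness-in-microns) pairs obtained during
# characterization on traceable standards; all values are invented.
CAL_POINTS = [(0.10, 0.0), (0.35, 25.0), (0.62, 75.0), (0.88, 250.0)]

def signal_to_thickness(signal: float) -> float:
    """Convert a raw probe signal to coating thickness by
    piecewise-linear interpolation between characterization points."""
    signals = [s for s, _ in CAL_POINTS]
    if signal <= signals[0]:
        return CAL_POINTS[0][1]
    if signal >= signals[-1]:
        return CAL_POINTS[-1][1]
    i = bisect_left(signals, signal)
    (s0, t0), (s1, t1) = CAL_POINTS[i - 1], CAL_POINTS[i]
    return t0 + (t1 - t0) * (signal - s0) / (s1 - s0)
```

A signal falling exactly on a characterization point returns that point's thickness; signals in between are interpolated linearly, which is why characterization requires points spanning the full range of the instrument.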
Each DeFelsko instrument is individually characterized using measurements taken on traceable calibration standards that cover the full range of the instrument. It is this feature that allows DeFelsko instruments to take meaningful measurements directly out of the box for most applications.
A reference standard is a sample of known thickness against which a user may verify the accuracy of their gage. Reference standards are typically coating thickness standards or shims. If agreed to by the contracting parties, a sample part of known (or acceptable) thickness may be used as a thickness standard for a particular job.
For most instruments, a coating thickness standard is typically a smooth metallic substrate with a nonmagnetic (epoxy) coating of known thickness that is traceable to national standards (NIST). The substrate is ferrous (steel) for magnetic gages or non-ferrous (aluminum) for eddy current gages. High tolerance coating thickness standards are used to characterize and calibrate gages as part of the manufacturing process. The same standards are available for purchase by customers to be used as calibration standards in a calibration lab or as check standards in the field or on the factory floor.
Coating thickness standards to be used with ultrasonic gages are solid plastic (polystyrene) blocks that have been machined to a flat smooth surface. In addition to a known thickness traceable to national standards, these standards also have a known sound velocity.
Calibration Standards are purchased as accessories to help meet the increasing customer need to fulfill ISO/QS-9000 and in-house Quality Control requirements. Many customers find it more practical to calibrate their own gages in-house, rather than utilize DeFelsko's calibration services. To serve these customers, sets of calibration standards are available with nominal values selected to cover the full range of each DeFelsko gage. All standards come with a Calibration Certificate showing traceability to NIST.
A shim is a flat, non-magnetic (plastic) part of known thickness. While a shim can often conform to the shape of the substrate to be measured, its accuracy is more limited than that of a coating thickness standard. Therefore, when using a shim to make calibration adjustments with Type 2 (electronic) gages, it is important to combine the tolerance of the shim with the tolerance of the gage before determining the accuracy of measurements.
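Combining the shim tolerance with the gage tolerance can be done in more than one way; a short sketch of two common conventions is shown below. Adding the tolerances linearly gives a conservative worst case, while combining them in quadrature is appropriate when the two error sources are independent. The numeric values are illustrative only.

```python
def combined_tolerance(shim_tol: float, gage_tol: float,
                       worst_case: bool = True) -> float:
    """Combine shim and gage tolerances (same units, e.g. microns).

    worst_case=True adds the tolerances linearly (conservative);
    otherwise they are combined in quadrature (root-sum-square),
    which assumes the two error sources are independent."""
    if worst_case:
        return shim_tol + gage_tol
    return (shim_tol ** 2 + gage_tol ** 2) ** 0.5

# e.g. a +/-2 micron shim used with a +/-1 micron gage
print(combined_tolerance(2.0, 1.0))                              # 3.0
print(round(combined_tolerance(2.0, 1.0, worst_case=False), 2))  # 2.24
```

Either way, the combined figure, not the gage tolerance alone, is the accuracy that should be claimed for measurements made after a shim-based adjustment.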
Shims are not recommended for use with Type 1 (mechanical pull-off) gages. Such shims are usually fairly rigid and curved and do not lie perfectly flat, even on a smooth steel test surface. Near the pull-off point of the measurement with a mechanical gage, the shim frequently springs back from the steel surface, raising the magnet too soon and causing an erroneous reading.
Calibration is the controlled and documented process of measuring traceable calibration standards and verifying that the results are within the stated accuracy of the gage. Calibrations are typically performed by the gage manufacturer or by a qualified laboratory in a controlled environment using a documented process. The coating thickness standards used in the calibration must be such that the combined uncertainties of the resultant measurement are less than the stated accuracy of the gage.
A Calibration Interval is the established period between recalibrations of an instrument. As per the requirements of ISO 17025, DeFelsko does not include calibration intervals as part of Calibration Certificates issued with our PosiPen, PosiTest, PosiTector 6000 and 100 coating thickness gages.
For customers seeking assistance in developing their own calibration intervals, we share the following experience. Factors unrelated to shelf life have proven more critical in determining calibration intervals. These factors are primarily the frequency of use, the application in question, and the level of care taken during use, handling and storage. For example, a customer that uses the gage frequently, measures on abrasive surfaces, or uses the gage roughly (i.e. drops the gage, fails to replace the cover on the probe tip for storage, or routinely tosses the gage into a tool box for storage) may require a relatively short calibration interval. From both theoretical analysis and practical experience, the impact of temperature and moisture on the gage is very minimal. In addition, manufacturing processes are designed to minimize post-calibration changes in gage performance. Even in the event of drift, the change in gage measurement is typically linear and is thus compensated for prior to use by the "zero" function of the gage.
Though DeFelsko recommends that customers establish gage calibration intervals based upon their own experience and work environment, customer feedback suggests one year as a typical starting point. Furthermore, our experience suggests that customers purchasing a new instrument can safely utilize the instrument purchase date as the beginning of their first calibration interval. The minimal effect of shelf life minimizes the importance of the actual calibration certificate date.
A Calibration Certificate is a document that records actual measurement results and all other information relevant to a successful instrument calibration. Calibration Certificates clearly showing traceability to a national standard are included by DeFelsko with every new, recalibrated or repaired instrument.
Traceability is the ability to follow the result of a measurement through an unbroken chain of comparisons, all the way back to a fixed international or national standard that is commonly accepted as correct. The chain typically consists of several appropriate measurement standards, each having greater accuracy and lower uncertainty than the standard it is used to verify.
Recalibration, also referred to as recertification, is the process of performing a calibration on a used instrument. Recalibrations are periodically required throughout the life cycle of an instrument since probe surfaces are subject to wear that may affect the linearity of measurements.
In theory, customers with traceable thickness reference standards and copies of the calibration procedures available from DeFelsko's website can recalibrate their own gages. In practice, however, customers are limited by the requirements of their own quality systems as well as by their ability to control the conditions of the recalibration.
Calibration verification is an accuracy check performed by the instrument user on known reference standards covering the expected range of coating thickness. The process is intended to verify that the gage is still functioning as expected.
Verification is typically performed to guard against measuring with an inaccurate gage at the start or end of a shift, before taking critical measurements, when an instrument has been dropped or damaged, or whenever erroneous readings are suspected. If deemed appropriate by the contracting parties, initial agreement can be reached on the details and frequency of verifying gage accuracy. If readings do not agree with the reference standard, all measurements made since the last accuracy check are suspect. In the event of physical damage, wear, high usage, or after an established calibration interval, the gage should be removed from service and returned to the manufacturer for repair or calibration. The use of a checking measurement standard is not intended as a substitute for regular calibration and confirmation of the instrument, but its use may prevent the use of an instrument which, within the interval between two formal confirmations, ceases to conform to specification.
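The verification check described above amounts to confirming that every reading on a reference standard falls within an agreed tolerance of the standard's nominal thickness. A minimal sketch, with invented numbers, might look like this:

```python
def verify_gage(readings, nominal, tolerance):
    """Return True if every verification reading falls within
    +/- tolerance of the reference standard's nominal thickness."""
    return all(abs(r - nominal) <= tolerance for r in readings)

# Three checks on a hypothetical 50-micron standard, +/-2 micron limit
print(verify_gage([49.1, 50.6, 51.2], nominal=50.0, tolerance=2.0))  # True
print(verify_gage([49.1, 53.4, 51.2], nominal=50.0, tolerance=2.0))  # False
```

A failed check means all measurements made since the last successful check are suspect, which is why verification is typically performed at the start and end of each shift.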
Calibration adjustment is the alignment of a gage's thickness readings (removal of bias) to match those of a known sample in order to improve the accuracy of the gage on a specific surface or in a specific portion of its measurement range.
In most instances it should only be necessary to check zero on an uncoated substrate and begin measuring. However the effects of properties such as substrate (composition, magnetic properties, shape, roughness, edge effects) and coating (composition, surface roughness), as well as ambient and surface temperatures may require adjustments to be made to the instrument.
Most Type 2 gages can be adjusted on known reference standards such as coated parts or shims. However, Type 1 gages such as the PosiPen and PosiTest have nonlinear scales. Since their adjustment features are linear, no adjustments should be made. Instead, the user should take a base metal reading (BMR).
With a Type 2 gage where a calibration adjustment method has not been specified, a 1-Pt Calibration Adjustment is typically made first. If inaccuracies are encountered then a 2-Pt Calibration Adjustment should be made.
1-pt calibration adjustments involve fixing the instrument's calibration curve at one point after taking several readings on a known sample or reference standard. If required a shim can be placed over the bare substrate to establish such a thickness. This adjustment point can be anywhere within the instrument's measurement range, though for best results should be selected near the expected thickness to be measured.
Zeroing is the simplest form of 1-pt adjustment. It involves the measurement of an uncoated sample or plate. In a simple zero calibration adjustment, a single measurement is taken and the reading is adjusted to read zero. In an average zero calibration adjustment, multiple measurements are taken; the gage then calculates an average reading and automatically adjusts that value to zero.
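The arithmetic behind a 1-pt (including average zero) adjustment can be sketched as computing a single offset from the average of several readings on a sample of known thickness, then subtracting that offset from subsequent readings. The readings below are invented for illustration:

```python
def one_point_offset(readings, known_thickness):
    """Offset to subtract from future readings so that the gage's
    average reading matches the known thickness at the adjustment point."""
    avg = sum(readings) / len(readings)
    return avg - known_thickness

# Average zero adjustment: readings on an uncoated plate (known value 0)
offset = one_point_offset([1.5, 0.5, 1.0], known_thickness=0.0)
print(offset)        # 1.0
print(26.0 - offset) # 25.0 -- a subsequent reading, corrected
```

For best results the adjustment point should sit near the thickness range to be measured, since a single offset cannot correct errors that vary across the range.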
2-pt calibration adjustments are similar to 1-pt except that the instrument's calibration curve is fixed at two known points after taking several readings on known samples or reference standards. The two thicknesses must be within the instrument's measurement range. Typically, points are selected on either side of the expected coating thickness. An advantage of the PosiTector 6000 is its accuracy throughout its entire measurement range. This typically enables zero (uncoated) to be one of the two points used in a 2-pt calibration.
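Fixing the calibration line at two points amounts to solving for a gain and an offset, so that both known thicknesses read correctly and readings in between are corrected proportionally. A minimal sketch, with invented readings:

```python
def two_point_adjust(raw_low, raw_high, true_low, true_high):
    """Return a function mapping raw readings to adjusted thickness,
    with the calibration line fixed at the two known points."""
    gain = (true_high - true_low) / (raw_high - raw_low)
    return lambda raw: true_low + gain * (raw - raw_low)

# Gage reads 1.0 on an uncoated plate (true 0) and 51.0 on a
# 50-micron standard; the numbers are invented for illustration.
adjust = two_point_adjust(1.0, 51.0, 0.0, 50.0)
print(adjust(26.0))  # 25.0
```

Selecting the two points on either side of the expected coating thickness, as the text recommends, keeps the measurement inside the corrected span rather than extrapolating beyond it.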
The base metal reading (BMR) is a zeroing technique to be used with Type 1 (mechanical pull-off) gages on rough surfaces. Because adjustments to a Type 1 gage are linear while the scale of the gage is non-linear, the gage should not be adjusted to read zero on the bare substrate. Instead, the BMR should be determined on an uncoated part and subtracted from the readings obtained on the coated part. The BMR is calculated as a representative value (average) of several measurements taken at several locations across the bare substrate.
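The BMR procedure reduces to an average and a subtraction, as sketched below with invented readings (a real procedure would follow the measurement counts and locations agreed for the job):

```python
def base_metal_reading(bare_readings):
    """Representative BMR: the average of several Type 1 gage readings
    taken at different locations on the bare, roughened substrate."""
    return sum(bare_readings) / len(bare_readings)

def corrected_thickness(coated_reading, bmr):
    """Coating thickness above the profile peaks."""
    return coated_reading - bmr

bmr = base_metal_reading([12.0, 15.0, 14.0, 13.0, 16.0])
print(bmr)                             # 14.0
print(corrected_thickness(89.0, bmr))  # 75.0
```

The subtraction removes the apparent thickness contributed by the surface profile, leaving the coating thickness above the peaks.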
If a steel surface is smooth and even, its surface plane is the effective magnetic surface. If the steel is roughened, as by blast cleaning, the "apparent" or effective magnetic surface that the gage senses is an imaginary plane located between the peaks and valleys of the surface profile. Gages read thickness above the imaginary magnetic plane. If a Type 1 gage is used, the coating thickness above the peaks is obtained by subtracting the base metal reading. With a correctly adjusted Type 2 gage, the reading obtained directly indicates the coating thickness.