A version of this article appears in the January 2007 issue of Metal Finishing.

Calibration Terms - Coating Thickness Gages

The following discussion provides definitions, explanations, limitations and practical examples of metrology terminology as it relates to DeFelsko coating thickness measurement gages. Resources used to develop this document primarily include technical articles and standards published by international organizations such as SSPC, ISO, ANSI, AGA, NACE and ASTM.

In its Paint Application Specification No. 2 (PA-2), SSPC identifies two types of gages: Type 1 and Type 2. We describe them as follows…

       Type 1: Pull-Off Gages

In Type 1 pull-off gages (PosiPen or PosiTest), a permanent magnet is brought into direct contact with the coated surface. The force necessary to pull the magnet from the surface is measured and interpreted as the coating thickness value on a scale or display on the gage. The magnetic force holding the magnet to the surface varies inversely as a non-linear function of the distance between magnet and steel, i.e., the thickness of the dry coating. Less force is required to remove the magnet from a thick coating.
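
To make the inverse relationship concrete, a purely illustrative model (the constants k, c and n are hypothetical and depend on the magnet and probe geometry; this is not DeFelsko's published calibration curve) expresses the pull-off force F as a decreasing non-linear function of the dry film thickness t:

    F(t) = k / (t + c)^n,  with n > 0

The gage effectively inverts this relationship, mapping the measured pull-off force back to a thickness value on its scale.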

       Type 2: Electronic Gages

A Type 2 electronic gage (PosiTector 6000 or PosiTest DFT) uses electronic circuitry to convert a reference signal into coating thickness.

Characterization

Characterization is the process by which an electronic instrument is taught to convert the signals received from its probe into actual coating thickness measurements. The result of the characterization process is a calibration curve that is built into the instrument. Depending on its complexity, the curve may also include allowances for other influences such as ambient temperature.
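
As a rough illustration of the idea (the instrument's internal curve format is not published, and every name and number below is hypothetical), characterization can be pictured as anchoring a signal-to-thickness mapping on standards of known thickness:

    import numpy as np

    # Hypothetical characterization data: raw probe signal (arbitrary units)
    # recorded on traceable standards of known thickness (micrometers).
    signal_on_standards = np.array([0.95, 0.71, 0.52, 0.38, 0.27])
    known_thickness_um = np.array([0.0, 25.0, 75.0, 250.0, 1500.0])

    def thickness_from_signal(signal):
        """Convert a raw probe signal to a thickness reading via the
        stored curve. np.interp needs ascending x, so sort by signal."""
        order = np.argsort(signal_on_standards)
        return np.interp(signal, signal_on_standards[order],
                         known_thickness_um[order])

    print(thickness_from_signal(0.60))  # falls between the 25 and 75 um points

A real instrument may fit a smoother curve rather than a lookup table, but the principle of building the curve from traceable standards is the same.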

Each DeFelsko instrument is individually characterized using measurements taken on traceable calibration standards that cover the full range of the instrument. It is this feature that allows DeFelsko instruments to take meaningful measurements directly out of the box for most applications.

Reference Standards

A reference standard is a sample of known thickness against which a user may verify the accuracy of their gage. Reference standards are typically coating thickness standards or shims. They may or may not be traceable to a National or International registry. If agreed to by the contracting parties, a sample part of known (or acceptable) thickness may be used as a reference standard for a particular job.

Coating Thickness Standards

For most instruments, a coating thickness standard is typically a smooth metallic substrate with a nonmagnetic (epoxy) coating of known thickness that is traceable to national standards (NIST). The substrate is ferrous (steel) for magnetic gages or non-ferrous (aluminum) for eddy current and ultrasonic gages. High-accuracy coating thickness standards are used to characterize and calibrate gages as part of the manufacturing process. The same standards are available for purchase by customers to be used as calibration standards in a calibration lab or as check standards in the field or on the factory floor.

Coating thickness standards to be used with PosiTector 200 ultrasonic gages can also be solid plastic (polystyrene) blocks that have been machined to a flat smooth surface. In addition to a known thickness traceable to national standards, these standards also have a known sound velocity.

Coating Thickness Standards are purchased as accessories to help meet the increasing customer need to fulfill ISO/QS-9000 and in-house Quality Control requirements. Many customers find it more practical to verify the accuracy of their own gages in-house, rather than utilize DeFelsko's calibration services. To assist these customers, sets of calibration standards are available with nominal values selected to cover the range of each DeFelsko gage. All standards come with a Calibration Certificate showing traceability to NIST. In addition, DeFelsko makes calibration procedures available on our website.

Shims

A shim is a thin strip of non-magnetic plastic, metal or other material of known uniform thickness used to verify the operation of, and make adjustments to, dry film thickness gages. DeFelsko's shims are plastic. While a plastic shim is able to take the form of most substrates to be measured, the accuracy of a shim is more limited than that of our coating thickness standards. Therefore, when using a shim to make adjustments with Type 2 (electronic) gages, it is important to combine the tolerance of the shim with the tolerance of the gage when determining the accuracy of measurements.
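
As a worked example of combining tolerances (all figures are hypothetical, not DeFelsko specifications), a worst-case combination simply adds the shim tolerance to the gage tolerance:

    # Worst-case tolerance combination when adjusting a Type 2 gage on a shim.
    # All figures below are hypothetical examples.
    shim_nominal_um = 250.0
    shim_tolerance_um = shim_nominal_um * 0.05        # a +/-5% shim: +/-12.5 um
    gage_tolerance_um = shim_nominal_um * 0.01 + 1.0  # +/-(1% + 1 um): +/-3.5 um

    combined_um = shim_tolerance_um + gage_tolerance_um
    print(f"Readings on this shim are only meaningful to +/-{combined_um} um")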

Plastic shims are often used to adjust a gage in the intended range of use over the surface of the representative substrate material.

Shims are not recommended for use with Type 1 (mechanical pull-off) gages. Shims are usually fairly rigid and curved and do not lie perfectly flat, even on a smooth steel test surface. Near the pull-off point of the measurement with a mechanical gage, the shim frequently springs back from the steel surface, raising the magnet too soon and causing an erroneous reading.

When using a coating thickness gage, three steps ensure best accuracy…

1. Calibration
2. Verification
3. Adjustment

1. Calibration

Calibration is the high level, controlled and documented process of measuring traceable calibration standards over the full operating range of the gage and verifying that the results are within the stated accuracy of the gage. If necessary, gage adjustments are made to correct any out-of-tolerance conditions.

Calibrations are typically performed by the gage manufacturer or by a qualified laboratory in a controlled environment using a documented process. The coating thickness standards used in the calibration must be such that the combined uncertainties of the resultant measurement are less than the stated accuracy of the gage. Typically, a 4:1 ratio between the accuracy of the standard and the accuracy of the gage is sufficient. The outcome of the calibration process is to restore or realign the gage so that it meets or exceeds the manufacturer's stated accuracy.
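
As a hypothetical worked example of the 4:1 rule, consider a gage rated at +/-(1% + 1 um) being checked at a nominal thickness of 100 um:

    # 4:1 test accuracy ratio, with hypothetical figures.
    gage_accuracy_um = 0.01 * 100.0 + 1.0      # +/-(1% + 1 um) at 100 um = +/-2 um
    max_standard_uncertainty_um = gage_accuracy_um / 4.0
    print(max_standard_uncertainty_um)         # standard must be good to +/-0.5 um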

    Calibration Interval

A Calibration Interval is the established period between recalibrations (recertifications) of an instrument. As per the requirements of ISO 17025, DeFelsko does not include calibration intervals as part of Calibration Certificates issued with our coating thickness gages.

For customers seeking assistance in developing their own calibration intervals, we share the following experience. Non-age-related factors have proven to be more critical in determining calibration intervals. These factors are primarily the frequency of use, the application in question, and the level of care taken during use, handling and storage. For example, a customer that uses the gage frequently, measures on abrasive surfaces, or handles the gage roughly (i.e., drops the gage, fails to replace the cover on the probe tip for storage, or routinely tosses the gage into a toolbox) may require a relatively short calibration interval.

From both theoretical analysis and practical experience, the impact of temperature and moisture on the gage is minimal. In addition, manufacturing processes are designed to minimize post-calibration changes in gage performance. Even in the event of drift, the change in PosiTector 6000 measurements is typically linear and can thus be compensated for prior to use by the "zero" function of the gage.

DeFelsko recommends that customers establish gage calibration intervals based upon their own experience and work environment. Customer feedback suggests one year as a typical starting point. Furthermore, our experience suggests that customers purchasing a new instrument can safely use the instrument purchase date as the beginning of their first calibration interval. The limited effect of age minimizes the importance of the actual calibration certificate date.

    Calibration Certificate

A Calibration Certificate is a document that records actual measurement results and all other information relevant to a successful instrument calibration. Calibration Certificates clearly showing traceability to a national standard are included by DeFelsko with most new, recalibrated or repaired instruments.

    Traceability

Traceability is the ability to follow the result of a measurement through an unbroken chain of comparisons, all the way back to a fixed international or national standard that is commonly accepted as correct. The chain typically consists of several appropriate measurement standards, the value of each having greater accuracy and less uncertainty than the standards that follow it in the chain.

    Recalibration (Recertification)

Recalibration, also referred to as recertification, is the process of performing a calibration on a used instrument. Recalibrations are periodically required throughout the life cycle of an instrument since probe surfaces are subject to wear that may affect the linearity of measurements.

2. Verification of Accuracy

Verification is an accuracy check performed by the instrument user on known reference standards before the gage is used. Its purpose is to determine whether the coating thickness gage produces reliable values when compared against the combination of the gage manufacturer's stated accuracy and the stated accuracy of the reference standards. The process is intended to verify that the gage is still functioning as expected.

Verification is typically performed to guard against measuring with an inaccurate gage at the start or end of a shift, before taking critical measurements, when an instrument has been dropped or damaged, or whenever erroneous readings are suspected. If deemed appropriate by the contracting parties, initial agreement can be reached on the details and frequency of verifying gage accuracy.

If readings do not agree with the reference standard, all measurements made since the last accuracy check are suspect. In the event of physical damage, wear or high usage, or after the established calibration interval has elapsed, the gage should be removed from service and returned to the manufacturer for repair or calibration.
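
A minimal sketch of such an accuracy check, assuming hypothetical tolerances and readings, might look as follows:

    # Verification: does the gage read the standard within the combined tolerance?
    standard_nominal_um = 125.0
    standard_tolerance_um = 125.0 * 0.01        # e.g. a +/-1% standard
    gage_tolerance_um = 0.01 * 125.0 + 1.0      # e.g. a +/-(1% + 1 um) gage

    readings_um = [126.0, 124.5, 125.5]         # several checks on the standard
    average_um = sum(readings_um) / len(readings_um)
    limit_um = standard_tolerance_um + gage_tolerance_um

    if abs(average_um - standard_nominal_um) <= limit_um:
        print("Gage verifies: OK to measure")
    else:
        print("Out of tolerance: adjust, or remove the gage from service")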

The use of a check standard is not intended as a substitute for regular calibration and confirmation of the instrument, but it may prevent the continued use of an instrument that, within the interval between two formal confirmations, has ceased to conform to specification.

3. Adjustment (Optimization, Calibration Adjustment)

Adjustment is the physical act of aligning a gage's thickness readings (removal of bias) to match those of a known sample in order to improve the accuracy of the gage on a specific surface or within a specific portion of its measurement range.

In most instances it should only be necessary to check zero on an uncoated substrate and begin measuring. However, the effects of properties such as the substrate (composition, magnetic properties, shape, roughness, edge effects) and the coating (composition, mass, surface roughness), as well as ambient and surface temperatures, may require adjustments to be made to the instrument.

Most Type 2 gages can be adjusted on known reference standards such as coated parts or shims. However, Type 1 gages such as the PosiPen and PosiTest have non-linear scales. Since their adjustment features are linear, no adjustments should be made. Instead, the user should take a base metal reading (BMR) and subtract that value from the coating thickness reading.

With a Type 2 gage where an adjustment method has not been specified, a 1-pt adjustment is typically made first. If inaccuracies are encountered, a 2-pt adjustment should be made.

    1-Pt Adjustment (Offset, Correction Value)

1-pt adjustments involve fixing the instrument's calibration curve at one point after taking several readings on a known sample or reference standard. If required, a shim can be placed over the bare substrate to establish such a thickness. This adjustment point can be anywhere within the instrument's measurement range, though for best results it should be near the expected thickness to be measured.

Zeroing is the simplest form of 1-pt adjustment. It involves the measurement of an uncoated sample or plate. In a simple zero adjustment, a single measurement is taken and then the reading is adjusted to read zero. In an average zero adjustment, multiple measurements are taken, then the gage calculates an average reading and automatically adjusts that value to zero.
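
An average zero adjustment can be sketched as follows (the readings are hypothetical):

    # Average zero adjustment: measure an uncoated sample several times,
    # then shift all subsequent readings so that the average reads zero.
    zero_readings_um = [1.2, 0.8, 1.0, 1.1, 0.9]   # on the bare substrate
    offset_um = sum(zero_readings_um) / len(zero_readings_um)  # 1.0 um

    def adjusted(reading_um):
        return reading_um - offset_um

    print(adjusted(76.0))   # a coated-surface reading corrected to 75.0 um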

    2-Pt Adjustment

This method is preferred for very unusual substrate materials, shapes or conditions. It provides greater accuracy within a limited, defined range.

2-pt adjustments are similar to 1-pt adjustments except that the instrument's calibration curve is fixed at two known points after taking several readings on known samples or reference standards. The two thicknesses must be within the instrument's measurement range, and points are typically selected on either side of the expected coating thickness. An advantage of the PosiTector 6000 is its accuracy throughout its entire measurement range, which typically enables zero (uncoated) to be one of the two points used in a 2-pt adjustment.
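
In effect, fixing the curve at two points defines a straight-line correction between them. A minimal sketch, with hypothetical values:

    # 2-pt adjustment: fix the calibration curve at two known thicknesses.
    # Each pair is (true thickness, averaged gage reading); values hypothetical.
    lo_true, lo_read = 0.0, 1.0       # zero point: bare substrate reads 1.0 um
    hi_true, hi_read = 250.0, 246.0   # a 250 um standard reads 246.0 um

    slope = (hi_true - lo_true) / (hi_read - lo_read)

    def adjusted(reading_um):
        return lo_true + slope * (reading_um - lo_read)

    print(adjusted(120.0))  # a corrected reading between the two points (~121.4)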

Factory calibration adjustment settings can be restored at any time by performing a reset.

Base Metal Reading

The base metal reading (BMR) is the measured effect of substrate roughness on a coating thickness gage, whether caused by the manufacturing process (for example, castings) or by profile-producing surface preparation operations (for example, power tool cleaning, abrasive blast cleaning, etc.). Failure to compensate for the base metal effect can result in an overstatement of the true thickness of the coating. The base metal reading is measured, recorded and deducted from the thickness of each coat in order to correctly state the thickness of the coating over the surface roughness.

A BMR is a zeroing technique typically used with Type 1 (mechanical pull-off) gages on rough surfaces. Adjustments to a Type 1 gage are linear in nature; however, the scale of the gage is non-linear. It is thus important not to adjust the gage to read zero on the bare substrate.

The BMR is calculated as a representative value (average) of several measurements taken from several locations across a bare substrate.
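
For example (all readings hypothetical):

    # Base metal reading: average several pull-off readings on the bare,
    # profiled substrate, then deduct it from each coating reading.
    bare_readings_um = [28.0, 33.0, 30.0, 29.0, 31.0, 27.0, 32.0, 30.0]
    bmr_um = sum(bare_readings_um) / len(bare_readings_um)   # 30.0 um

    coating_reading_um = 180.0       # Type 1 gage reading on the coated part
    thickness_over_peaks_um = coating_reading_um - bmr_um
    print(thickness_over_peaks_um)   # 150.0 um above the surface profile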

Repeatability

Coating thickness gages are necessarily sensitive to very small irregularities of the coating surface or of the steel surface directly below the probe center. Repeated gage readings on a rough surface, even at points very close together, frequently differ considerably, particularly for thin films over a rough surface with a high profile.

Roughness

If a steel surface is smooth and even, its surface plane is the effective magnetic surface. If the steel is roughened, as by blast cleaning, the "apparent" or effective magnetic surface that the gage senses is an imaginary plane located between the peaks and valleys of the surface profile. Gages read thickness above the imaginary magnetic plane. If a Type 1 gage is used, the coating thickness above the peaks is obtained by subtracting the base metal reading (BMR). With a correctly adjusted Type 2 gage, the reading obtained directly indicates the coating thickness.
