Handheld dry film thickness (DFT) gages are common inspection tools used by applicators and inspectors. With a little care and maintenance, both mechanical and electronic instruments can be relied upon to give many years of accurate, dependable service.
Good operation begins with reading the manual. All instruments have subtle operational differences. Record the make, model, serial number and date of purchase inside the manual and highlight maintenance and calibration tips.
There’s a good chance someone you know misunderstands the terms Calibration and Calibration Interval. They would be surprised to learn that not only can they not calibrate their gage, but there is usually no requirement for annual recertification either.
ASTM D7091 defines calibration as “the high-level, controlled and documented process of obtaining measurements on traceable calibration standards over the full operating range of the gage, then making the necessary gage adjustments (as required) to correct any out-of-tolerance conditions.” It goes on to point out that calibration, “is performed by the equipment manufacturer, their authorized agent, or by an accredited calibration laboratory in a controlled environment using a documented process.”
A calibration often results in the issuance of a document called a Certificate of Calibration (Figure 1). This document records the actual measurement results and all other information relevant to a successful instrument calibration, and clearly shows traceability to a national standard. Job specifications often require proof of a recent calibration.
Recalibration (or recertification) is periodically required throughout the life cycle of an instrument since the accuracy of most measuring devices degrades with use. A Calibration Interval is the established period between recalibrations of an instrument. As per the requirements of ISO 17025, most manufacturers do not include calibration intervals as part of Calibration Certificates. Why? Because they don’t know how frequently the gage is used, what environment it is used in, and how well it is looked after.
If you don't have experience with an instrument, one year is a good starting interval between calibrations. This can be adjusted with experience and regular verification (see below). Customers with new instruments can use the purchase date as the beginning of their first calibration interval. Because shelf life has a negligible effect on these instruments, the date printed on the calibration certificate itself matters little.
A calibration certificate does not guarantee accuracy will be maintained throughout the calibration interval. Myriad factors detrimentally affect gage operation as soon as you open the box. That is why most Standards require regular verification of accuracy.
To guard against measuring with an inaccurate gage, accuracy and operation should be verified before each use, typically at the beginning of every work shift. It should be rechecked when large numbers of measurements are being obtained, or if the gage is dropped or suspected of giving erroneous results.
Accuracy checks are performed by measuring traceable reference standards: either shims or coated metal standards. The average of a series of readings should be within the combined tolerances of both the gage and the reference standard (Figure 2).
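That acceptance check is simple to sketch in code. The readings, nominal value and tolerance band below are hypothetical, and the function name is illustrative rather than taken from any standard:

```python
def accuracy_check(readings, standard_value, combined_tolerance):
    """Verify gage accuracy on a reference standard: the average of a
    series of readings must fall within the combined tolerance of the
    gage and the standard (both stated by their manufacturers)."""
    average = sum(readings) / len(readings)
    return abs(average - standard_value) <= combined_tolerance

# Hypothetical: five readings on a 5.0 mil coated standard with a
# combined tolerance of +/- 0.15 mil
print(accuracy_check([5.1, 4.9, 5.0, 5.05, 4.95], 5.0, 0.15))  # True
```

Note that it is the average, not any single reading, that must fall inside the band; individual readings may legitimately scatter wider.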
Traceability is the ability to follow the result of a measurement through an unbroken chain of comparisons all the way back to a fixed international standard that is commonly accepted as correct. The chain typically consists of several appropriate measurement standards, each having greater accuracy and lower uncertainty than the standard it is used to verify.
Most dry film thickness gages are factory calibrated to perform well on flat, smooth carbon steel. Your application may be different. Generally, four conditions affect accuracy and must be corrected for: surface roughness, geometry (curvature, edge effect), composition (metal alloy, magnetic properties, temperature), and mass (thin metal).
To prevent these or other factors from causing gage inaccuracies, check that the average of a series of measurements on the uncoated substrate is within the gage tolerance at zero. Alternatively, verify the known thickness of a shim placed over that uncoated substrate.
Industry standards once advised against measuring closer than one or two inches from an edge. Modern probes can usually measure much closer; in fact, their accuracy generally diminishes only when the probe overhangs the edge.
You check this the way you check most other issues: by measuring the uncoated substrate and verifying that the average of a series of measurements is within gage tolerance at zero. Stripe coats are often best measured with microprobes designed for small surfaces.
Steel surfaces are frequently cleaned by abrasive impact prior to the application of protective coatings. Measuring on these surfaces is more complicated than for smooth surfaces. The effect on gage measurements increases with profile depth and also depends on the design of the probe and the thickness of the coating.
Users are taught that this “anchor pattern” can cause gages to read high (Figure 3). But when it comes to adjusting for this profile it seems every user has a favorite method. Which one is right?
SSPC-PA 2 proposes several solutions depending upon the instrument type and the particular situation. Similar methods are suggested by ASTM D7091 and ISO 19840.
Mechanical pull-off (Type 1) gages have non-linear scales that cannot be adjusted. Therefore, the average of at least ten bare surface measurements is calculated to generate a base metal reading (BMR). This value is subtracted from future coating thickness readings.
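The BMR procedure can be sketched as follows. The readings are hypothetical, and the function names are illustrative, not taken from SSPC-PA 2:

```python
def base_metal_reading(bare_readings):
    """Average of at least ten bare-surface readings, used as the
    base metal reading (BMR) for Type 1 gages."""
    if len(bare_readings) < 10:
        raise ValueError("take at least ten bare-surface readings")
    return sum(bare_readings) / len(bare_readings)

def corrected_dft(gage_reading, bmr):
    """Subtract the base metal reading from a coating thickness reading."""
    return gage_reading - bmr

# Hypothetical readings in mils on a blasted, uncoated surface
bare = [0.8, 1.0, 0.9, 1.1, 0.7, 0.9, 1.0, 0.8, 1.2, 0.6]
bmr = base_metal_reading(bare)                # 0.9 mil
print(round(corrected_dft(5.4, bmr), 2))     # a 5.4 mil reading corrects to 4.5
```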
Most electronic (Type 2) gages can be adjusted by the user following the manufacturer’s instructions. A common method is to simulate a coating covering the major peaks of the profile. A shim of known thickness is placed over the surface profile and measured. The gage is adjusted to match that shim’s thickness.
If access to the uncoated substrate is not possible, ISO 19840 has correction values to subtract from DFT readings over fine, medium and coarse ISO 8503 profile grades.
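A minimal sketch of that correction, assuming the values commonly cited from ISO 19840 (10 µm for fine, 25 µm for medium and 40 µm for coarse ISO 8503 grades); verify these against the current edition of the standard before relying on them:

```python
# Profile correction values in micrometres for ISO 8503 grades, as
# commonly cited from ISO 19840 -- confirm against the current edition
# of the standard before use.
ISO_19840_CORRECTION_UM = {"fine": 10, "medium": 25, "coarse": 40}

def corrected_dft_um(reading_um, profile_grade):
    """Subtract the profile correction from a DFT reading (micrometres)."""
    return reading_um - ISO_19840_CORRECTION_UM[profile_grade]

print(corrected_dft_um(300, "medium"))  # 275
```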
Now that we know it is common practice to adjust a gage to the thickness of a shim, it is important to be aware that doing so can add significant error to future gage readings.
Measuring instruments have stated accuracy or tolerance statements issued by the manufacturer. When you make gage adjustments on shims, resultant gage measurements are less accurate. For example, if the accuracy of a properly calibrated gage is ± 1% and the shim’s thickness is accurate to within ± 5%, the combined tolerance of the gage and the shim will be slightly greater than ± 5% as given by the formula in Figure 4.
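Assuming the Figure 4 formula is the usual root-sum-of-squares combination of independent tolerances, the arithmetic for this example works out as follows:

```python
import math

def combined_tolerance_pct(gage_tol_pct, shim_tol_pct):
    """Combine gage and shim tolerances as the square root of the
    sum of their squares (root-sum-of-squares)."""
    return math.sqrt(gage_tol_pct**2 + shim_tol_pct**2)

# Gage accurate to +/- 1 %, shim to +/- 5 %
print(round(combined_tolerance_pct(1.0, 5.0), 2))  # 5.1, slightly greater than 5 %
```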
Once the gage has been put into service, a surprising amount of trouble can be avoided with regular visual examinations of the probe. Look for obvious damage particularly to the measuring surface or to the probe cable. Constant-pressure probes should move up and down freely. Damaged, scratched or worn probes should be tested for accuracy on reference standards and replaced when necessary. Metal filings, dust and paint should be carefully removed with a cloth.
Avoid prolonged exposure to hot surfaces and allow the probe to cool between measurements. Respect rough surfaces by lowering the probe carefully and never drag it sideways unless the probe was designed for that use. Plastic shims of known thickness can be placed onto these surfaces to afford the probe some protection. Subtract the thickness of the shim from the measured thickness and be mindful of the additional measurement tolerance resulting from use of the shim.
Indications that a probe may need service include lower than expected readings (e.g., from probe tip wear), higher than expected readings (e.g., from foreign material stuck to the tip) and erratic measurements (e.g., from component failure).
Modern instruments are designed to reduce operator influence. But you may not know that damage can result from holding the probe improperly.
Gages come in all different shapes and sizes. Get to know the proper way to hold and operate your particular model. The majority of handheld instruments take one measurement at a time. Lift the probe away from the surface between measurements; dragging the probe reduces its life.
Constant-pressure mechanisms built into most modern electronic DFT gages ensure the probe settles perpendicular to the surface and eliminate the influence of operator pressure on the measurement result. Holding the probe improperly overrides these mechanisms and can reduce the life of the probe. It can cause high readings when the probe is tilted, or low readings when the probe is pressed into soft coatings.
Are readings affected by radio transmitters, residual magnetism from welding operations, large motors or cell phones? You might be surprised by what matters and what doesn’t.
Dry film thickness instruments that measure over steel operate on a magnetic principle, so it stands to reason that gage accuracy might be adversely affected by variations in the steel's inherent magnetic properties. One insidious problem is residual magnetism: magnetism left behind in steel after an external magnetic field is removed. Magnetic clamps and plasma cutting are two common sources, and buried pipes pick up magnetism from the earth's magnetic field over time. The effect is usually not pronounced and can be removed by degaussing. Check its effect on the gage by measuring zero on the uncoated steel (or the thickness of a shim placed on the uncoated steel).
Strong stray magnetic fields produced by electrical equipment can interfere with the operation of instruments that use magnetic principles. Erratic readings can result from measuring on electric motors, or near a large motor as it starts up. Strong electromagnetic emissions from radio towers or antennas may also interfere with instrument operation. To minimize the impact of external electromagnetic fields, ensure that your dry film thickness gage comes with a Declaration of Conformity, which confirms that the manufacturer has tested the instrument's electromagnetic immunity (EMC) against international standards such as EN 61326-1:2013.
As you can imagine, these are unlikely edge cases. Verifying gage operation on known reference standards will alleviate most concerns.
The words “reading” and “measurement” get used synonymously. SSPC-PA 2 makes an interesting distinction by defining a “reading” as a single instrument result and a “measurement” as the average of a series of readings.
A single reading should seldom be trusted whether you are making a thickness determination or adjusting to a shim. Repeated gage readings, even at points close together, often differ due to surface irregularities of the coating and the substrate. Debris on the surface, local emission interferences and improper operator technique are just some of the other things that can detrimentally influence results.
Get the reassurance that statistics provide. Take several readings. Discard any unusually high or low values that are not repeated consistently. The resultant average of the acceptable gage readings is considered to be the coating thickness measurement for that location.
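That practice can be sketched in code. The discard rule here, dropping single readings that sit far from the median and do not repeat, is one reasonable interpretation for illustration, not a prescription from any standard:

```python
def location_measurement(readings, discard_threshold=0.2):
    """Average a series of readings after discarding unusually high or
    low values. A reading is discarded if it differs from the median by
    more than discard_threshold (as a fraction of the median). The
    threshold is illustrative, not taken from any standard."""
    ordered = sorted(readings)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2 else
              (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    kept = [r for r in readings if abs(r - median) <= discard_threshold * median]
    return sum(kept) / len(kept)

# Hypothetical readings in mils; 9.8 is an unrepeated outlier and is dropped
print(round(location_measurement([5.1, 5.3, 9.8, 5.0, 5.2]), 2))
```

The result of the function is the coating thickness measurement for that location, in the SSPC-PA 2 sense of "measurement" described above.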
Modern instruments compensate for many sources of inaccuracy, but not all. Your best source of dry film thickness measurement knowledge is the manufacturer’s instructions backed by their technical support network and industry standards such as those issued by SSPC, NACE, ASTM and ISO.
DAVID BEAMISH is President of DeFelsko Corporation, a New York-based manufacturer of hand-held coating test instruments sold worldwide. He has a degree in Civil Engineering and has more than 25 years of experience in the design, manufacture, and marketing of these testing instruments in a variety of international industries including industrial painting, quality inspection, and manufacturing. He conducts training seminars and is an active member of various organizations including NACE, SSPC, ASTM and ISO.