Question:
How do you measure electrical continuity of steel in concrete, and why is it important? BG
Answer:
As reinforced concrete structures age, the steel can become increasingly vulnerable to corrosion due to the ingress of chlorides or atmospheric carbon dioxide [1]. Electrochemical techniques can be used both to measure the corrosion and to control it. Reference electrode potential mapping is widely used on bridges exposed to deicing salts and on structures exposed to marine conditions to map the extent and risk of corrosion [2]. Cathodic protection and the lesser used techniques of realkalisation and chloride extraction are all well documented treatments covered by ISO and European standards [3,4,5]. All of these processes require that, for the area under investigation or treatment, there is direct metal-to-metal contact between all the steel bars in the reinforcing cage being assessed or protected. In the absence of such connections, stray currents may occur under cathodic protection, forming anodes where current leaves a disconnected reinforcing bar and causing corrosion. If electrochemical measurements are being taken, any separation between rebars can create a cell with its own potential, giving a misleading measurement of the steel to reference electrode potential.
This may seem a simple thing to measure, but out in the field with limited equipment it is important that operatives and engineers have clear method statements for carrying out measurements, and criteria for defining continuity, whether carrying out steel potential mapping with a reference electrode and high impedance voltmeter or installing a cathodic protection system. The problem is that concrete is a damp medium with high resistivity, and there are multiple parallel connections between reinforcing bars. Before a steel cage or other structure is embedded in concrete or immersed in water, it is easy to measure the electrical continuity accurately with a digital multimeter or a resistance meter such as a Megger or a Nilsson meter. Once the steel is embedded in concrete, and especially once corrosion has initiated, it is harder to be sure that there is metal-to-metal contact. Stirrup steels around the main bars in beams can be a particular problem once corrosion initiates. Some older structures have very light reinforcement, and even electrically separated mats of steel. Considerable effort may be needed to establish continuity, both for assessment and when applying cathodic protection.
This issue was addressed in 1990 by Jack Bennett, who invented the 'Elgard' anodes for cathodic protection of steel in concrete, along with many other products. As part of the development work, he carried out laboratory and field studies to ensure that steel bars were adequately bonded when impressed current cathodic protection was applied. Bennett presented his study and findings at a NACE conference committee meeting in the early 1990s but never published them; he did, however, circulate an internal Eltech memo of the work.
In researching the literature, I became aware that no one else had published anything on this subject, but that Jack's findings were being used in standards on cathodic protection of steel in concrete such as BS EN ISO 12696. I contacted Jack, who has now retired, and he, along with his former employers, agreed that the memo should be published.
I therefore transcribed it into a Structural Concrete Alliance Technical Note [6].
The memo states that Bennett found the use of a Nilsson meter gave inaccurate measurements, indicating continuity where none existed. Fortunately, he obtained more accurate measurements using a high impedance Fluke multimeter, and high impedance meters are always available on site when taking reference electrode potential measurements for investigation purposes and when installing cathodic protection systems.
It is interesting to note that Bennett found the most accurate method of determining continuity was to measure the DC potential difference between bars, which should be less than 1 mV. A slightly less accurate method was to measure the DC resistance, which should be less than 1 ohm. In both cases, the leads should be reversed and the readings repeated. In BS EN ISO 12696, the resistance technique appears to be given priority over the potential technique, which is not the priority that Bennett recommended. I would always recommend using both techniques, especially if there is any doubt about continuity.
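As a simple illustration of how these two criteria can be applied to readings taken with the leads in each direction, the following sketch shows the decision logic in Python; the function names and structure are illustrative only and are not taken from Bennett's memo or from the standard.

def continuous_by_potential(mv_forward, mv_reversed, limit_mv=1.0):
    # Both DC potential readings (leads normal and reversed) must be
    # below the 1 mV criterion to indicate metal-to-metal contact.
    return abs(mv_forward) < limit_mv and abs(mv_reversed) < limit_mv

def continuous_by_resistance(ohm_forward, ohm_reversed, limit_ohm=1.0):
    # Both DC resistance readings must be below the 1 ohm criterion.
    return ohm_forward < limit_ohm and ohm_reversed < limit_ohm

# Example: a bar pair reading 0.4/0.6 mV and 0.3/0.5 ohm passes both
# tests and can be treated as electrically continuous.
if continuous_by_potential(0.4, 0.6) and continuous_by_resistance(0.3, 0.5):
    print("Bar pair is electrically continuous")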
There has been discussion in standards committees about whether the criteria for potential difference or resistance should be higher or lower. My reaction has always been that when anyone can offer hard data we should consider it, but until someone does so, these are the criteria we should use. It would be good to see someone repeat or improve on Bennett's work, but until then it is what we have to go on to ensure we have electrical continuity in our reinforcement cages.
References
1. J. P. Broomfield, Corrosion of Steel in Concrete, 2nd Edition, Taylor and Francis, 2007.
2. ASTM C876 (2015) Standard test method for corrosion potentials of uncoated reinforcing steel in concrete.
3. BS EN ISO 12696 (2016) Cathodic protection of steel in concrete.
4. BS EN 14038-1 (2016) Electrochemical realkalization and chloride extraction treatments for reinforced concrete. Part 1: Realkalization.
5. BS EN 14038-2 (2021) Electrochemical realkalization and chloride extraction treatments for reinforced concrete. Part 2: Chloride extraction.
6. J. P. Broomfield (2021) The measurement of electrical discontinuity for steel in concrete subject to cathodic protection and other electrochemical treatments. Technical Note 29, Structural Concrete Alliance, Bordon, Hampshire.
John Broomfield
Ask the Expert
Question:
What errors are most likely to occur when measuring dry film thicknesses on steel, and how can they be avoided? PS
Answer:
Dry film thickness measurement probably causes more conversation and argument on site than anything else; this is normally borne out of the absence of an inspection test plan, or the lack of prior agreement between client, contractor and inspector.
Errors made during dry film testing occur for several reasons, including taking measurements before the paint or paint system is hard dry. A contractor will paint until the job for the day is done, often continuing well into the afternoon, and the inspector or supervisor will then take a measurement with a DFT gauge. If the paint is not 'hard dry', the probe will push into the coating, giving a lower reading than expected. Before taking a DFT reading, ensure the paint is hard dry by pushing a fingernail into it: if the fingernail leaves a depression, the paint is not hard dry; if no depression is left, it is.
However, the most common error is not calibrating the measuring device to the same blast profile as the uncoated steel. Often this is not possible, so alternative calibration methods need to be used, as described below (a short worked sketch of the arithmetic follows the list).
(1) Measure the blast profile before application of the paint; after applying the system, allow it to become hard, then measure the dry film thickness using a digital gauge (or other). Then subtract the blast profile to give the true DFT of the paint system. For example, if the steel surface has a typical profile after blasting of 50 microns and the applied paint measures 300 microns, the true DFT is 250 microns (covering the peaks of the blast).
(2) Calibrate the DFT gauge using a surface profile comparator to the expected surface profile, and then accept that the measured DFT is the correct reading.
(3) Calibrate the gauge on a piece of smooth steel, then measure the DFT of the paint and subtract 50 microns as the nominal blast profile.
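As a simple illustration of the arithmetic behind methods (1) and (3), the sketch below corrects a gauge reading for blast profile; the function name and the nominal 50 micron default are illustrative only.

def true_dft(measured_microns, profile_microns=50.0):
    # Correct a gauge reading taken over blasted steel by subtracting the
    # measured blast profile (method 1), or a nominal 50 micron profile
    # when the gauge was calibrated on smooth steel (method 3).
    return measured_microns - profile_microns

# Example from method (1): 300 microns measured over a 50 micron profile
print(true_dft(300.0, 50.0))  # 250.0 microns true DFT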
Whatever method is used, it must be agreed pre-contract and should be included in the inspection test plan. In the absence of a test plan, one should be created and accepted by all parties before painting commences; this prevents disagreements at a later stage.
Kevin Harold, Paintel Ltd
Answer:
Dry Film Thickness (DFT) is probably the single most important measurement made during inspection or quality control of a protective coating application. Even the most basic coating specification will inevitably require the DFT to be measured, as film thickness is considered the most important factor determining the durability and longevity of a coating system. The thickness of each coating layer in a system, and the total system DFT, have to be measured and recorded to show that the specified system will meet the desired durability.
There are many mistakes which can be, and often are, made when measuring DFTs. It is often believed to be a case of simply putting a probe on a coated substrate and taking the measurement, and that is where the numerous issues occur.
DFT is typically recorded with either a magnetic pull-off gauge ('banana' gauge, or Type 1) or an electromagnetic constant pressure probe gauge (Type 2). Both types of gauge are non-destructive (they will not damage the protective coating during the inspection) and are the most commonly used methods for measuring the film thickness of protective coatings.
The Type 1 gauge works by recording the magnetic force needed to pull the gauge off a ferrous substrate. A barrier or coating between the substrate and the gauge's magnet reduces this magnetic attraction, which can then be measured, i.e. the force required to pull the gauge magnet from the coated substrate is shown on the gauge as the film thickness of the coating material.
There are benefits in using a Type 1 gauge, but calibration often presents great challenges for the inspector and has caused major problems on projects; an example is determining the Base Metal Reading (BMR). For accurate calibration of Type 1 gauges, SSPC PA2 specifies that after calibration using a NIST test standard, or an equivalent DFT shim with traceable calibration, a measurement of the blast profile should be taken in order to achieve as accurate a DFT reading as possible. This is not 100% accurate and can affect the resulting DFT reading. The BMR measurement is carried out on the blasted surface using the Type 1 gauge and represents an imaginary magnetic line within the blast profile; this reading is always deducted from the final average DFT reading. Before use, the BMR and the deviation from the NIST standard must be measured and recorded, and the inspector should always remember that the gauge should only be used for non-metallic coatings on a metallic, ferrous substrate.
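As a simple illustration of the BMR deduction described above, the sketch below averages a set of gauge readings and deducts the BMR; the function name and example values are illustrative, and the full averaging and acceptance procedure of SSPC PA2 is more detailed than this.

def corrected_average_dft(readings_microns, bmr_microns):
    # Average a set of Type 1 gauge readings taken over the coating, then
    # deduct the Base Metal Reading (BMR) measured on the bare blasted
    # surface before painting.
    average = sum(readings_microns) / len(readings_microns)
    return average - bmr_microns

# Example: readings of 310, 295 and 305 microns with a 25 micron BMR
print(corrected_average_dft([310.0, 295.0, 305.0], 25.0))  # about 278.3 microns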
However, the Type 1 gauge is ideal for use in environments where the use of electronic instruments is difficult, e.g. flammable atmospheres in oil and gas production, and for underwater dry film coating thickness inspection.
Type 2 gauges, or constant pressure probe gauges, work by measuring changes in the magnetic flux within the probe of the gauge; the probe must remain in contact with the substrate during the reading or measuring process. The inspector should be aware that the following may affect any readings taken.
• The magnet should be clean and free from surface contaminants such as iron or steel grit; the inspector should also check that the substrate is free of any contaminants which may adhere to the magnet prior to and during DFT inspection.
• DFT readings should be taken only when the protective coating film is dry: if the coating is uncured or tacky, the film will hold the magnet past the point at which it should have detached.
• The inspector should note that vibration may cause the magnet to release prematurely, resulting in a falsely high or otherwise inaccurate reading.
• Readings should generally not be taken within 25 mm (1 inch) of an edge, as the magnetic fields in this area will interfere with the magnetic forces between the substrate and the gauge.
• Always ensure that you have a spare battery, or a Type 1 gauge as back-up.
As with the Type 1 gauge, the Type 2 gauge must be calibrated with a traceable DFT shim on an uncoated area in order to account for the blast profile before any measurements are taken.
There are also other instruments used for DFT measurements which damage the coating film. The most commonly used are the paint inspection gauge (PIG) or Tooke gauge. These instruments are termed destructive test methods because of the need to cut into the paint film to obtain the measurement.
A further issue with DFT measurements, and the one which causes the most problems on site, is the frequency of testing. The specification should always state the requirements for frequency of DFT measurements and the film thickness acceptance criteria, or at the very least cite a standard in accordance with which DFT measurements should be carried out. The number of measurements to be made is important to all parties involved in coating works; the contractor and inspector need to agree the requirements for such an important measurement so as not to confuse one another in the field.
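As an illustration of the kind of acceptance check a specification might define, the sketch below flags spot measurements falling outside a tolerance band around the specified thickness. The 80-120% band shown is an assumed example of the sort of criterion a specification or a standard such as SSPC PA2 may set; the governing contract documents define the real requirement.

def spot_within_tolerance(spot_dft, specified_dft, low=0.80, high=1.20):
    # Check a spot DFT against an assumed 80-120% tolerance band around
    # the specified thickness; the band is illustrative only.
    return low * specified_dft <= spot_dft <= high * specified_dft

# Example: spot readings checked against a 300 micron specified DFT
for spot in (250.0, 290.0, 370.0):
    print(spot, "OK" if spot_within_tolerance(spot, 300.0) else "out of tolerance")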
Lee Wilson, Corrtech Ltd