Destabilisation Heat Treatment for High-Chromium White Iron

What Happens Inside High-Chromium Iron During Heat Treatment — and Why Getting It Wrong Is Irreversible

High-chromium white iron is the hardest ferrous wear material in common production use. Its abrasion resistance comes from the volume fraction of chromium carbides distributed through the microstructure — but achieving the full hardness potential of these alloys requires a heat treatment step that is frequently misunderstood and sometimes omitted. This article explains what destabilisation heat treatment does, how the critical parameters are selected, and what happens when they are wrong.


Why High-Chromium Iron Needs Heat Treatment

High-chromium white iron — typically GX260Cr27 or similar grades with 15–30% chromium and 2–4% carbon — derives its wear resistance from two sources: the chromium-rich carbides (M₇C₃ and M₂₃C₆, where M is predominantly chromium with some iron substitution) that form during solidification, and the hardness of the matrix in which those carbides are embedded. The carbides themselves are extremely hard (1,200–1,800 HV) and provide the primary abrasion resistance. But the matrix hardness also matters — a soft matrix allows carbides to be undercut and dislodged from the wear surface rather than abrading gradually.

As-cast high-chromium iron has a matrix that is austenitic or partially martensitic depending on alloy composition and section cooling rate. The austenitic matrix is relatively soft — typically 45–55 HRC as-cast — and contains dissolved carbon and chromium that were not incorporated into carbides during solidification. Destabilisation heat treatment is the process that converts this as-cast condition into the high-hardness martensitic matrix that the alloy is capable of achieving.

What Destabilisation Does

When a high-chromium iron casting is heated to the destabilisation temperature range (typically 900–1,050°C depending on composition), two things happen simultaneously. First, the matrix austenitises — it becomes a homogeneous face-centred cubic phase capable of dissolving carbon and chromium. Second, and critically, secondary carbides precipitate from the austenite. These secondary carbides remove carbon and chromium from the matrix, reducing its alloy content. On subsequent air cooling or quenching, this lower-alloy austenite transforms to martensite at a higher martensite start temperature than the original high-alloy as-cast austenite — producing a hard martensitic matrix rather than retained austenite.

The net result is a microstructure with primary carbides (unchanged from casting) plus secondary carbides (precipitated during destabilisation) in a martensitic matrix. Hardness after destabilisation typically increases to 58–67 HRC from the 45–55 HRC as-cast condition, depending on composition and process parameters.
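The dependence of martensite start temperature on matrix alloy content can be illustrated numerically. The sketch below uses the Andrews linear relation, an empirical formula derived for low-alloy steels (carbon below about 0.6 wt%); applying it to the austenitic matrix of a high-chromium iron is well outside its validated range, and the matrix compositions used are hypothetical round numbers, so the output shows only the direction and rough magnitude of the trend, not real transformation temperatures.

```python
def andrews_ms(c, mn=0.0, ni=0.0, cr=0.0, mo=0.0):
    """Andrews (1965) linear estimate of martensite start temperature (degC).

    Derived for low-alloy steels; used here only to illustrate why
    removing carbon and chromium from solution raises Ms. Inputs are
    wt% IN SOLUTION in the austenite, not the bulk casting analysis.
    """
    return 539 - 423 * c - 30.4 * mn - 17.7 * ni - 12.1 * cr - 7.5 * mo

# Hypothetical matrix compositions for illustration only:
ms_as_cast = andrews_ms(c=0.9, cr=12.0)  # carbon/Cr-rich as-cast austenite
ms_destab = andrews_ms(c=0.5, cr=9.0)    # leaner matrix after secondary
                                         # carbide precipitation

print(f"as-cast matrix Ms ~ {ms_as_cast:.0f} degC")
print(f"destabilised matrix Ms ~ {ms_destab:.0f} degC")
```

With these illustrative inputs the as-cast matrix Ms sits near room temperature (so cooling leaves mostly retained austenite), while the destabilised matrix Ms is roughly 200°C higher, allowing the transformation to martensite to run largely to completion on air cooling.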

Temperature Selection

Destabilisation temperature is the most critical process variable. The correct temperature range depends on the alloy composition — specifically the carbon and chromium content — and the section thickness of the casting.

At temperatures below the effective destabilisation range, insufficient secondary carbide precipitation occurs. The matrix retains too much carbon and chromium, its martensite start temperature remains low, and significant retained austenite remains after cooling. Hardness is lower than the alloy is capable of, and wear resistance is correspondingly reduced.

At temperatures above the effective range, secondary carbides begin to dissolve back into the matrix rather than precipitating. Carbon and chromium content of the austenite increases again, martensite start temperature drops, and retained austenite content increases. The carbide volume fraction that provides abrasion resistance is reduced. In extreme cases, partial melting of the eutectic carbide network can occur, permanently damaging the microstructure.

For GX260Cr27 (26–28% Cr, 2.4–2.8% C), the effective destabilisation range is typically 950–1,020°C. For lower-chromium grades (15–20% Cr), the range shifts lower, typically 900–980°C. These ranges are not universal — they are starting points that require verification for specific alloy heats, because composition variation within grade specification affects the optimal temperature. A foundry producing these alloys consistently should have empirical data correlating composition, destabilisation temperature, and achieved hardness for their specific production conditions.
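As a minimal illustration of how such empirical data might be encoded in a process-control check, the sketch below captures the starting-point ranges quoted above. The grade keys and the helper itself are hypothetical; any production version would hold foundry-verified ranges per alloy heat, not these generic figures.

```python
# Starting-point destabilisation ranges from the text (degC).
# Real values must be verified per heat against foundry hardness data.
DESTAB_RANGES_C = {
    "GX260Cr27": (950, 1020),      # 26-28% Cr, 2.4-2.8% C
    "15-20pct-Cr": (900, 980),     # lower-chromium grades
}

def setpoint_in_range(grade: str, furnace_setpoint_c: float) -> bool:
    """Return True if the furnace set point falls inside the
    starting-point destabilisation window for the grade."""
    lo, hi = DESTAB_RANGES_C[grade]
    return lo <= furnace_setpoint_c <= hi

print(setpoint_in_range("GX260Cr27", 980))    # inside the window
print(setpoint_in_range("GX260Cr27", 1060))   # overshoot territory
```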

Holding Time and Section Thickness

The casting must reach thermal equilibrium at the destabilisation temperature through its full section before the transformation reactions can proceed uniformly. Holding time is therefore determined by section thickness, not by clock time alone. Thin sections (under 30 mm) equilibrate quickly; thick sections (100 mm and above, common in large crusher liners and mill components) require extended holding times to ensure the core reaches the set temperature.

Insufficient holding time produces a hardness gradient through the section — correct hardness at the surface, lower hardness in the core where the temperature was not maintained long enough for complete secondary carbide precipitation. For wear parts where the wear surface extends through a significant depth (large hammer mill hammers, thick cone crusher liners), a through-section hardness gradient means that hardness decreases as the wear surface recedes, shortening the effective wear life of the component.

As a general guide, holding times of 4–6 hours at temperature are typical for sections up to 75 mm; 6–10 hours for sections up to 150 mm. These are conservative estimates — tighter values can be established empirically for specific component geometries.
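The guide figures above can be expressed as a simple lookup, sketched below. The thresholds are the article's conservative rules of thumb, not a standard; real cycle times should come from embedded-thermocouple trials on the actual component geometry.

```python
def hold_time_hours(section_mm: float) -> tuple:
    """Conservative holding-time guide (min_h, max_h) at destabilisation
    temperature, from the rules of thumb in the text.

    Sections over 150 mm fall outside the quoted guide and must be
    established empirically.
    """
    if section_mm <= 0:
        raise ValueError("section thickness must be positive")
    if section_mm <= 75:
        return (4, 6)
    if section_mm <= 150:
        return (6, 10)
    raise ValueError("establish hold time empirically for sections over 150 mm")

print(hold_time_hours(60))   # thin crusher liner section
print(hold_time_hours(120))  # heavy mill component section
```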

Cooling Method

After holding at the destabilisation temperature, the casting is cooled. The cooling rate must be sufficient to allow the destabilised austenite to transform to martensite rather than to pearlite or bainite. For most high-chromium iron grades with adequate alloy content, air cooling is sufficient — the alloy is hardenable in air, which is why it is sometimes referred to as an air-hardening alloy. Forced air cooling accelerates the process and is used for thicker sections or where production throughput requires shorter cycle times.

Quenching in oil or polymer solution is used for lower-alloy grades or very thick sections where air cooling is insufficient. Water quenching is generally avoided for high-chromium iron because the thermal shock of rapid quenching can cause cracking in already-brittle castings, particularly at section changes and sharp corners.

Some producers apply a tempering cycle after destabilisation and cooling — typically 200–260°C — to relieve quench stresses and reduce the risk of stress cracking in service. Tempering at these temperatures has minimal effect on hardness (a reduction of 1–3 HRC is typical) but can significantly reduce the risk of catastrophic fracture in applications with moderate impact loading.

Hardness Verification

Hardness measurement after destabilisation is the primary process verification. Rockwell C (HRC) is the standard scale for high-chromium iron at its service hardness range. Measurements should be taken at multiple locations on the casting surface — at least the geometric extremes and centre of each major face — because surface hardness variation reveals temperature uniformity issues in the furnace or loading arrangement.

Comparison against a specification range (rather than a minimum only) is informative: consistently high hardness at the upper end of the range, combined with microstructure examination showing minimal retained austenite, indicates a well-controlled process. Results clustered at the lower end of the range warrant investigation of whether the destabilisation temperature is being consistently achieved.
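A rough screen of this kind can be automated, as sketched below. The "bottom third of the range" threshold and the majority rule are illustrative choices for this sketch, not an acceptance criterion from any standard; they simply flag the low-end clustering the text describes as worth investigating.

```python
def assess_hardness(readings_hrc, spec_min, spec_max):
    """Screen surface HRC readings against a specification range.

    Returns out-of-spec readings and a flag raised when more than half
    of the readings sit in the bottom third of the spec range — a
    possible sign of under-temperature destabilisation. Thresholds are
    illustrative only.
    """
    out_of_spec = [h for h in readings_hrc if not spec_min <= h <= spec_max]
    lower_third = spec_min + (spec_max - spec_min) / 3
    low_count = sum(h < lower_third for h in readings_hrc)
    return {
        "out_of_spec": out_of_spec,
        "investigate_low_cluster": low_count > len(readings_hrc) / 2,
    }

# Readings clustered at the low end of a 58-65 HRC spec:
print(assess_hardness([59.0, 60.0, 58.5, 59.5], 58, 65))
# Readings comfortably in the upper part of the range:
print(assess_hardness([63.0, 64.0, 62.0, 63.5], 58, 65))
```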

Metallographic examination — polished section etched with Vilella’s or similar reagent — provides direct evidence of the microstructure: the ratio of martensite to retained austenite in the matrix, the secondary carbide distribution, and the primary carbide morphology. This is the definitive check on process adequacy and is particularly valuable when hardness results are borderline or when a new alloy composition or component geometry is being processed for the first time.

Common Process Errors and Their Consequences

The most frequent error in destabilisation heat treatment of high-chromium iron is temperature undershoot — setting the furnace to the nominal destabilisation temperature without verifying that the casting itself reaches that temperature. For large castings in bogie hearth furnaces, the thermal mass of the charge and the furnace loading arrangement can result in casting temperatures significantly below the furnace set point for an extended period at the start of the hold cycle. The underlying cause is inadequate thermocouple instrumentation of the load itself — as opposed to relying on thermocouples reading the furnace atmosphere.

The second common error is insufficient hold time for section thickness. Thin test bars processed alongside production castings will show correct hardness even if the production castings have not reached temperature equilibrium through their full section. Process validation should always use thermocouples embedded in representative heavy-section locations of the actual casting geometry, not in free-hanging test specimens.

Temperature overshoot is less common but more damaging — partial dissolution of the primary carbide network cannot be reversed without re-melting the casting. Furnace temperature control system calibration and regular thermocouple survey of working zone uniformity are the practical controls against this failure mode.


Summary

Destabilisation heat treatment converts the as-cast microstructure of high-chromium white iron from a relatively soft austenitic or mixed condition to a hard martensitic matrix capable of fully supporting the abrasion-resistant carbide phase. The process is straightforward in principle but sensitive to temperature accuracy, holding time relative to section thickness, and cooling rate control. Hardness and microstructure verification after treatment are the necessary confirmation that the process has achieved its intended result — not an optional quality step.

For components where the full abrasion resistance of the alloy is required in service, the difference between a correctly destabilised casting and an inadequately treated one is not marginal. It is the difference between the material performing as specified and underperforming in a way that is indistinguishable to the eye but clearly evident in service life.


Related: Heat Treatment Capabilities · Crusher Hammers · Impact Plates · Manganese Grade Selection