Average Grade Is the Most Misleading Metric in Mining
Every mine trusts one number too much. At most sites, that number is the average grade.
It appears in almost every planning document, every board presentation, and every conversation between the processing superintendent and the mine planner. It is the number quoted when someone asks how the stockpile is positioned, what will feed the mill, and whether this month's production target is achievable.
The average grade is not wrong, exactly. It is a fiction of smoothness: a single figure that hides the enormous variability of the stockpile and averages away the very variables that drive a mine's profit and loss.
In a low-margin industry where a comfortable fiction can cost millions, relying on a smoothed-out average is a systemic risk.
The Illusion Averages Create
Averages become a trap. Consider a mine with 500,000 tonnes of material on the ROM pad at an average copper grade of 0.82%. That number is real; it comes from real sampling data. What it does not tell you is that 30% of that material grades 0.45%, another 20% grades above 1.1%, and a meaningful pocket near the eastern finger has a hardness index that will grind slower than anything the mill has seen this quarter.
The average conceals all of this. Every downstream decision (blending sequences, reagent targets, throughput forecasts) is made against an average grade that no single tonne of ore represents.
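A quick back-of-envelope check, using the figures above, shows just how little the average describes (a minimal Python sketch; treating 1.1% as representative of the high-grade fraction is our simplifying assumption):

```python
# Back-of-envelope check on the ROM pad example above.
TOTAL_TONNES = 500_000
AVG_GRADE = 0.82            # % Cu, the number everyone quotes

low_frac, low_grade = 0.30, 0.45    # 30% of the pad grades 0.45% Cu
high_frac, high_grade = 0.20, 1.10  # 20% grades above 1.1% (1.1 used as representative)
rest_frac = 1 - low_frac - high_frac

# What must the remaining 50% grade for the quoted average to hold?
rest_grade = (AVG_GRADE - low_frac * low_grade
              - high_frac * high_grade) / rest_frac
print(f"Remaining {rest_frac:.0%} ({rest_frac * TOTAL_TONNES:,.0f} t) "
      f"must average {rest_grade:.2f}% Cu")
# -> about 0.93% Cu: three very different populations sitting behind
#    one 'average' of 0.82% that describes none of them.
```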
The error compounds. The blending plan is built on the average. The flotation circuit is tuned to the average. The cyanide dosing is calibrated to the average. Then the actual feed arrives and it is not the average: it is real, variable, unpredictable material, and the circuit must absorb the gap. Sometimes it does. Often it does not.
Average grade is not what reaches the mill. It never was. It is a number that describes what the ore body looks like in aggregate, not what the plant will see tomorrow morning.
What gets hidden inside an average:
- within-stockpile variability in grade across lifts and bays
- hardness variation that directly affects grinding energy and throughput
- shifts in mineralogy that change how the ore behaves in flotation
- penalty elements (arsenic, antimony, bismuth) that may be concentrated in specific zones
- moisture and fines content that alter circuit stability
Each of these can independently cause a recovery shortfall, a reagent spike, or a throughput hit; together they can turn a well-planned production day into a firefight. This is why geometallurgical modelling software exists; grade alone is never the full story.
The Real Cost of Not Knowing
Ask a metallurgist what keeps them up at night, and they rarely say 'grade.' They say recovery. They say stability. They say the unexplained drop on the Wednesday afternoon shift that took three days to diagnose.
Most shortfall analyses identify the same cause: a change in the feed that no one had a way of anticipating. The material was harder than expected and the mill was grinding coarser than target. A parcel from another zone carried high silica and shifted the flotation kinetics. The grade was near the average. Everything else was not.
The response is predictable: operators recalibrate, metallurgists analyse and adjust reagent additions, and performance eventually stabilizes, usually just in time for the next unexplained deviation. This is not an operations failure; it is an information failure. The circuit was asked to absorb variability that no one could see, because the planning layer had smoothed it away.
Reagent overuse is the quiet version of the same problem. When the dosing target is set against an average grade but the actual feed changes continuously, operators protect recovery by dosing conservatively, which means overdosing. In a plant processing thousands of tonnes a day, that adds up to significant overconsumption. Operations that can see past the average have achieved reagent reductions of around thirty percent, and AI-assisted feed control supports more energy-efficient practice for the same reason. With better feed visibility, defensive dosing stops being necessary.
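To put rough numbers on the cost of defensive dosing, here is a back-of-envelope Python sketch; the parcel grades, dose ratio, and safety margin are all hypothetical:

```python
# Compare dosing to one conservative average-based setpoint against
# dosing each feed parcel to its predicted grade. Figures are hypothetical.

parcels = [0.45, 0.60, 0.82, 0.95, 1.10]  # predicted Cu grade (%) per parcel
tonnes_per_parcel = 1000

DOSE_PER_GRADE = 0.5   # kg reagent per tonne per % Cu (hypothetical ratio)
SAFETY_MARGIN = 1.10   # defensive dosing sized to cover the high-grade tail

# Defensive strategy: one setpoint that protects against near-worst-case feed.
defensive_rate = max(parcels) * DOSE_PER_GRADE * SAFETY_MARGIN
defensive_total = defensive_rate * tonnes_per_parcel * len(parcels)

# Feed-informed strategy: dose each parcel to its predicted grade.
informed_total = sum(g * DOSE_PER_GRADE * tonnes_per_parcel for g in parcels)

saving = 1 - informed_total / defensive_total
print(f"Defensive: {defensive_total:,.0f} kg, informed: {informed_total:,.0f} kg")
print(f"Reagent saving: {saving:.0%}")   # ~35% with these assumed numbers
```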
Stockpiles Represent the Largest Blind Spots
The ROM pad is where the problems with averages first appear. Long-term stockpiles are where their cost truly accumulates.
Most stockpile management runs on coarse assumptions: an average grade estimated from the original resource model, adjusted over time by sampling campaigns and surveys. A complete picture is extremely rare. Few operations know the spatial distribution of material within a stockpile, how properties vary across lifts, or which areas have already been reclaimed and which have been passed over.
The impact is more than an accounting nuisance. At one operation, building a block-level model of a long-term stockpile surfaced roughly 100,000 tonnes of previously written-off medium- and high-grade ore, worth about USD 11 million and available for reclaim. The ore was never lost; the model simply lacked the information to see it.
At Gold Fields' Cerro Corona operation, the move to confidence-based geometallurgical modelling has changed how the team works. Instead of a single expected grade, planners work with confidence intervals covering 70% of the stockpile volume. That makes planning a different conversation: instead of 'what do we expect', the question becomes 'what is the realistic range, and how do we build a blending strategy that performs across it'. The 0.5 to 2 FTE of time saved is significant, but the larger gain is making informed decisions instead of hopeful ones.
An expected grade does not tell you what could go wrong. A confidence interval tells you how wrong things could go, and by how much. That is the value worth planning against.
Block-level modelling changes what is possible. When a stockpile is represented as a spatial distribution of blocks, each carrying grade, hardness, mineralogy, and uncertainty estimates, rather than as a single average, the planning questions become much sharper. Which bays should we reclaim next to manage hardness while hitting grade targets? Where are the penalty-element hotspots, and how do we route around them? How soon does the model say we hit a grade cliff if we keep pulling from the current area?
These are questions mine planners already think about. Probabilistic ore stockpile management software, built from truck trip data, pit surveys, polygons, and sampling, makes them answerable without GPS on every truck or a perfectly instrumented site.
OreMax and DynaMax work with what your site already has: truck trip logs, dump records, pit surveys, and routine sampling. That data is structured into a spatial block model that tracks grade, hardness, mineralogy, and uncertainty down to the sub-stockpile level, giving planners something they can actually use to make a call.
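As an illustration of what a block-level record and a reclaim query might look like, here is a minimal Python sketch; the field names, thresholds, and values are assumptions for illustration, not NTWIST's actual data model:

```python
from dataclasses import dataclass

@dataclass
class StockpileBlock:
    """One spatial block in a stockpile model (illustrative fields)."""
    bay: str
    lift: int
    tonnes: float
    cu_grade: float        # expected Cu grade, %
    grade_p10: float       # low end of the grade confidence range, %
    grade_p90: float       # high end of the grade confidence range, %
    hardness_index: float  # relative grindability proxy
    as_ppm: float          # arsenic, a typical penalty element

blocks = [
    StockpileBlock("East-1", 1, 8000, 0.95, 0.81, 1.08, 1.3, 120.0),
    StockpileBlock("East-2", 2, 6000, 0.48, 0.40, 0.57, 0.9, 40.0),
    StockpileBlock("West-1", 1, 7000, 1.12, 0.95, 1.30, 1.6, 260.0),
]

# Planning query: blocks whose LOW-end grade still clears the floor,
# without tripping hardness or arsenic limits (thresholds hypothetical).
candidates = [
    b for b in blocks
    if b.grade_p10 >= 0.6 and b.hardness_index <= 1.4 and b.as_ppm <= 200
]
for b in candidates:
    print(f"Reclaim candidate: {b.bay} lift {b.lift}, "
          f"{b.cu_grade:.2f}% Cu ({b.grade_p10:.2f}-{b.grade_p90:.2f})")
```

Note that the query filters on the low end of the confidence range, not the expected grade: that is the practical difference between planning against a distribution and planning against an average.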
Catching Variability Before It Reaches the Plant
Even with good stockpile intelligence, some variability will always reach the plant. Ore is not a uniform material, and no model is perfect. The real question is how quickly you can detect the variability coming out of the pit, and how quickly you can respond before the plant is already struggling to cope.
The traditional answer is the lab assay. It is reliable and accurate, but slow. By the time a sample is collected, analysed, and reported back to the metallurgist, the material has moved several stages downstream. Decisions end up based on what the feed looked like two hours ago, not what it looks like now.
Edge-based AI mining optimization software takes a different approach. It connects to site-mounted cameras and environmental sensors at the points where material enters the plant (truck unloading, crusher feed) and characterizes the ore as it arrives, interpreting particle properties, texture, visual density, and ore type in real time. This is real-time ore forecasting: instead of waiting for a lab result, the system combines live characterization with historic trends to adjust dosing and feed rates as the material comes in.
Real-time optimization for plants in the mining industry has delivered meaningful improvements in operational deployments: better recovery during high-variability transitional feed periods, fewer manual overrides, and more consistent reagent use and target-mineral recovery. Operators shift from catching problems to preventing them.
The shift is small in scale but large in impact. The technology does not remove the human element; it improves what operators are told and when. Operators working with accurate, timely data consistently outperform those without it.
What sets mining apart from other process industries is the sheer unpredictability baked into every tonne of ore, and that is exactly where probabilistic forecasting earns its place. No model can predict the exact grade that will come off a specific bay on a specific day at a specific time.
What a probabilistic forecast can do is give the probable range and the confidence behind it. It shows the downside risks and the potential peaks. It separates riskier reclaim decisions from safer ones. That is enough for better decisions: not perfect ones, but clearly better.
Operations are often sceptical of probabilistic approaches, objecting that the predictions are not precise. The honest answer is that an average is not precise either; it is just overconfident. It offers a single number that conveys a certainty the geology cannot support, and in doing so it hides the very variability that determines the actual outcome.
A statement that the grade will likely fall between 0.74% and 0.91% with 90% probability is more useful than a bare average of 0.82% with no expression of uncertainty. An ore blending optimization strategy built on the former will be more robust than one built on the latter. Mining feed optimization based on distributions rather than averages is what keeps circuits stable when the ore changes - and the ore always changes.
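As an illustration of where such an interval can come from, here is a minimal Python sketch that derives a 90% range from simulated block grades; the data is synthetic, and the empirical-percentile method is just one of several an operation might use:

```python
import random
import statistics

random.seed(42)

# Synthetic stand-in for estimated block grades in one stockpile zone
# (% Cu). A real model would derive these from assays and geostatistics.
block_grades = [random.gauss(0.82, 0.05) for _ in range(500)]

mean = statistics.fmean(block_grades)
cuts = statistics.quantiles(block_grades, n=20)  # cut points in 5% steps
p05, p95 = cuts[0], cuts[-1]                     # central 90% interval

print(f"Average grade: {mean:.2f}% Cu")
print(f"90% interval:  {p05:.2f}% - {p95:.2f}% Cu")
# The bare average says ~0.82%; the interval tells the planner how far
# reality can drift while remaining fully consistent with the data.
```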
The Number Worth Trusting
Averages exist to simplify, and in mining, simplification hides things. The truth of the ore body, the stockpiles, and the geometallurgy does not fit behind one simple number, and that number, as the saying goes, is only true until the plant says otherwise.
The operations that will consistently outperform on recovery, cost, and stability are the ones that stop planning against averages and start planning against distributions. That means block-level stockpile models that carry grade, hardness, mineralogy, and uncertainty all the way from pit to plant. It means real-time ore characterization at the point of entry, so variability is seen and acted on immediately. It means confidence intervals in planning conversations instead of single-point estimates that imply a certainty no geological system can deliver.
The barrier to entry is not a complete system redesign. It is a different way of using existing information, better structured and better connected to the decisions at hand. The people, the equipment, the ore: it is all there. Connecting it is the final piece of the puzzle. Plan with uncertainty made explicit, not with averages. Plan against the number the plant will actually see, not the comfortable one. If your number is an average, you are planning to be surprised.
Experience the difference that confident planning brings. Schedule a demo with NTWIST to see how block-level stockpile intelligence and real-time ore characterization integrate with your operations from ROM pad to recovery.
Frequently Asked Questions
These are the questions we hear from mining teams when they begin to consider planning beyond averages.
Q: If the average grade is so misleading, why does the mining industry continue to use it?
A: The average grade endures because it is simple, and because it has been the industry standard long enough to become a habit. It is not that mining teams are unaware of variability; it is that the tools have been limited. Spreadsheets cannot track spatial distributions. Conventional ERP systems were not designed for probabilistic inventory models. Geometallurgical modelling software with block-level confidence intervals for operational use (not just resource estimation) has only recently become available. The standard was never correct; it was just the only option until recently.
Q: What is geometallurgical blending software and how does it support operations?
A: Geometallurgical blending software combines geological and metallurgical data, taking into account parameters such as grade, hardness, mineralogy, penalty elements, and recovery potential, and uses them to build feed and blending plans that optimize plant performance rather than just hitting a specified grade. Traditional blending plans optimize for grade alone; geometallurgical blending software optimizes for the economics, including throughput, recovery, and reagent cost. The result is a mill that is better prepared for what is coming and a blending strategy that is resilient to the uncertainty that is always present in a real stockpile.
Q: How does ore stockpile management software differ from a regular inventory system?
A: A standard inventory system only tells you how much material you have. Ore stockpile management software records that too, but it also maintains a spatial block model of each stockpile that carries grade, hardness, mineralogy, and uncertainty at the sub-stockpile level. The operational difference is between a system that records what went in and out, and one that tells you something meaningful about the material itself and what it will achieve when it is processed at the plant.
Q: How does ROM pad management software enhance plant stability?
A: Rather than relying on pad-level averages, ROM pad management software gives planners and metallurgists block-level visibility into what material is on the pad and where it sits. They can select reclaim areas deliberately, build short-term forecasts of feed properties, and flag high-hardness or high-penalty material before it is sent to the crusher. That upstream visibility improves circuit preparation, sharpens reagent addition, and reduces unexpected feed changes. The roughly 30% reagent savings seen at some operations are a direct result of this kind of feed predictability.
Q: How does real-time plant optimization function within a mineral processing facility?
A: In mineral processing, real-time plant optimization uses live plant measurements, soft sensors (such as a P80 estimated from proxy measurements), and feed characterization to dynamically suggest or implement control actions as setpoints that optimize throughput, recovery, and cost. It works at a higher level than the DCS and APC: it does not replace control loops; it directs them toward better objectives. The goal is to maximize the value of recovered metal per tonne processed within the circuit's constraints and operating boundaries. At multiple well-instrumented concentrator sites, this approach has delivered throughput and recovery gains worth USD 1-5 million per year.
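As a rough illustration of the advisory pattern described above (a sketch under assumed targets and limits, not NTWIST's actual algorithm), the snippet below reads a soft-sensor grind estimate and suggests a bounded feed-rate setpoint rather than writing directly to the control loop:

```python
# Illustrative advisory-mode optimizer: suggests a feed-rate setpoint
# from a P80 soft-sensor estimate, clamped to operating boundaries.
# The target, gain, and limits below are hypothetical.

P80_TARGET_UM = 150.0                         # desired grind size, microns
FEED_MIN_TPH, FEED_MAX_TPH = 800.0, 1200.0    # operating boundaries
GAIN_TPH_PER_UM = 2.0                         # reaction strength to grind error

def suggest_feed_setpoint(current_feed_tph: float, p80_estimate_um: float) -> float:
    """Advisory only: returns a suggested setpoint; the operator or
    APC layer decides whether to apply it."""
    error = p80_estimate_um - P80_TARGET_UM   # positive = grinding too coarse
    # Coarser-than-target grind -> slow the feed to restore grind size.
    suggestion = current_feed_tph - GAIN_TPH_PER_UM * error
    return max(FEED_MIN_TPH, min(FEED_MAX_TPH, suggestion))

# Example: the soft sensor says the mill is running coarse at 162 um.
print(suggest_feed_setpoint(current_feed_tph=1050.0, p80_estimate_um=162.0))
# -> 1026.0 tph suggested, staying within the defined boundaries.
```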
Q: Can AI mining optimization software work at operations without advanced instrumentation?
A: Yes, with appropriate scoping. OreMax and DynaMax can build probabilistic stockpile and block models from truck trip logs, periodic surveys, and dump records captured in spreadsheets, with no need for GPS on every truck. MillMax can start from historian extracts and run in advisory mode before any live DCS connection exists. The philosophy behind NTWIST's platform is to extract value from existing data from day one, without new data collection, and then evolve the solution as trust and integration mature. Mining industrial AI can deliver positive ROI without an advanced instrumentation program.
Q: What role do confidence intervals play in stockpile modelling?
A: Stockpile confidence intervals express a characteristic (grade, hardness, recovery) as a range with a stated confidence level rather than as a single expected value. Instead of saying a stockpile contains ore at 0.82% Cu, a probabilistic model would state a range such as 0.74 - 0.91% with 90% confidence. That range helps planners stress-test blending, prioritize reclaim decisions where uncertainty is high, and make choices that are robust across the whole range rather than optimal only at the central estimate. In life-of-mine planning, the same distinction determines how defensibly production estimates can be presented to management and investors.
Q: How do digital twins for ore blending reduce reagent costs?
A: A digital twin of the blending process lets planners simulate alternative reclaim and blending sequences and see their downstream impacts before committing material. Instead of discovering that a particular blend causes a flotation or leach chemistry problem once it is already in the circuit, the twin predicts the problem in advance: reagent targets can be set ahead of time, and dosing strategies adjusted before the feed arrives. Backed by a stockpile model that updates continuously and feeds a plant optimization layer, this converts the theoretical value of ore blending optimization into actual reagent savings and higher recovery.
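To illustrate the simulate-before-commit idea, here is a deliberately simplified stand-in for such a twin (grades, silica levels, and thresholds are hypothetical); it scores candidate reclaim sequences and flags chemistry risks before any material moves:

```python
# Simplified stand-in for a blending digital twin: evaluate candidate
# reclaim sequences before committing material to the circuit.

# Each parcel: (tonnes, Cu grade %, silica %)
PARCELS = {
    "A": (5000, 0.95, 8.0),
    "B": (5000, 0.50, 24.0),   # high silica: flotation chemistry risk
    "C": (5000, 0.80, 10.0),
}
SILICA_LIMIT = 15.0            # blend-level chemistry threshold

def blend_properties(sequence):
    """Tonnage-weighted grade and silica for a reclaim sequence."""
    tonnes = sum(PARCELS[p][0] for p in sequence)
    grade = sum(PARCELS[p][0] * PARCELS[p][1] for p in sequence) / tonnes
    silica = sum(PARCELS[p][0] * PARCELS[p][2] for p in sequence) / tonnes
    return grade, silica

for sequence in (["A", "B"], ["A", "C"], ["B", "C"]):
    grade, silica = blend_properties(sequence)
    flag = "OK" if silica <= SILICA_LIMIT else "CHEMISTRY RISK"
    print(f"{'+'.join(sequence)}: {grade:.2f}% Cu, {silica:.1f}% SiO2 [{flag}]")
# A planner (or optimizer) picks a sequence that hits grade targets
# without tripping the chemistry flag, and sets reagent dosing to match.
```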
Q: Can NTWIST’s method be used for both gold and copper?
A: Yes, with some differences in emphasis. In gold operations, the value sits in ROM pad management, oxide/sulphide blending, and cyanide optimization; in copper concentrators, it comes from grind control, flotation recovery, and penalty-element management. The same real-time optimization architecture applies to both. The MineMax suite is designed for gold, copper, nickel, and other hard-rock operations where processing mills are involved.
