Footprint Finder Tool For Sublevel Caving
Abstract
This paper discusses a new software tool developed specifically for use on sublevel caving projects and mines. The
tool combines a number of new algorithms and technologies which together provide a very efficient way
to evaluate a complete sublevel layout in a matter of seconds.
The core technologies embedded in the tool are simulation of material mixing for dilution forecasting using a
Template Mixing algorithm, optimisation of the draw factors (extraction percentages) per ring, and rapid
sequencing and scheduling of the resulting ore tonnages.
Examples are presented showing how the rapid analysis can allow sensitivity studies on tunnel spacing,
mining rate and economics, face shapes and draw or extraction strategies. A real-world example from the
De Beers Venetia Underground project is presented.
Concluding remarks will discuss how this new tool fits in with the existing industry tools for open pit, open
stope, block cave and sublevel cave operations optimisation.
Keywords: orebody evaluation, SLC mining method evaluation, SLC footprint assessment
1 Introduction
Mining engineers within mining companies and consulting groups are continually looking for rapid ways to
evaluate projects for different mining methods. This is demonstrated by the success of existing tools within
the industry such as open pit optimisers, stope optimisers and the Footprint Finder tool within GEOVIA PCBC
(Isabel 2013). Each of these tools identifies a volume of rock to be mined and a sequence in
which the blocks within that volume will be mined. By applying a mining rate, an approximate
production schedule can then be evaluated.
To date, very little has been published on such a tool applied to sublevel cave (SLC) mining projects. Many
consultants will have used Excel sheets or adaptations of other optimisation tools. However, for caving
operations, the addition of significant dilution, which is strongly affected by mining sequence, complicates the
process. This paper describes a new tool, Footprint Finder for SLC (FFSLC), which tackles this problem.
For SLC, the dilution calculation process is made particularly complicated by the fact that the material
extraction history from one level affects the material available in future deeper levels. Thus, the overall
problem is quite non-linear and not well-suited to solutions using linear programming technologies. With
open pits and open stopes, the usual assumption is that 100% of the blasted material will be mined (as closely
as possible, allowing for relatively low levels of dilution). However, with SLC, the fractions mined
(referred to in this paper as extraction percentages) vary from as low as 40% to as high as 500% in
some cases. In general, low extraction percentages are used near the top of the orebody in order to establish
the caving process and limit early entry of dilution while higher extraction percentages are used deeper down
to attempt to improve overall recovery of the orebody and any overlying material with reasonable grades.
This adds another level of complexity in the overall sequencing and scheduling process, namely the
estimation of the extraction percent to apply to each mining point (ring).
In most other mining methods, the detailed mining geometry (e.g. toes and crests in an open pit or draw
points in a block cave) is simplified down to the optimisation or processing of a regular block model.
A similar approach has been adopted for FFSLC. Material attributes within each block are used as inputs and
a relatively simple process can be used to estimate an undiscounted dollar value for each of these blocks.
What one then seeks is an improved or potentially optimal way in which to mine the blocks (or fractions of
blocks). However, in any realistic evaluation, a large number of sensitivity runs are required to assess mining
economics, mining sequence, production rate, etc., so set-up time and execution speed are very important.
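As an illustration of the kind of per-block value calculation involved, a minimal Python sketch is given below. The field names, diamond price and cost figures are assumptions chosen for illustration only; they are not the FFSLC value model.

# Minimal sketch of an undiscounted per-block dollar value estimate.
# All field names, prices and costs are illustrative assumptions, not
# values or an API taken from FFSLC / GEOVIA PCBC.
from dataclasses import dataclass

@dataclass
class Block:
    tonnes: float        # tonnes of rock in the block
    grade_cpht: float    # grade in carats per hundred tonnes

def block_value(block: Block,
                price_per_carat: float = 150.0,   # assumed $/carat
                mining_cost: float = 20.0,        # assumed $/tonne
                processing_cost: float = 12.0) -> float:
    # Undiscounted dollar value of one block: revenue minus costs.
    carats = block.tonnes * block.grade_cpht / 100.0
    revenue = carats * price_per_carat
    costs = block.tonnes * (mining_cost + processing_cost)
    return revenue - costs

# Example: a 10,000 t block at 50 cpht.
print(block_value(Block(tonnes=10_000, grade_cpht=50.0)))

Summed over the blocks (or fractions of blocks) assigned to each ring, a value of this kind forms the quantity that the extraction percentage optimisation described later attempts to maximise.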
Preparation of mining units is relatively simple but does need to take account of some complicating factors
such as variable mining level spacing (e.g. 25 m for five levels and then 30 m for the remaining deeper levels),
separate mining of development and production tonnes (from tunnels and rings respectively), internal waste,
ore blankets, overlying dilution or allowance for failure of open pit material and others.
The percent frozen variable 'freezes' a portion of a slice, preventing it from being mined as primary recovery;
a small part of the frozen portion may, however, be 'eroded'. This is demonstrated in Figure 1, column 1,
for slice 1 (at the bottom). Increasing the extraction percent from 1 (100%) to 1.4 (140%) (Figure 2) raises the
primary recovery only from 48 to 49%, while the simulated dilution jumps from 19 to 40%. What this shows,
and what has been demonstrated in marker studies and real mining, is that mining much more from a ring
does not necessarily result in extracting much more ore (Power 2004).
Figure 2 shows the results of increasing the extraction percent for slices 1 and 2 from 100 to 120%, and 140%
respectively. Overall recovery of the target slices (ore) increases, but so does the total simulated dilution
(slice 1).
The above and related testing showed that the TM1D algorithm was performing very well, giving results
that agreed with typical marker behaviour and that conserved material.
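To make this behaviour concrete, the following greatly simplified single-column sketch (in Python) is offered. It is not the TM1D algorithm itself, and the frozen and erosion fractions are assumed values; it only illustrates why drawing much more from a ring mainly adds dilution rather than ore.

# Greatly simplified 1D draw-column illustration (NOT the TM1D algorithm):
# each slice has a frozen fraction unavailable to primary recovery, and any
# tonnes drawn beyond the available portion of the target slice are pulled
# from the slices above. All fractions and tonnages are assumed values.
def draw_slice(column, target, extraction_pct,
               frozen_frac=0.5, erosion_frac=0.1):
    # column: list of remaining tonnes per slice, index 0 = bottom slice.
    slice_tonnes = column[target]
    demand = extraction_pct * slice_tonnes
    # Available from the target slice: the unfrozen portion plus a small
    # 'eroded' part of the frozen portion.
    available = slice_tonnes * (1.0 - frozen_frac) \
              + slice_tonnes * frozen_frac * erosion_frac
    ore = min(demand, available)
    column[target] -= ore
    # Remaining demand is met from overlying slices, treated here as dilution.
    dilution = 0.0
    shortfall = demand - ore
    for i in range(target + 1, len(column)):
        if shortfall <= 0:
            break
        take = min(shortfall, column[i])
        column[i] -= take
        dilution += take
        shortfall -= take
    return ore, dilution

# Five 1,000 t slices; compare drawing the bottom slice at 100% versus 140%.
for e in (1.0, 1.4):
    col = [1000.0] * 5
    ore, dil = draw_slice(col, 0, e)
    print(f"extraction {e:.0%}: ore {ore:.0f} t, dilution {dil:.0f} t")

With these assumed fractions, raising the draw from 100% to 140% adds only dilution, echoing the marker-trial behaviour noted above.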
What values of extraction percent will yield the maximum total dollar value?
Drawing on an informal background in non-linear optimisation techniques, the following overall approach was
adopted for each vertical column:
Start from an initial estimate for Ei for each slice i.
Make a new guess by changing one of the Ei estimates. The choice of i and whether to increase or
decrease the value can be done in a pseudo random manner. Don’t make changes outside of the
specified constraints.
Re-evaluate the mixing array based on the updated Ei values and then evaluate the updated dollar
value. Keep the new estimate if this is better than the previous estimate.
Continue the above for the number of iterations set by the user (typically 500 to 10,000).
Periodically introduce a much larger change to an Ei value. This is loosely equivalent to the
'mutation' used in genetic algorithms, and is aimed at checking whether a better value can be
obtained by nudging the solution away from a local maximum and onto a different 'hill' with a
larger value.
There are no mathematical guarantees that the above algorithm will find the global solution for every
column, but all work to date has shown that it performs well. In general, if a user suspects that a
better strategy might be available, this can be tested by changing the initial estimate. For example, if
one run starts from all minimum values and a second from all maximum values and both converge to the same
result, that result is likely the true maximum.
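A minimal Python sketch of this per-column search is given below. The evaluate function is a stand-in for the full mixing-plus-valuation step, and the bounds, step size and mutation frequency are assumed values for illustration; this is not the FFSLC implementation.

# Sketch of the per-column search described above: random hill climbing over
# the extraction percentages E_i with occasional large 'mutation' steps.
# evaluate() stands in for the mixing simulation and dollar valuation.
import random

def optimise_column(evaluate, n_slices, e_min=0.4, e_max=5.0,
                    iterations=2000, step=0.05, mutation_every=100):
    best = [1.0] * n_slices                     # initial estimate: 100% per slice
    best_value = evaluate(best)
    for it in range(1, iterations + 1):
        trial = best[:]
        i = random.randrange(n_slices)          # pick one E_i pseudo-randomly
        if it % mutation_every == 0:
            # Occasional large change ('mutation') to escape a local maximum.
            trial[i] = random.uniform(e_min, e_max)
        else:
            trial[i] += random.choice((-step, step))
        trial[i] = min(max(trial[i], e_min), e_max)   # respect the constraints
        value = evaluate(trial)
        if value > best_value:                  # keep the new estimate if better
            best, best_value = trial, value
    return best, best_value

# Toy evaluation: rewards total draw but penalises drawing far past each slice.
def toy_value(e):
    return sum(e) - sum(5.0 * max(x - 1.2, 0.0) ** 2 for x in e)

print(optimise_column(toy_value, n_slices=6))

Restarting from a different initial estimate, as suggested above, is simply a matter of changing the starting vector.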
An early example showing how the extraction percent profile improves or changes versus dollar value is
shown in Figure 4.
Figure 4 Extraction percent profile and dollar value improvement with iterations
The tool also supports a number of related analyses, including:
Mining shut-off versus total shut-off for footprint delineation/extraction percent calculations.
Simulation of development tonnages pre-mined from development tunnels.
Simulation of various horizontal tunnel spacings.
Limiting maximum mining levels or sectors.
Changing strategies from ore recovery to time of recovery to dilution management.
Figure 6 Variability of extraction percent profiles with changing dilution and discounting assumptions
5 Case studies
Unfortunately, obtaining real data on which to conduct case studies is becoming ever more difficult as
mining companies seek to keep data confidential and restrict general publication. For this reason, detailed results
are limited. To date, however, the tool has been used on around 20 different projects at different levels of
complexity, and this has exposed it to a range of geometries and mining conditions: from scattered orebodies,
to near-vertical and inclined deposits with marginal economics, to the complete mining history of finished projects.
The following examples are presented here:
Pocket Aces (fictitious simulated diamond deposit).
Walker Bay (fictitious simulated Au/Cu deposit).
Venetia Mine underground project.
Figure 7 Six levels and overall layout for Pocket Aces example
Extraction percentages are then optimised (in less than 1 min in this example) to give an alternative
extraction scenario. The results are shown in Figure 8.
Figure 8(a) shows the in situ tonnes without mixing: 14.6 Mt at 50.24 carats per hundred tonnes (cpht).
Figure 8(b) shows the results with one iteration (not optimised) using the simple constant extraction
percentages typically used in current projects. This shows 13.6 Mt at 47.91 cpht. Figure 8(c) shows the results
with 300 iterations to optimise each vertical column resulting in 18.7 Mt at 44.34 cpht. The simulated
discounted dollar value for the two runs goes from $696 M to $820 M. Although somewhat artificial, the
example does clearly show that the optimisation process is effective.
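For reference, the contained carats follow directly from the cpht grade (carats per hundred tonnes); the short check below simply applies this to the tonnages and grades quoted above.

# Contained carats = tonnes x cpht / 100, using the figures quoted above.
for label, mt, cpht in [("in situ", 14.6, 50.24),
                        ("1 iteration", 13.6, 47.91),
                        ("300 iterations", 18.7, 44.34)]:
    carats_m = mt * cpht / 100.0    # Mt x cpht / 100 = millions of carats
    print(f"{label}: {carats_m:.2f} M carats")

The optimised run thus recovers roughly 8.3 M carats against about 6.5 M carats for the single-iteration case, consistent with the increase in simulated value.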
Figure 9 Extraction percentage of blocks from the block model before and after optimisation
The optimised case increased in value by nearly 6% (Table 1), while also increasing the average gold and
copper grade by 4.64% and 1.68% respectively.
Table 1 Tons and grade comparison for base and optimised cases
During 2017, the Venetia underground project undertook a study to optimise the production output from
K01 and K02. The objectives were to optimise the ring extraction percentages for K01 and K02 and to optimise
the K02 layout.
Although the focus of this case study is on the K02 orebody, Footprint Finder was also used effectively on K01.
There, the optimisation focused on the local extraction percentages where they were applied, and this resulted
in a 3% improvement in recovered carats.
The K02 orebody optimisation process was driven by a strategic initiative to support the transition from open
pit to underground mining by accelerating K02 production. An iterative optimisation process was developed
as follows:
Define extraction strategy.
Model extraction scenarios.
Evaluate results and draw conclusions.
The next steps were to use the SLC Footprint Finder on the K02 orebody to test and compare the results from
the iterative process described above.
The base case scenario for K02 was an eight-level SLC with a high extraction percentage applied to lower
levels to draw the 200 m lift cave.
Research (Villa 2012) shows that a recovery curve can be used to calibrate production scheduling in an SLC.
The K02 optimisation used a recovery curve focused on post-primary recovery to identify the areas where
high or low extraction needs to be applied. Villa (2012) proposed that primary, secondary, tertiary,
quaternary and quintenary recovery levels are represented as in Figure 11. The figure shows that 40–50% of
the extracted tonnes in a ring could come from levels above the current mining horizon.
Various extraction scenarios were created using the grade zone to determine what the ideal extraction
percentage should be depending on whether the post-primary recovery material is zero grade, low grade,
medium grade or high grade. The resulting rule was that if higher grade material was available for
post-primary recovery, the extraction percentage was increased; if lower grade material was available,
the extraction percentage was decreased.
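A minimal sketch of this type of rule is given below; the grade-zone labels, base extraction and adjustment amounts are illustrative assumptions, not the values used in the Venetia study.

# Illustrative grade-zone rule (assumed categories and adjustments): raise the
# ring extraction percentage when the expected post-primary recovery material
# is higher grade, and lower it when that material is low or zero grade.
BASE_EXTRACTION = 1.0            # 100% starting point (assumed)

ADJUSTMENT = {                   # assumed adjustments per grade zone
    "zero":   -0.4,
    "low":    -0.2,
    "medium":  0.0,
    "high":   +0.4,
}

def ring_extraction(post_primary_zone, lo=0.4, hi=2.0):
    # Extraction percent for a ring, given the grade zone of the material
    # expected to report as post-primary recovery; clamped to assumed bounds.
    e = BASE_EXTRACTION + ADJUSTMENT[post_primary_zone]
    return min(max(e, lo), hi)

for zone in ("zero", "low", "medium", "high"):
    print(zone, ring_extraction(zone))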
It was concluded that scenario C01 was the best scenario to apply due to its improved NPV and recovered
grade. Improved results were achieved when the LG zones were excluded from the plan; where planned
rings were located below an LG zone, the results showed that their extraction percentages should be
decreased. Scenario A04 had a higher capital requirement because the number of levels increased
significantly, and it showed no operating cost benefit from the high extraction percentage.
The total duration of the project was three weeks.
5.3.4 Test and compare results from iterative process using Footprint Finder
With a good understanding of the resource, it was very easy to set up Footprint Finder to start the
optimisation. The setup process described in Section 1.1 was completed and the first evaluation run was
carried out. A couple of exploratory evaluation runs were then completed to test the functionality and the
sensitivity to the inputs. The revenue parameter had the greatest impact; usefully, a negative revenue could
be applied to dilution to penalise dilution entry, which has a major impact on processing plant capacity.
The optimisation quickly started to plateau and it was found that around 1,000 iterations were enough for
the K02 evaluation.
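The effect of applying a negative revenue to dilution can be sketched as follows; the dollar figures and function below are assumptions for illustration, not FFSLC parameters.

# Sketch of a negative revenue on dilution tonnes, so that dilution entry is
# penalised in the value being maximised. The $/t figures are assumed.
def ring_value(ore_tonnes, ore_grade_cpht, dilution_tonnes,
               price_per_carat=150.0,       # assumed $/carat
               dilution_revenue=-15.0):     # assumed negative $/t on dilution
    ore_revenue = ore_tonnes * ore_grade_cpht / 100.0 * price_per_carat
    return ore_revenue + dilution_tonnes * dilution_revenue

# A ring that drags in much more dilution gains little ore revenue and loses
# overall value once the penalty is applied.
print(ring_value(800, 50.0, 200))
print(ring_value(850, 50.0, 600))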
The flexible range of extraction percentages that can be applied to the planned rings allows extraction to be
limited to just removing swell, or even set to zero. The process quickly identified
areas of the orebody that did not contribute to the NPV and could effectively be removed from the mine
plan. Comparing the results from Footprint Finder with those of the long iterative process, the general layout
of the mining level was similar, so the C01 layout was kept unchanged to continue the optimisation process.
The next step in the process was to limit Footprint Finder to only the targeted areas. By excluding all material
outside of the layout, the expected extracted tonnage profile was more accurate. During this optimisation
run, the number of iterations was increased to 10,000 and the incremental change was reduced from 5% to 1%.
This increased the running time but the optimisation results were more accurate.
Figure 14 shows the rings on the horizontal axis and the extraction percentages on the vertical axis.
The comparison clearly shows where there was under-extraction in high-grade areas and over-extraction
in lower-grade areas. These findings highlighted the importance of setting a maximum limit, as the
algorithm will otherwise continue mining while ignoring over-extraction risks such as creating an air gap.
Using a global extraction percentage (per level) is not optimal; it is important to apply local extraction
percentages (per ring), as this ensures an improved value proposition for the orebody.
The objective to optimise the ring extraction percentages was achieved and the result was a 13% increase in
recovered carats from the K02 orebody.
The financial impact of this increase is clearly shown in Figure 15 where the NPV of the Footprint Finder
scenario exceeds all previous scenarios. The optimisation delivered the best results in operating cost, grade,
capital efficiency and dilution.
Figure 15 Financial analysis for iterative optimisation process, including the Footprint Finder result
The most noticeable improvement compared to the iterative process is the duration of the evaluation:
the full evaluation took two days, and this included training, bug fixes and multiple runs with multiple
parameters.
6 Concluding remarks
Footprint Finder is a quick and effective tool for evaluating the sublevel cave mining method.
This tool can be rolled out at any level of study and operation. In early study phases, it can evaluate the
expected tonnage and grade profile by only using the block model. As an optimisation tool, it can be used in
later study phases to optimise layout and extraction percentages. By using the past tonnes function, the
optimisation can also be applied to any operating sublevel cave.
The purpose of an optimisation tool is to help the engineer, or planner, select the best input to produce a
detailed plan. An optimisation tool should be able to derive a solution quickly and easily. The processing time
and total duration of the Footprint Finder project show an effective tool that produces quality results in a
fast and repeatable process.
The new Footprint Finder tool for sublevel caving fills an important gap not covered by other tools such as
open pit, block caving and stope optimisers.
Acknowledgement
The authors acknowledge and thank Dassault Systèmes, Canada and De Beers Consolidated Mines for their
financial assistance in conducting the research and development that led to FFSLC. We also
thank De Beers Venetia mine for permission to use selected results in this paper.
References
Diering, T 2007, ‘Template mixing: an alternative depletion engine for block scheduling’, Proceedings of the 33rd International
Symposium on the Application of Computers and Operations Research in the Mineral Industry, Gecamin Ltda, Santiago,
pp. 313–320.
Isabel, A 2013, Efficient Evaluation of Block Cave Footprints for A Range of Elevations, Gemcom white paper, Gemcom Software
International Inc., Vancouver.
Power, G 2004, ‘Full scale SLC draw trials at Ridgeway Gold Mine’, in A Karzulovic & MA Alfaro (eds), Proceedings of MassMin 2004,
Instituto de Ingenieros de Chile, Santiago, pp. 225–230.
Villa, D 2012, Calibration of a Mixing Model for Sublevel Caving, MSc thesis, The University of British Columbia, Vancouver.