Data assimilation in the minerals industry: Real-time updating of spatial models using online production data

Tom Wambeke

Research output: Thesis › Dissertation (TU Delft)


Abstract

Declining ore grades, extraction at greater depths and longer hauling distances put pressure on maturing mines. Not enough new mines will be commissioned on time to compensate for the resulting shortages. Ore-body replacement rates are relatively low due to a reduced appetite for exploration. Development times are generally increasing and most new projects are remote, possibly pushing costs further upwards.

To reverse these trends, the industry must collect, analyse and act on information to extract and process material more productively (i.e. maximize resource efficiency). This paradigm shift, driven by digital innovations, aims to (partly) eliminate the external variability that has made mining unique. The external variability results from the nature of the resource being mined. This type of variability can only be controlled if the resource base is sufficiently characterized and understood.

Recent developments in sensor technology enable the online characterization of raw material characteristics and equipment performance. To date, such measurements are mainly utilized in forward loops for downstream process control. A backward integration of sensor information into the resource model does not yet occur. Obviously, such a backward integration would significantly contribute to the progressive characterization of the resource base.

This dissertation presents a practical updating algorithm to continuously assimilate recently acquired data into an already existing resource model. The updating algorithm addresses the following practical considerations. (a) At each point in time, the latest solution implicitly accounts for all previously integrated data (sequential approach). During the next update, the already existing resource model is further adjusted to honour the newly obtained observations as well. (b) Due to the nature of a mining operation, it is nearly impossible to formulate closed-form analytical expressions describing the relationship between observations and resource blocks. Instead, the relevant relationships are inferred from the inputs (the resource model realizations) and outputs (distribution of predicted observations) of a forward simulator. (c) The updating algorithm is able to assimilate noisy observations made on a blend of material originating from multiple sources and locations. Differences in scale of support are dealt with automatically.

The developed algorithm integrates concepts from several existing (geo)statistical techniques. Co-kriging approaches, for example, are designed to integrate both direct and indirect measurements and are well suited to handling differences in accuracy and sampling volume. However, they fail to extract information from blended measurements and cannot sequentially incorporate new observations into an already existing resource model. To overcome the latter issue, the co-kriging equations are merged into a sequential linear estimator. Existing resource models can now be improved using a weighted sum of differences between observations and model-based predictions (forward simulator output). The covariances, necessary to compute the weights, are empirically derived from two sets of Monte Carlo samples (another statistical technique): the resource model realizations (forward simulator input) and the observation realizations (forward simulator output). This approach removes the need to formulate analytical functions modelling spatial correlations, blending and differences in scale of support.
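
As an illustration of this estimator, the sketch below (Python with NumPy, written for this summary and not taken from the dissertation's code) shows how the weights can be computed from empirical covariances lifted from the two Monte Carlo sample sets; the function name, array shapes and the observation-perturbation step are illustrative assumptions.

    import numpy as np

    def sequential_update(X, D, d_obs, R):
        """Ensemble-based sequential linear update (illustrative sketch).

        X     : (n_blocks, n_real) prior resource model realizations
        D     : (n_obs, n_real)    predicted observations (forward simulator output)
        d_obs : (n_obs,)           actual sensor observations
        R     : (n_obs, n_obs)     measurement-error covariance (sensor precision)
        Returns the updated (n_blocks, n_real) ensemble.
        """
        n_obs, n_real = D.shape
        rng = np.random.default_rng()
        # Empirical (cross-)covariances derived from the two realization sets
        dX = X - X.mean(axis=1, keepdims=True)
        dD = D - D.mean(axis=1, keepdims=True)
        C_xd = dX @ dD.T / (n_real - 1)        # block-to-observation covariance
        C_dd = dD @ dD.T / (n_real - 1)        # observation-to-observation covariance
        # Weights of the sequential linear estimator
        W = C_xd @ np.linalg.inv(C_dd + R)
        # Perturbed observations (a common ensemble device to preserve posterior spread)
        d_pert = d_obs[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_real).T
        # Weighted sum of differences between observations and model-based predictions
        return X + W @ (d_pert - D)

In line with the sequential approach described above, the updated ensemble then serves as the prior for the next batch of observations.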

The resulting mathematical framework bears some resemblance to that of a dynamic filter (the Ensemble Kalman filter) used in other research areas, although the underlying philosophy differs significantly. Weather forecasting and reservoir modelling, for example, consider dynamic systems repetitively sampled at the same locations. Each observation characterizes a volume surrounding the sample location. Mineral resource modelling, on the other hand, focuses on static systems gradually sampled at different locations. Each observation is characteristic of a blend of material originating from multiple sources and locations. Each part of the material stream is sampled only once, the moment it passes the sensor.

Various options are implemented around the mathematical framework to reduce computation time, memory requirements or numerical inaccuracies. (a) A Gaussian anamorphosis is included to deal with suboptimal conditions related to non-Gaussian distributions. The algorithm structure ensures that the sensor precision (measurement error) can be defined in its original units and does not need to be translated into a normal score equivalent. (b) An interconnected parallel updating sequence (double helix) can be configured to avoid a covariance collapse (filter inbreeding), which occurs as degrees of freedom are lost over time due to the empirical calculation of the covariances. (c) A neighbourhood option is implemented to constrain computation time and memory requirements. Different neighbourhoods need to be considered simultaneously as material streams are blended. (d) Two covariance correction options are implemented to further inhibit the propagation of statistical sampling errors originating from the empirical computation of covariances.
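
As a rough illustration of option (a), the snippet below (an assumed, simplified Python sketch, not the thesis implementation) shows an empirical normal score transform and its back-transform, the basic ingredients of a Gaussian anamorphosis; the helper names and the quantile-based back-transform are hypothetical choices.

    import numpy as np
    from scipy.stats import norm, rankdata

    def to_normal_scores(values):
        """Map values to standard-normal scores via their empirical ranks."""
        u = (rankdata(values) - 0.5) / len(values)    # empirical CDF in (0, 1)
        return norm.ppf(u)

    def from_normal_scores(scores, reference):
        """Back-transform normal scores using the reference distribution's quantiles."""
        return np.quantile(reference, norm.cdf(scores))

    # Example: transform one realization of block values and back again
    blocks = np.random.lognormal(mean=1.0, sigma=0.5, size=1000)
    z = to_normal_scores(blocks)                      # approximately standard normal
    restored = from_normal_scores(z, blocks)          # approximately the original values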

A case-specific forward simulator is built and run in parallel to the more generally applicable updating code. The forward simulator is used to translate resource model realizations (input) into observation realizations (output). Empirical covariances are subsequently lifted from both realization sets and mathematically describe the link between sensor observations and individual blocks in the model. This numerical inference avoids the cumbersome task of formulating, linearising and inverting an analytical forward observation model. The use of a forward simulator further ensures that the distribution of the Monte Carlo samples already reflects the support of the random variables concerned. As a result, the necessary covariances, derived from these Monte Carlo samples, inherently account for differences in scale of support.
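
Purely to illustrate this input-output coupling, the toy simulator below (a hypothetical Python sketch; the case-specific simulator of the dissertation additionally tracks material through the operation) maps block realizations to realizations of a single blended observation. Feeding its output, together with the model realizations, into an update of the kind sketched earlier closes the loop numerically, without any analytical observation model.

    import numpy as np

    def forward_simulate(X, blocks_a, blocks_b, rate_a=0.7, rate_b=0.3):
        """Translate model realizations X (n_blocks, n_real) into predicted
        observations (1, n_real) of a stream blended from two extraction points."""
        grade_a = X[blocks_a, :].mean(axis=0)     # average grade delivered by source A
        grade_b = X[blocks_b, :].mean(axis=0)     # average grade delivered by source B
        return (rate_a * grade_a + rate_b * grade_b)[None, :]

    # Example: 100 blocks, 200 realizations, two sources feeding the sensor
    X = np.random.normal(loc=2.0, scale=0.4, size=(100, 200))
    D = forward_simulate(X, blocks_a=np.arange(0, 5), blocks_b=np.arange(50, 53))

Because the simulator averages over groups of blocks, the resulting observation realizations already live on the support of the sensor measurement, which is why covariances computed from them account for the scale difference automatically.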

A synthetic experiment is conducted to demonstrate that the algorithm is capable of assimilating inaccurate observations, made on blended material streams, into an already existing resource model. The experiment is executed in an artificial environment representing a mining operation with two extraction points of unequal production rate. A visual inspection of cross-sections shows that the model converges towards the “true but unknown reality”. Global assessment statistics quantitatively confirm this observation. Local assessment statistics further indicate that the global improvements mainly result from correcting local estimation biases.

Another 125 artificial experiments are conducted to study the effects of variations in measurement volume, blending ratio and sensor precision. The experiments investigate whether and how the resource model and the predicted observations improve over time. Based on the outcomes, recommendations are formulated for optimally designing and operating a monitoring system.

This work further describes the pilot testing of the updating algorithm at the Tropicana Gold Mine (Australia). The pilot aims to evaluate whether the updating algorithm can automatically reconcile ball mill performance data against the spatial Work Index estimates of the GeoMet model. The focus lies on the ball mill since it is usually the single largest energy consumer at the mine site. The spatial Work Index estimates are used to predict the ball mill’s throughput. In order to maximize mill throughput and optimize energy utilization, it is important to get the Work Index estimates right. At the Tropicana Gold Mine, Work Index estimates, derived from X-Ray Fluorescence and Hyperspectral scanning of grade control samples, are used to construct spatial GeoMetallurgical (GeoMet) models. Inaccuracies in the block estimates exist due to limited calibration between grade-control-derived and laboratory Work Index values. To improve the calibration, the updating algorithm was tested at the mine during a pilot study. Deviations between predicted and actual mill performance are monitored and used to locally improve the Work Index estimates in the GeoMet model. While assimilating about a week of mill performance data, the spatial GeoMet model converged towards a previously unknown reality. The updating algorithm improved the spatial Work Index estimates, resulting in a real-time reconciliation of already extracted blocks and a recalibration of blocks scheduled for future extraction. The case study shows that historic and future production estimates improve on average by about 72% and 26%, respectively.

Original language: English
Qualification: Doctor of Philosophy
Awarding Institution:
  • Delft University of Technology
Supervisors/Advisors:
  • Jansen, J.D., Supervisor
  • Benndorf, Jörg, Supervisor
Award date: 19 Mar 2018
Print ISBNs: 978-94-6186-904-3
DOIs
Publication status: Published - 2018

Keywords

  • Geostatistics
  • Data assimilation
  • Geometallurgy
  • Resource engineering
  • Mining
  • Discrete event simulation
  • Material tracking
