Impact case study database
- Submitting institution
- University of Durham
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Technological
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
IBEX Innovations is an X-ray imaging technology firm “dedicated to developing innovative X-ray imaging solutions since 2014”. Durham University collaborated with IBEX Innovations to develop methods that extend their technology portfolio and underpin their software-only solution ‘Trueview®’ and their patented multi-absorption-plate (MAP) technology. The project showed tangible benefits in several applications, which enabled IBEX to develop products for quality control, security and, most importantly, medical markets. They have raised GBP6,000,000 in funding for product development and have secured 14 jobs in the North East of England. Their technology is now incorporated into CurveBeam products, is being incorporated into Planmed products, and further agreements with other Original Equipment Manufacturers (OEMs) are in development.
2. Underpinning research
Bayesian statistics is a powerful tool to produce novel technology in industry. In this project, a number of approaches developed at Durham University were used for applications in the X-ray industry. The methods used are best split into two areas of Durham’s research: i) Bayes linear methods [ R1], which were used to extract material information from IBEX’s patented multi absorption plate (MAP) technology and develop an anomaly detection algorithm, and ii) their application to the analysis of complex computer models [ R2-R5], which were used to a) build fast approximations (emulators) of an expensive simulator, b) solve the inverse problem of finding a patient’s material composition from an X-ray image and c) make inferences in the real world using complex computer models to improve image quality and infer a patient’s material composition.
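At the core of the Bayes linear methods in [R1] is the adjustment of prior expectations and variances by observed data. As an indicative outline (our notation, not IBEX's implementation), for a vector of quantities of interest B (e.g. material thicknesses) and data D (e.g. detector readings):

```latex
% Bayes linear adjusted expectation and variance of B given data D (see [R1]); notation ours.
\mathrm{E}_{D}(B)   = \mathrm{E}(B) + \mathrm{Cov}(B,D)\,\mathrm{Var}(D)^{-1}\bigl(D - \mathrm{E}(D)\bigr)
\mathrm{Var}_{D}(B) = \mathrm{Var}(B) - \mathrm{Cov}(B,D)\,\mathrm{Var}(D)^{-1}\,\mathrm{Cov}(D,B)
```

Only expectations, variances and covariances need to be specified, which is what makes the approach tractable for the high-dimensional emulation and inverse problems described below.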
It is often of interest to make inferences about the material composition (thickness and alloy) of an object being examined using X-ray imaging technology. In conventional X-ray imaging approaches, two different materials can look identical in the X-ray image. For example, a thick piece of aluminium can look identical to a thin piece of lead, meaning the problem of material inference is ill-posed. The MAP, developed by IBEX, helps resolve this problem by adding multi-spectral information into the X-ray image. Durham University, in collaboration with IBEX, developed novel approaches which, in conjunction with Durham research into Bayes linear statistics [R1] and second-order exchangeability [R6], resulted in the contaminant detection method. This method enabled different materials (and alloys of two materials) to be differentiated at any thickness, greatly enhancing the applicability of IBEX's MAP [E9].
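A schematic way to see the ill-posedness, and how multi-spectral information resolves it, is sketched below; this is textbook attenuation physics, not a description of IBEX's proprietary MAP processing.

```latex
% A single (effectively monochromatic) measurement constrains only the product mu * t:
I / I_0 = \exp\!\bigl(-\mu(E)\,t\bigr)
% so a thick, weakly attenuating material and a thin, strongly attenuating one can be
% indistinguishable.  With the MAP, different detector regions see differently filtered
% spectra S_k(E), giving several measurements per location that jointly constrain the
% material (through mu(E)) and the thickness t:
I_k / I_{0,k} = \int S_k(E)\,\exp\!\bigl(-\mu(E)\,t\bigr)\,\mathrm{d}E, \qquad k = 1,\dots,K
```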
When X-rays travel through an object, they can either i) traverse the object unperturbed, ii) be absorbed or iii) scatter. Scatter is a form of spatially correlated noise which adds a deformed projection of the object onto the image, degrading image quality. To reduce scatter in clinical X-ray images, an anti-scatter grid (ASG) is added. This engineering solution preferentially attenuates scattered photons, reducing the scatter in an image. However, some unperturbed photons are also blocked, meaning the dose to the patient must be increased to compensate.

A method to mathematically remove scatter, termed the scatter reassignment method (SRM), was developed using: i) Bayes linear emulation [R3-R5], to match a simulation of an X-ray image to a real-world X-ray image. The simulator used by IBEX took days to run; in contrast, our Bayes linear emulator approximates the simulator and enabled IBEX to make this process much faster, taking around 6 seconds to run. ii) Bayesian inverse problem solving [R3-R5], which uses history matching with an implausibility measure to judge whether the simulated X-ray image matches the actual X-ray image. iii) Reified Bayesian analysis [R2]. A simulator is only a model of the real world and there is always a difference between the simulator and the system it purports to represent. Reified Bayesian analysis is a method of coherently i) analysing this difference between a simulator and the real world, ii) reducing that difference and iii) including knowledge of how large any remaining difference is in the analysis. This was a major advantage for SRM, as other companies using simulators to remove scatter do not consider the difference between the simulator and the real world at all. In less statistical terms, it i) enables the simulator to get much closer to the real world, ii) provides a measure of confidence in the accuracy of the approach and iii) indicates when the match is good enough that there is no point searching for a better one.

There were two key outputs from SRM: i) improved image quality over that of an ASG and ii) a demonstration that it is possible to make inferences about material composition using scattered photons. This means that better image quality can be returned and, in some cases, at a lower dose to the patient. X-ray images are used to calculate diagnostic measures, for example areal Bone Mineral Density (aBMD), which is used to assess whether a patient has osteoporosis. In current practice, patients at risk of osteoporosis are sent for a specialised scan. We showed that, by considering scatter rather than removing it with an ASG, it was possible to estimate aBMD using standard X-ray examination equipment. Expert opinion was that the problem was ill-posed and therefore could not be solved; once scatter was taken into account, the problem was no longer ill-posed in the Bayesian sense and aBMD could be estimated.
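The history-matching step in ii) above can be summarised by an implausibility measure of the kind used in this line of research (e.g. [R4]); in outline, and in our notation rather than IBEX's implementation, a candidate simulator input x is retained only if the following quantity is small:

```latex
% Implausibility of a candidate input x given an observation z (outline; notation ours).
I(x) = \frac{\bigl|\,z - \mathrm{E}[f(x)]\,\bigr|}
            {\sqrt{\mathrm{Var}[f(x)] + \mathrm{Var}[\delta] + \mathrm{Var}[e]}}
% E[f(x)], Var[f(x)]: emulator mean and uncertainty for the simulator output;
% Var[delta]: simulator-to-reality discrepancy (the term the reified analysis quantifies);
% Var[e]: measurement error.  Inputs with large I(x) are ruled out; the remainder are
% re-sampled and re-emulated until the match is judged good enough.
```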
3. References to the research
R1. Goldstein, M., & Wooff, D. (2007). Bayes linear statistics: Theory and methods (Vol. 716). John Wiley & Sons. [ https://doi.org/10.1002/pst.328]
Comment: The original research contained in R1 was supported by a number of EPSRC grants, all of which were highly graded in final review.
R2. Goldstein, M., & Rougier, J. (2009). Reified Bayesian modelling and inference for physical systems. Journal of Statistical Planning and Inference, 139(3), 1221-1239. [DOI: 10.1016/j.jspi.2008.07.019]
Comment: This was chosen as the first ever discussion paper in this journal.
R3. Goldstein, M., & Rougier, J. (2006). Bayes linear calibrated prediction for complex systems. Journal of the American Statistical Association, 101(475), 1132-1143. [ https://doi.org/10.1198/016214506000000203]
R4. Vernon, I., Goldstein, M., & Bower, R. G. (2010). Galaxy formation: a Bayesian uncertainty analysis. Bayesian analysis, 5(4), 619-669. [DOI:10.1214/10-BA524]
Comment: This paper extended history matching techniques for general heavy simulators in multiple directions and demonstrated these improvements on a large and complex model of Galaxy formation. Awarded the major worldwide prize in Bayesian statistics: the Mitchell Prize jointly awarded by the American Statistical Association and the International Society for Bayesian Analysis.
R5. Caiado, C. C. S., & Goldstein, M. (2015). Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points. Communications in Nonlinear Science and Numerical Simulation, 26(1-3), 123-136. [ http://dx.doi.org/10.1016/j.cnsns.2015.02.006]
R6. Rougier, J., Goldstein, M. & House, L. (2013). Second-Order Exchangeability Analysis for Multimodel Ensembles. Journal of the American Statistical Association, 108(503), 852-863. [https://doi.org/10.1080/01621459.2013.802963]
4. Details of the impact
IBEX Innovations Limited was created in 2010 to develop and commercialise an innovative X-ray detector technology. It has been collaborating with Durham since 2015 on Bayesian approaches to improving X-ray imaging [E3]. The collaboration has greatly enhanced the commercial capability of the patented technology and ‘allowed a small North East company to compete on a world stage’ [E3, E4, E5]. Specifically, Durham research underpins their patented multi-absorption-plate (MAP) technology and their Trueview® product, which uses the Scatter Reassignment Method (SRM) [E9]. IBEX has raised almost GBP6,000,000 in funds for product development and to secure international customers [E2], securing a total of 14 jobs in the North East of England (gross value added for full-time-equivalent jobs to the North East is GBP662,144) [E3, E4].
The MAP technology developed by IBEX in collaboration with Durham University enabled inferences, not possible with standard X-ray methods, about the material composition of an object being imaged with the MAP equipped detector [ E3]. A core technology at IBEX, it is at the base of a number of their products as outlined below.
- The first product developed was for security, enabling portable X-ray examination equipment capable of extracting accurate information about the materials being imaged. Portable X-ray examination systems are key for safety when objects being examined cannot be moved; for example, one may want to X-ray a suspicious bag in an airport without moving it. IBEX signed a Joint Development Agreement (JDA) with 3DX-ray (the main trading subsidiary of Image Scan Holdings plc) to incorporate IBEX technology into 3DX-ray’s next product on an Innovate UK assisted project [E1b, E3]. JDAs give a route to market for IBEX’s technology and increase their revenues.
- IBEX are also combining the MAP technology with their food inspection products. Recently, chocolate bar manufacturers have had to recall bars due to possible plastic contamination; an accurate plastic contamination detection system could therefore save chocolate manufacturers the cost of recalling products and the associated negative media publicity. IBEX signed a JDA with Mettler-Toledo, a manufacturer of precision instruments, to develop a methodology for i) plastic contamination detection and ii) detection of bone in chicken breast, on an Innovate UK assisted project (IBEX GBP109,163, Mettler-Toledo GBP24,775) [E6]. IBEX expect this product to generate revenues of GBP1,000,000 per annum and the JDA has already secured 3 jobs within the company [E4].
In addition to the research related to the MAP, our research supported the development of the SRM, which is core IP at IBEX and forms the basis of their patented software product ‘Trueview®’ [E3]. SRM offers the benefits of i) “really good image detail”, ii) “no additional equipment to reduce scatter, reducing the number of radiographs taken” (Nicola Hind, Consultant Radiographer, Royal Victoria Infirmary, Newcastle) [E1] and iii) improved diagnostic measures in medical X-ray images. IBEX obtained a software patent to protect the novel approaches developed in collaboration with Durham University (WO2016051212A1, GB2563115A) [E5]. The proof-of-concept helped IBEX receive numerous grants and venture capital funding to market their technology [E1-E4, E6, E7]. Current impact around Trueview® includes:
The Trueview® software has undergone a clinical trial at The James Cook University Hospital in Middlesbrough [E8]. It measures a patient’s bone health from a standard X-ray, and initial trial results indicate that it provides an accurate early warning of osteoporosis, and therefore of a patient’s risk of potentially fatal fragility fractures. The study, led by the orthopaedic research team collaborating with colleagues in radiology and rheumatology, “compared images from the new software to results from 130 patients who had attended appointments for a DEXA bone scan at James Cook with positive results” [E8]. IBEX Trueview® offers early detection, early intervention and better outcomes. Poor bone health represents a substantial global healthcare challenge: in the UK, the NHS spends around GBP1,000,000,000 per annum on the diagnosis and treatment of hip fractures, yet hip fracture remains the largest single cause of accident-related death. IBEX Trueview® provides opportunistic early screening of bone health for all patients referred for a Digital Radiography scan, enabling timely detection of early-stage osteoporosis before debilitating hip fractures occur.
IBEX have signed an agreement with CurveBeam to integrate the Trueview® product into their products [E2, E3]. The leading cone-beam computed tomography (CBCT) systems manufacturer CurveBeam LLC have chosen Trueview® to deliver unrivalled image quality for their weight-bearing CT products, including the new flagship HiRise system. CurveBeam President Arun Singh says that it produces “the best bilateral foot renders I have ever seen” [E10a]. CurveBeam specialises in weight-bearing CT imaging [E10b]. The CurveBeam HiRise system has attained FDA approval with Trueview® incorporated and will be on sale shortly. Two HiRise systems were installed for investigational studies prior to FDA clearance, at the University of Iowa Carver School of Medicine’s Department of Orthopedics and Rehabilitation in Iowa City, IA and at Tennessee Orthopaedic Clinics (TOC) in Knoxville, TN. Researchers at the University of Iowa have used HiRise for multiple research studies, including examining hip dysplasia in functional position and evaluating wrist injuries in gymnasts. “The HiRise promises to revolutionize our biomechanical understanding of the entire lower extremity the same way previous generations of CurveBeam’s weight bearing CT systems enabled better investigation into the foot, ankle and knee,” said Dr. Cesar de Cesar Netto, MD, PhD, Assistant Professor of Orthopedics and Rehabilitation at University of Iowa [E10c].
IBEX have signed a licensing agreement to integrate Trueview® into Planmed’s Clarity 2D, Clarity 3D and Clarity S mammography systems [E2, E7]. The company is part of the Finnish Planmeca Group, a well-known company in the medical and dental field which provides tools for healthcare professionals in over 80 countries worldwide. The target release for this product has been delayed by Covid-19. Trueview® enables better image quality, improved workflow and better patient outcomes. It has been demonstrated to give better contrast in 2D and 3D mammograms, with up to 50% lower patient dose and up to 25% less breast compression [E7]. Of the collaboration, Mr Jan Moed, Managing Director of Planmed, says: “We are constantly listening to our customers and developing our product lines accordingly. We feel that this collaboration will bring added value, not only for the users but ultimately for the patients, which is what really matters” [E7].
IBEX obtained a grant (number 44808, GBP288,136), followed by a continuity grant (number 74463, GBP100,000), to investigate how Trueview® can improve AI algorithms; this resulted in one new hire [E6].
In summary, Durham research underpins the technology that is enabling IBEX to grow as a company, receive funding and take products to market [E3, E4]. Ian Wilson, who leads Mercia’s team in the North East of England, which provided funding to IBEX via the North East Venture Fund, says: “IBEX is the only one in the market that can achieve this quality of images. The company continues to make progress in both developing new products and new commercial relationships. Mercia is delighted to support IBEX once again and help roll out this new system which will be an important step forward in breast cancer detection.” [E2].
5. Sources to corroborate the impact
[E1] A. Collection of IBEX announcements. B. Image Scan Holdings - https://www.research-tree.com/newsfeed/ArticleOffset?articleSeoTitle=preliminary-results-608859&page=3
[E2] Medical technology firm IBEX brings in more than GBP500,000 of new investment - https://www.business-live.co.uk/technology/medical-technology-firm-ibex-brings-18256755
[E3] Letter from Chief Technical Officer IBEX Innovations
[E4] Letter from Chief Scientific Officer IBEX Innovations
[E5] IBEX's MAP technology is covered by UK patents: GB2498615, GB2532634, GB2532897, GB2533233, GB2563115A, WO2016051212A1.
[E6] List of IBEX grants
[E7] Planmed and IBEX Innovations sign a collaboration agreement - https://www.planmed.com/press/news-main-page/planmed-and-ibex-innovations-sign-a-collaboration-agreement/
[E8] A. Talking point, Autumn 2020 discussing Trueview clinical trial success - https://www.southtees.nhs.uk/content/uploads/Talking-Point-Autumn-2020-web-3.pdf B. Details on the clinical trial results - https://www.ibexmedical.co.uk/wp-content/uploads/2020/07/IBEX-Trueview-Bone-Health-July-2020.pdf
[E9] IBEX white paper, pages 22 and 30 show acknowledgements to Durham University collaboration
[E10] A. The New Standard For CBCT - https://www.ibexmedical.co.uk/1817-2/ and B. Curvebeam website - https://curvebeam.com/about/curvebeam/ and C. FDA 510(K) clearance ( https://curvebeam.com/news/curvebeam-announces-fda-510k-clearance-for-hirise-weight-bearing-ct-system-for-the-entire-lower-extremity/)
- Submitting institution
- University of Durham
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Economic
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Durham University worked in collaboration with Atom Bank to develop an end-to-end banking model based on mathematical and statistical methods developed at Durham. The model provides the business with a strategic understanding of the relationships between the bank’s key components, including customers and products, and supports major business decisions such as financial planning, resourcing, product pricing and funding. The impacts of this on Atom Bank include improved management of short-term liquidity requirements, improved capital deployment and operational risk reduction. Direct savings to the bank are estimated at [REDACTED] or more annually since 2018, and these savings are passed on to the bank’s customers.
2. Underpinning research
Mathematical models, implemented as high-dimensional computer simulators, are often used to study complex systems. In general terms, a bank is a complex system with multiple interacting components driven by statistical and financial models that are used to predict future market behaviour and optimise scenarios. Banking models need to be constructed so that they can interact with each other seamlessly and so that they can easily be used to predict behaviours (financial market, customer interactions), optimise strategies (investments, funding capture, pricing) and manage risk (operational, risk appetite). Based on the Bayes linear methodology and uncertainty quantification framework [R1-R3], Atom and Durham University developed a number of critical components necessary for financial operations and decision support solutions, used by stakeholders across the business to understand the impact of their choices on the overall business, leading to a new, interactive and innovative approach to banking.
Durham research on complex simulations, statistical modelling and optimisation [R2-R4] was used to mathematically model and optimise the relationship between investments, customer behaviour, market behaviour, and their constraints and impacts. It resulted in a “simulator” of the bank, or the “Atom Digital Twin”. Advanced uncertainty analysis was used to construct a statistical model, an optimisation tool and user interfaces for decision support, in line with the work in R1 and R3. In particular, a Bayes linear approach [R1] was used to construct emulators of some of the sub-models of the financial and operational master models of the bank, and history matching was used to identify the areas of the models’ input spaces (formed by a number of variables the bank can normally control, such as prices and staffing) that are consistent with the bank’s targets and constraints.
One of the tools delivered is a funding optimiser that identifies the potential price combinations that can be set in order to minimise future cost; for example, where there are 4 fixed-term mortgage products with different maturity lengths whose prices are updated weekly within certain constrained price ranges. As in R1-R3, we started by building a Bayes linear emulator of one of the mortgage pricing models. The objective is to find combinations of prices that minimise costs (and maximise long-term profit where possible) over a period of 5 years, with prices varying weekly for each product. In this context, it is necessary to identify suitable time series solutions (length 260 for each product) that are feasible individually but also as a group, to ensure capital requirements can be met each week. R5 discussed the process of sensitivity analysis for stochastic dynamical systems and how to investigate the impact of different inputs over time. Since optimising over 1,040 inputs simultaneously in a reasonable amount of time was not feasible, the same approach as in R5 was used to identify the periods in time where price variations were most likely to cause large changes to the output or drive the models into infeasibility. This approach was very effective in reducing the input space and narrowing down price ranges for each product, and also allowed decision makers to see alternative pricing profiles that could lead to a similar (if not the same) outcome to the one they were used to; the sketch below illustrates the idea.
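A minimal sketch of this two-stage "screen then optimise" idea is given below. Everything beyond the 4 products, the 260 weekly prices per product and the overall structure described above is an illustrative assumption: the toy cost function, demand profile, price range and cut-off of 40 active inputs all stand in for Atom's confidential emulated pricing models.

```python
# Illustrative sketch only: a toy stand-in for the emulator-based funding optimiser.
import numpy as np
from scipy.optimize import minimize

N_PRODUCTS, N_WEEKS = 4, 260                 # 4 fixed-term products, weekly prices over 5 years
PRICE_LO, PRICE_HI = 0.01, 0.05              # hypothetical allowed price range

rng = np.random.default_rng(0)
base_demand = rng.uniform(0.5, 1.5, size=(N_PRODUCTS, N_WEEKS))   # stand-in demand profile

def cost(prices):
    """Toy stand-in for the emulated cost of a 5-year pricing plan (1,040 inputs)."""
    p = prices.reshape(N_PRODUCTS, N_WEEKS)
    volume = base_demand * (PRICE_HI - p) / (PRICE_HI - PRICE_LO)  # higher price -> less demand
    return (p * volume).sum() - 0.5 * PRICE_LO * volume.sum()      # funding cost minus a margin term

# One-at-a-time sensitivity screen over all 1,040 weekly prices: keep only the inputs
# whose perturbation moves the cost most (the project used the approach of R5 for this step).
x0 = np.full(N_PRODUCTS * N_WEEKS, 0.03)
c0 = cost(x0)
def sensitivity(i, eps=1e-3):
    x = x0.copy(); x[i] += eps
    return abs(cost(x) - c0)
active = np.argsort([sensitivity(i) for i in range(x0.size)])[-40:]   # ~40 most influential inputs

# Optimise only the influential inputs, holding the rest at their baseline values.
def reduced_cost(active_prices):
    x = x0.copy()
    x[active] = active_prices
    return cost(x)

result = minimize(reduced_cost, x0[active], bounds=[(PRICE_LO, PRICE_HI)] * active.size)
print("optimised cost over the reduced input space:", result.fun)
```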
At a given point in time, the bank has the responsibility to retain the correct amount to be paid to customers as their mortgages come to an end, but also has to hold certain amounts of cash to ensure business continuity. However, holding on to money without investing it is costly, so the bank needs to balance investments and regulatory demands to minimise cost. First, it needs to understand how much business to expect if it sets a specific price for a given product, subject to potential competitor behaviour and other external factors, in order to derive a price elasticity model of demand for each product. Given a certain demand level, estimated using the agent-based modelling principles in R4, a retention profile has to be derived and the minimum cash requirements updated. In R4, the research looked at how customers behave with respect to peer pressure, innovation, product loyalty and memory. For a new bank, loyalty does not yet exist and memory needs to be built as part of the retention process. The Durham research looked at the trade-offs between parameters such as innovation and memory for different customer groups at Atom to estimate retention rates. These values have subsequently been updated using real data as products matured.
Even though the individual models mentioned have been optimised to run in real time (<10ms), it is still necessary to identify the regions of the input space for which cost is minimised; here the methodology outlined in R3 and R5 was used, combining Bayesian optimisation techniques for space reduction with Bayes linear methods. At the moment this takes around 90 seconds per planning year, drastically reducing the time taken by a process which previously took hours and required constant manual intervention. This is now wrapped in a user interface that can be used to stress-test price interactions, inform treasury of the funding capture needed, investigate marketing strategies and optimise for the yearly 7-year financial planning round.
3. References to the research
[R1] Goldstein, M., & Wooff, D. (2007). Bayes linear statistics: Theory and methods (Vol. 716). John Wiley & Sons. [https://doi.org/10.1002/pst.328]
[R2] Goldstein, M., & Huntley, N. (2016). Bayes Linear Emulation, History Matching, and Forecasting for Complex Computer Simulators. Handbook of Uncertainty Quantification, 1-24. [DOI 10.1007/978-3-319-11259-6_14-1]
[R3] Caiado, C. C. S., & Goldstein, M. (2015). Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points. Communications in Nonlinear Science and Numerical Simulation, 26(1-3), 123-136. [http://dx.doi.org/10.1016/j.cnsns.2015.02.006]
[R4] Bentley, R.A., Caiado, C.C.S. & Ormerod, P. (2014). Effects of memory on spatial heterogeneity in neutrally transmitted culture. Evolution and Human Behavior 35(4): 257-263. [ https://doi.org/10.1016/j.evolhumbehav.2014.02.001]
[R5] Bissell, J. J., Caiado, C. C. S., Goldstein, M., & Straughan, B. (2014). Compartmental modelling of social dynamics with generalised peer incidence. Mathematical Models and Methods in Applied Sciences, 24(04), 719-750. [http://dx.doi.org/10.1142/S0218202513500656]
The original research contained in: R1 was supported by a number of EPSRC grants, all of which were highly graded in final review; R2 was supported by the NERC funded Consortium ‘CREDIBLE’; R3-R5 was supported by the Leverhulme funded Tipping Points project.
4. Details of the impact
Atom is a digital challenger bank based in the City of Durham, which started trading in 2016, offering an innovative and efficient mobile-only, app-based personal and business banking experience. One of the reasons Atom chose Durham as its base was to take advantage of the research output of the University. ‘It quickly became apparent that the department’s specialisation in applying Bayesian techniques to financial and other systems [e.g. R3, R4] offered the opportunity to create a digital twin of some of the bank’s key allocation decisions.’ [E1]. The collaboration between Atom Bank and Durham University resulted in the creation of the Atom Bank Digital Twin using methods [R1-R5] developed within the Mathematical Sciences department. This has allowed Atom Bank to re-invent financial planning in financial services, enabling it to 1. optimise business activity so that it generates the maximum return from each set of investment decisions and 2. understand in full the consequential impacts of any business decision on each business area. This enables the bank to link its resource planning accurately to product and pricing decisions [E2]. Atom says that this was ‘critical to building a competitive advantage that can be shared between investors and customers’ [E1]. The total estimated savings, detailed below, are currently [REDACTED].
The Atom Bank Digital Twin is a novel decision support suite, which is being used to support major business decisions such as financial planning, resourcing, product pricing and funding. Atom Bank now has the unique opportunity to mathematically model the organisation and manage the business according to the inputs and outputs of that model. The direct impact of the collaboration with Durham University (which originally centred around a Knowledge Transfer Partnership) on Atom's financial performance can be summarised in three categories as taken from E1:
- Improved management of short-term liquidity requirements [E1]
Banks are required by regulation to hold sufficient liquid assets to cover their forthcoming cash flows. Many aspects of inflows and outflows, such as the quantity of loans being provided to customers, and the savings deposits received by customers, are uncertain. The models created with Durham research [R1-5] have greatly increased the accuracy of Atom's ability to forecast these cash flows and so reduce the overall liquid assets the bank is obliged to hold (which are an overall cost to the business). The main items are:
- Improved modelling of loan completions allowed the liquidity held against future loan completions to be reduced by approximately [REDACTED] at all times, leading to around [REDACTED] reduction on average in liquid assets held, with an annualised cost reduction of around [REDACTED]. This benefit was first fully realised in 2018 and a similar annual benefit remains today.
- Improved ability to predict the behaviour of savings customers has allowed Atom to reduce the volume of liquid assets needed to ensure that deposit holders can access their funds upon maturity. It is estimated that around [REDACTED] lower liquid assets were needed in 2018 and 2019, leading to a reduction in cost of liquidity of around [REDACTED] in each year. In 2020, a different business strategy meant that the modelling reduced the risk of accidental breaches of regulatory limits, which cannot easily be measured in terms of financial value, but is of critical reputational importance.
- Advanced uncertainty analysis (R) on the above models allowed the business to reduce buffers (designed to minimise the risk of regulatory breaches) on liquidity, leading to another [REDACTED] of reduced liquidity requirements in 2019. Again, in 2020, this methodology has led to risk reduction rather than financial improvements.
- Improved capital deployment [E1]
Banks, particularly fast-growing banks such as Atom, are often constrained by available capital. They are obliged to hold certain levels of capital for each loan sold. Therefore, maximising the efficiency of this limited resource is key to profitability. Working with Durham University has principally provided benefits in the following areas:
- Quality assurance work carried out on Atom's internal lending models is estimated to have led to at least a [REDACTED] improvement on margin. This equates to around [REDACTED] per year on loans originated in 2019.
- Improved long-term financial planning from the summer of 2017 onwards will have led to a more efficient business plan. This benefit is significantly harder to estimate than any of the others, but financial planning experts quantified this as at least [REDACTED] improvement on the net interest margin of the entire book, which equates to around [REDACTED] per year based on Atom's current book.
- Operational risk reduction [E1]
Another use of a bank's capital is to be held against risks to implementing the business plan successfully. Collaborating with Durham University meant a number of operational risks were reduced. The highlights were:
- Quality assurance on regulatory mortgage calculations.
- Reduction in model risk, i.e. the risk that Atom's internal models give erroneous or misleading forecasts.
[REDACTED]
Embedding the use of the Atom Bank Digital Twin has had a substantial impact on processes and policy within Atom Bank. Atom’s data and analytics team has expanded from [REDACTED] employed in the North East of England to deliver this project and its extensions [E2] (gross value added based on North East 2017 figure is **[REDACTED]**) and a total of 30 staff have been trained in using the model output as well as in the R language to allow model development and maintenance to continue [E2]. By embedding expertise in the open-source R programming language, the business has been able to avoid costly licensing for commercial data analysis software. Based on quotes provided to Atom, the impact of this is a saving of around [REDACTED] per year [E2]. Atom says that the collaboration ‘ has also shifted expectations and perceptions amongst the senior team at Atom, many of whom had not had or seen the benefit of deep engagement with university researchers in their previous roles’ [E1]. Atom Bank have continued to win industry recognition and have recently been included in the Tech Nation Future Fifty business community [E5]. They continue to increase the number of products offered and [REDACTED] [E6].
Impact on customers of Atom Bank
Atom bank is a retail business and so every efficiency improvement also has a benefit to customers. ‘Reducing the friction between teams, increasing the allocative efficiency of our capital, and responding to the constraints on optimisation that are a necessary part of being a regulated bank all mean that customers get better prices for their savings and lower costs for their lending than we could otherwise afford to offer. Atom continues to be recognised for its excellence in the savings and lending markets’ [E1] (as recognised by various media organisations [E3]) ‘and behind all of these sits an engine that is refined and tuned by the work done with Dr Caiado and colleagues.’ [E1]. Atom is Trustpilot’s most trusted UK bank and consistently achieves Net Promoter Scores in excess of +75 [E4, 2020 annual report, page 15].
In summary, Durham University has helped to develop an end-to-end banking model that provides a live and interactive overview of the business, with a strategic understanding of the relationships between the bank’s key components, including customers and products. This is unique within the banking sector, and the change in practice has given Atom Bank the opportunity to update both the simulation and the optimisation in real time as the bank evolves and grows, increasing sustainability and providing rigorous calculations supporting the mortgage lending process that minimise the risk of operational damage or regulatory breaches from calculation error. All these benefits are passed on to the customers as savings.
5. Sources to corroborate the impact
[E1] Letter from Chief Technology Officer and Founder (November 2020)
[E2] KTP Final report (July 2019)
[E3] Atom in the media, for example The Times and The Northern Echo.
[E4] Atom Bank Annual reports (2015/16, 2016/17, 2017/18, 2018/19, 2019/20)
[E5] TechNation Future 50 - https://technation.io/news/future-fifty-cohort-2020/
[E6] Total assets growth from 2015-2019 - https://thebanks.eu/banks/18618
- Submitting institution
- University of Durham
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Technological
- Is this case study continued from a case study submitted in 2014?
- Yes
1. Summary of the impact
ENABLE is an emulation, history matching and uncertainty assessment software system for the oil industry, sold by Roxar/Emerson, whose inference engine was produced and subsequently improved by the Durham Statistics group, based on their research on uncertainty quantification for complex physical systems modelled by computer simulators. ENABLE optimizes oil asset management plans by careful uncertainty quantification and reduces development costs by accelerating the history matching process for oil reservoirs, resulting in vastly improved technical and economic decision-making. This led to substantial and sustained impact (a) for Roxar/Emerson as annual sales of ENABLE are approximately USD2,300,000, and (b) for their oil company clients, as ENABLE enjoys global usage by more than 16 major oil companies, including [REDACTED].
2. Underpinning research
The Durham Statistics Group has a long track record of work on the quantification of uncertainty for large and complex physical systems modelled by computer simulators. Example areas include climate, cosmology, epidemiology, systems biology and history matching for oil reservoirs, the subject of the current impact case study. This problem may be described as follows. Reservoir simulators are key tools to help oil companies manage production for oil reservoirs. The simulator takes as inputs a description of the reservoir (rock properties, fault distribution and so on) and returns as outputs the well performance (pressure profiles, production, water cut and so forth). As the appropriate input choices are not known a priori, the input space must be searched to identify choices of reservoir specification for which the output of the simulator at the wells corresponds, to an acceptable degree, to recorded historical behaviour. This process is termed history matching. It is difficult and challenging because the input and output spaces are high dimensional, the evaluation of the simulator for a single choice of inputs takes many hours, and there are multiple additional sources of uncertainty that must be included to make the analysis meaningful.
The Durham group devised a detailed Bayesian solution to this problem, based on building an emulator for the simulator. This is a probabilistic surrogate for the simulator, giving both a fast approximation to the simulator and a measure of uncertainty related to the quality of the approximation. This emulator, in combination with an uncertainty representation for the difference between the simulator and the reservoir, forms the basis of the history matching methodology that we developed. This proceeds by eliminating those parts of the input space for which emulated outputs were too far from the observed history, according to a collection of appropriate implausibility measures, then re-sampling and re-emulating the simulator within the reduced space, eliminating further parts of the input space and continuing in this fashion. This is a form of iterative global search aimed at finding all of the input regions containing good matches to the history.
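In outline, this iterative search proceeds in "waves". The sketch below illustrates that structure with a toy one-input simulator and a crude regression emulator; the simulator, emulator, design size, cut-off of 3 and number of waves are all illustrative assumptions, not the ENABLE implementation, in which the simulator takes hours per run and the input and output spaces are high-dimensional.

```python
# Illustrative sketch of wave-based history matching; everything here is a stand-in.
import numpy as np

rng = np.random.default_rng(1)

def simulator(x):                         # stand-in for a reservoir simulator run
    return np.sin(3 * x) + 0.3 * x ** 2

z = 0.8                                   # "observed history" to be matched
var_obs, var_disc = 0.01 ** 2, 0.05 ** 2  # observation error and model discrepancy variances

candidates = np.linspace(0.0, 3.0, 2001)  # current non-implausible region of input space
for wave in range(3):
    if candidates.size < 15:
        break
    design = rng.choice(candidates, size=15, replace=False)   # small design in the current region
    runs = simulator(design)                                   # the expensive evaluations
    # Crude quadratic regression emulator (a stand-in for a Bayes linear emulator):
    X = np.vander(design, 3)
    beta, *_ = np.linalg.lstsq(X, runs, rcond=None)
    emulator_var = np.mean((runs - X @ beta) ** 2)             # uncertainty of the fast approximation
    emulator_mean = np.vander(candidates, 3) @ beta
    # Implausibility: standardised distance between emulated output and observed history.
    implausibility = np.abs(z - emulator_mean) / np.sqrt(emulator_var + var_disc + var_obs)
    candidates = candidates[implausibility < 3.0]              # cut the space, then refocus next wave
    print(f"wave {wave}: {candidates.size} non-implausible inputs remain")
```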
Building on initial progress made in 1993-1997, which was the focus of a previous impact case study, the Durham statistics group have, since 2008, developed award-winning (Mitchell Prize for Best Applied Bayesian journal article worldwide, 2010) advances to multiple components of the history matching process for general simulators [R1]. These include improvements to the Bayes linear emulator construction process in terms of more advanced active variable selection, selection of polynomial forms specifically appropriate for the history matching requirements, choice of emulator correlation function and appropriate robust assessment of emulator correlation parameters, nugget estimation, Bayes linear emulator and implausibility diagnostics appropriate for history matching and design in multilevel contexts [R2-R3], further advances in Bayes linear model discrepancy assessment and representation, and the generalisation of Bayes linear history matching to stochastic models [R5]. All these advances, published in a series of papers [R1-R5], allow significantly more efficient and general history matching algorithms to be built and have been applied successfully in multiple diverse and challenging scientific contexts, e.g. cosmology, epidemiology and oil [R1-R5]. These advances were subsequently implemented in the Roxar/Emerson Tempest-ENABLE package by Ian Vernon (Durham) and the Roxar/Emerson team, over the period 2012 to date, as part of a long-running consultancy contract between Roxar/Emerson and Durham University Statistics group [E9]. This resulted in “an order of magnitude improvement in performance” [E1] in terms of the speed and accuracy of the ENABLE software, which allowed it to maintain its competitiveness and profitability in a rapidly evolving market, and its reputation as state-of-the-art history matching and uncertainty assessment software. Roxar/Emerson write (see **[E1]**):
“The runtime performance of Tempest ENABLE is critical to the commercial viability of the product. During the period 2012 – 2016 many improvements were made to the underlying algorithms in Tempest ENABLE which brought an order of magnitude improvement in performance…These improvements were based on Bayes linear emulation and history matching methodologies developed in the following papers by Durham [R1-R4]. These advances greatly improved the quality of fits of the emulator resulting in substantial improvements in history matching.”…“Our collaboration with Durham has allowed us to develop superior implementations of Bayes linear emulators when stochastic forward models are in use [R5]. This feature is currently unique within the industry.”
3. References to the research
[R1] Vernon, Ian., Goldstein, Michael. & Bower, Richard G. (2010). Galaxy Formation: a Bayesian Uncertainty Analysis. Bayesian Analysis 05(04): 619 - 670 (with discussion). DOI: 10.1214/10-BA524
Comment: This paper extended history matching techniques for general heavy simulators in multiple directions and demonstrated these improvements on a large and complex model of Galaxy formation. Awarded the major worldwide prize in Bayesian statistics: the Mitchell Prize jointly awarded by the American Statistical Association and the International Society for Bayesian Analysis. The transferable techniques developed in this work were directly implemented in ENABLE, and several other scientific disciplines.
[R2] Cumming, J. A. & Goldstein, M. (2009). Small Sample Bayesian Designs for Complex High-Dimensional Models Based on Information Gained Using Fast Approximations. Technometrics 51(4): 377-388. DOI: 10.1198/TECH.2009.08015
Comment: Developed and implemented sophisticated design strategies for multilevel emulators, and strategies for efficient output selection, in a history matching context, and applied them to oil reservoirs.
[R3] Cumming, J. A. & Goldstein, M. (2010). Bayes linear Uncertainty Analysis for Oil Reservoirs Based on Multiscale Computer Experiments. In The Oxford Handbook of Applied Bayesian Analysis. O'Hagan, A. & West, M. Oxford: Oxford University Press. 241-270.
Comment: Developed further and implemented sophisticated design strategies for multilevel emulators, and strategies for efficient output selection, in a history matching context, and applied them to oil reservoirs. Uploaded into Underpinning Research folder.
[R4] Vernon, I., Goldstein, M., Bower, R.G. (2014). Galaxy formation: Bayesian history matching for the observable universe. Statistical Science 29(1), 81–90. DOI: 10.1214/12-STS412
Comment: Invited paper for statistics journal on “ Bayesian Success Stories”, summarising recent advances to history matching for general simulators of complex physical systems.
[R5] Andrianakis, Ioannis, Vernon, Ian, McCreesh, N., McKinley, T.J., Oakley, J.E., Nsubuga, R., Goldstein, M. & White, R.G. (2015). Bayesian history matching of complex infectious disease models using emulation: A tutorial and a case study on HIV in Uganda. PLoS Comput Biol 11(1): e1003968. DOI: 10.1371/journal.pcbi.1003968
Comment: The extension of Bayes linear history matching methodology to stochastic models, and an example of the transferability of the techniques. Applied to analyse a complex stochastic model of epidemiology (HIV in Uganda). Techniques implemented in Enable [E1].
4. Details of the impact
Roxar AS, which owns ENABLE, is an international provider of products and associated services for reservoir management and production optimisation in the upstream oil and gas industry. Working with the Durham Statistics Group has helped Roxar to develop a more successful product, leading to substantial commercial success (see table below). It has also helped Roxar secure long-term technical jobs in its development team and delivered significant impact for its customers. The company is committed to both its support for ENABLE as its flagship product and its continuing work with Durham Statistics Group.
This is how Roxar currently describe the role of ENABLE [E4, E5]:
“ENABLE provides mathematical support to reservoir engineers in their use of reservoir simulation software. This support allows engineers to complete tasks like history matching much more quickly than using the simulator on its own and also provides a more rigorous approach to predicting future reservoir performance or optimising field development.”
Roxar is headquartered in Stavanger, Norway and operates in 19 countries with around 900 employees. Roxar offers software for reservoir interpretation, modelling and simulation, as well as instrumentation for well planning, monitoring and metering. Roxar was acquired by Emerson Electric Company in April 2009 and is now part of the Emerson Process Management Group.
The ENABLE product has been very successful. The current list of ENABLE clients includes [E1]: [REDACTED].
This has led to substantial and sustained financial impact. Roxar/Emerson report [E1] the following total Tempest ENABLE annual sales figures in USD for 2014 to 2019. Results are reported to Emerson’s financial year which runs from 1st October to 30th September.
| Year | Tempest ENABLE sales (USD) |
|---|---|
| 2014 | [REDACTED] |
| 2015 | [REDACTED] |
| 2016 | [REDACTED] |
| 2017 | [REDACTED] |
| 2018 | [REDACTED] |
| 2019 | [REDACTED] |
The ENABLE product has also maintained several long term Roxar employment positions [E1]:
“During the period 2014 to date the core Tempest ENABLE development team in the UK is 6 highly technical people (5 PhDs). Additional sales, services and support jobs are supported by ENABLE’s success.”
Roxar remains committed to the continuation of their support for ENABLE, as evidenced by its integration within their flagship Tempest simulator package, now called Tempest ENABLE, the development of which was part-funded by Statoil [E3, E6, E8].
The implementation within ENABLE of more advanced algorithms, resulting from research by the Durham statistics group from 2012 onward, has led to substantial improvements in the code performance and has kept the ENABLE software highly efficient and competitive in an aggressive market since October 2014. The Roxar team have written that [E1]:
“The runtime performance of Tempest ENABLE is critical to the commercial viability of the product. During the period 2012 – 2016 many improvements were made to the underlying algorithms in Tempest ENABLE which brought an order of magnitude improvement in performance. These improvements included more advanced Bayes linear emulator construction strategies, improved experimental and sequential designs, robust correlation length estimation and robust active inputs selection. New Bayes linear emulator diagnostics were introduced which demonstrated, in a visual manner, the improvements in the algorithms. These improvements were based on Bayes linear emulation and history matching methodologies developed in the following papers by Durham [R1-R4] . These advances greatly improved the quality of fits of the emulator resulting in substantial improvements in history matching.”
Roxar are committed to continuing their relationship with Durham University Statistics Group, funding three consecutive consultancy contracts [E9]: June 2012 – May 2015 (GBP135,000), June 2015 – May 2018 (GBP150,000), June 2018 – May 2021 (GBP157,200), to fund Dr Vernon to help implement improvements to ENABLE. They have also funded an iCase studentship (Oct 2018 – Sept 2022, GBP55,000 in addition to full EPSRC contribution) to develop further impact related statistical methodology [E1].
ENABLE facilitates a detailed analysis of oil reservoirs including the quantification of uncertainties vital for forecasting, decision making and hence managing oil assets [ E4-E8], and therefore has substantial impact on the oil companies who use it. Examples are [E1]:
“Tempest ENABLE is also a key piece of Big Loop and was instrumental in avoiding a potential loss of $5M in renewal fees at a large international oil company. Big Loop is gaining traction in the market and is being successfully adopted at major national and international oil companies.” (Big Loop combines geological and reservoir models in the inferential framework, but again uses Tempest-Enable at its core).
Another example is given in [E2], a 2017 press release of a successful long-term joint project by Statoil and Emerson, again using Tempest-Enable at its core. ‘‘The program focused on how E&P companies can improve history matching, uncertainty management and quantification across the entire reservoir characterization workflow through Roxar Tempest ENABLE, Emerson’s industry-leading history matching and uncertainty estimation software solution.’’
5. Sources to corroborate the impact
[E1] Letter from Robert Frost at Roxar/Emerson detailing, a) description of Enable’s place in market, b) list of press releases/brochures, c) evidence of use of Enable by many clients, d) link between DU research and product improvement, e) sales figures for Tempest-Enable 2012-2019.
[E2] 2017 press release re joint Statoil/Emerson project using Tempest-Enable as core: https://www.worldoil.com/news/2017/5/25/in-cooperation-with-statoil-emerson-announces-completion-of-total-uncertainty-management-program. (Site copy taken 14/9/20)
[E3] Emerson main page on the Tempest ENABLE software. Links at bottom of page to several data sheets and articles that describe ENABLE as a core part of the software. Link: https://www.emerson.com/en-us/catalog/roxar-tempest-enable. (Site copy taken 28/2/20)
[E4] Specific data sheet on Tempest ENABLE obtained from link in [E3] (on 5/04/19) that describes its strengths, and which specifically mentions Dr Ian Vernon, Durham University.
[E5] Folder containing several data sheets (taken on 5/04/19 from **[E3]**) detailing Tempest ENABLE (to evidence core part ENABLE plays in the software).
[E6] Emerson webpage detailing latest release of Tempest 8.2 (on 8/5/18) that contains ENABLE at its core (see the 2nd paragraph: “…Enhancements to the Big Loop workflow within Tempest ENABLE, Emerson’s uncertainty management and history matching module, center on…” Link: https://www.emerson.com/en-us/news/automation/1805-roxar-tempest-8-2. (Site copy taken 28/2/20)
[E7] Emerson Webpage detailing “Reservoir Engineering and Simulation” containing list of products including Tempest ENABLE. Link: https://www.emerson.com/en-us/automation/operations-business-management/reservoir-management-software/reservoir-engineering-and-simulation. (Site copy taken 28/2/20)
[E8] Press release regarding Tempest ENABLE and a case study article. “Press release re Tempest Enable and case study article” (taken on 5/04/19 from **[E3]**).
[E9] Consultancy contracts between Durham University and Roxar/Emerson from 2012-2021.
- Submitting institution
- University of Durham
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Environmental
- Is this case study continued from a case study submitted in 2014?
- Yes
1. Summary of the impact
The Climate Change Act, 2008, constructed a legally-binding long-term framework for building the UK’s ability to adapt to a changing climate. The Act requires a UK-wide climate change risk assessment (CCRA), setting out the Government’s policies for responding to the risks identified in the CCRA by means of a National Adaptation Programme. This programme draws heavily on the uncertainty analysis for future climate outcomes carried out by the Met Office in their UK Climate Projections 2009 and 2018, which both exploited fundamental research into the Bayesian analysis of uncertainty for physical systems modelled by computer simulators, as carried out at Durham University. These climate projections further impact on all those industries and public sector organisations which must address the uncertainties in future climate in order to make decisions on policy and investment.
2. Underpinning research
The Durham statistics group has developed a very general probabilistic framework for linking mathematical models to the physical systems that they purport to represent [ R1, R2]. This framework takes account of all sources of uncertainty, including model and simulator imperfections, and is a necessary precondition for making probabilistic statements about the system on the basis of historical observations and evaluations of the computer simulators. The formulation distinguishes simulators according to their quality and the nature of their inputs. Further modelling constructs are introduced to account for imperfections in the available simulators and to unify the composite inference, from the collection of available simulators, observed historical data and the judgements of experts, for the behaviour of the actual physical system. Fast surrogate forms, which approximate the computer model with an associated uncertainty assessment of the quality of the approximation, are integrated into this approach, both to address issues of evaluation speed for slow and expensive models and also to bridge important gaps arising from model imperfections. This research is part of an ongoing exploration at Durham University of the question as to what is the actual information about a physical system that is conveyed by one or more models for that system, and how can that information be uncovered and exploited for better understanding of the behaviour of the system. The work is quite general, but the group did have in mind the application to collections of evaluations of climate models in this development [ R2] and one aspect of the research is work which develops the underlying structure in such a way that it may be directly applied to such problems in climate science [ R3-R5]. This application of the research forms the core of the current impact case.
[R1] is a key paper in the technical development of our approach, which constructs a general core methodology for forecasting real world system behaviour based on computer simulators, while addressing all of the uncertainties associated with such problems. [R2] summarises, formalises and generalises all of the preceding work in Durham University, such as [R1], on uncertainty in physical systems represented by computer models, adding sufficient structure that the approach can be directly applied in large scale problems such as climate science, as will be discussed in the impact section. [R2] provided core ideas for much further development at Durham University. In particular, Dr Rougier, while at Durham University, wrote a follow-up paper [R3] which applied its formulation directly to the problem of climate science and Goldstein and Rougier wrote the first draft of a general conceptual paper in this area which extended the formulation and eventually appeared as [R4]. Finally, a summary of ongoing research in this area is provided in [R5].
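In outline (our notation), the linkage that this body of work formalises, and that [R4] then refines through reification, relates the simulator f, the physical system y and the observations z as follows:

```latex
% Basic "best input" linkage between simulator, system and data (see [R1], [R2]); notation ours.
y = f(x^{*}) + \delta    % system behaviour = simulator at its best input x*, plus model discrepancy
z = y + e                % observations    = system behaviour, plus measurement error
% An emulator supplies fast approximations E[f(x)] and uncertainties Var[f(x)], so that
% judgements about delta and e can be propagated into probabilistic statements about y.
```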
3. References to the research
R1. P. Craig, M. Goldstein, J. Rougier, and A. Seheult, (2001) Bayesian forecasting for complex systems using computer simulators, J. Amer. Statist. Assoc., 96 (2001), pp. 717–729. DOI: 10.2307/2670309
R2. Goldstein, M and Rougier, J (2004) Probabilistic formulations for transferring inferences from mathematical models to physical systems, Siam J. Sci Comput., 26, 467-487 http://dx.doi.org/10.1137/S106482750342670X
R3. Rougier, J. C. (2007). Probabilistic inference for future climate using an ensemble of climate model evaluations. Climatic Change, 81, 247–264 DOI: 10.1007/S10584-006-9156-9
R4. Goldstein M and Rougier JC (2009) Reified Bayesian modelling and inference for physical systems, Journal of Statistical Planning and Inference, 139, 1221-1239 https://doi.org/10.1016/j.jspi.2008.07.019
R5. Rougier JC and Goldstein M (2014) Climate Simulators and Climate Projections, Annual Review of Statistics and Its Application Vol. 1:103-123. https://doi.org/10.1146/annurev-statistics-022513-115652
R1-R4 are highly cited and influential original research articles. R1, R2 and R4 introduce the key concepts and develop the core methodology, including uncertainty analysis, for simulators of complex physical systems. R4 was chosen as the first ever discussion paper in that journal.
4. Details of the impact
The research at Durham University had a key role in the Met Office UK Climate Projections 2009 (UKCP09) and UK Climate Projections 2018 (UKCP18). These projections had a fundamental impact upon Government, public and private planning for adaptation to climate change. These impacts are as follows.
The UKCP18, an updated version of the UKCP09, is the Met Office’s climate analysis tool. It provides projections for UK and global climate change throughout the 21st century using the latest climate science and analysis. It is designed to assist decision-makers by enabling them to assess climate change risks and is used by both public and private users to plan for a wide spectrum of environmental impacts. UKCP09 provided authoritative climate forecasts which were widely used in climate adaptation planning, within a statutory legal framework that we will describe below, until they were replaced by the enhanced forecasts arising from UKCP18. Thus, they continued to have impact throughout the current REF impact period until the updated forecasts took over.
The science and methodology used to construct UKCP09 are described in the Met Office report [ E1] and in [ E2] for UKCP18. Each report emphasises the importance of the careful treatment of uncertainty in the climate projections. Here is an indicative quotation from [ E1]. "Uncertainty in climate change projections is a major problem for those planning to adapt to a changing climate. Adapting to a smaller change than that which actually occurs (or one of the wrong sign) could result in costly impacts and endanger lives, yet adapting to too large a change (or, again, one of the wrong sign), could waste money..... The 2008 projections are the first from UKCIP to be designed to treat uncertainties explicitly ... This means that probabilities are attached to different climate change outcomes, giving more information to planners and decision makers." [ E1, p19]
The methods developed at Durham University play an important role in the uncertainty analysis throughout each report and Durham University research is referred to explicitly in the UKCP09 and UKCP18 documents. For example, for UKCP18 "These results support the chosen approach, allowing Strand 1 to provide an updated product that combines evidence from HadCM3-based PPEs and CMIP5 models in a consistent manner, retaining the Bayesian statistical framework used for UKCP09 (Goldstein and Rougier, 2004) [ R2]." [ E2, p11] and "Prior pdfs ... are then produced using the Bayesian method of Sexton et al (2012) , based on the general framework of Goldstein and Rougier (2004) [ R2]." [ E2, p17]. Our contribution is amplified in Sexton et al (2012) [ E3], which explains in detail the uncertainty methodology used by UKCP09 and later by UKCP18. For example, in [ E3] section 3 “Here we describe the general steps in Goldstein and Rougier (2004) [ R2] necessary to determine a probability distribution of some aspects of climate change that we want to predict." while section 6.2 begins "...Goldstein and Rougier (2004) [ R2] gives us several key advantages. ...First, the multivariate nature of this probabilistic framework allows us to have more than one prediction variable. Predicting joint probabilities provides us with important information on how uncertainty is related across different climate variables....."
With the Climate Change Act (CCA) 2008, the UK constructed a legally-binding long-term framework to cut greenhouse gas emissions and a framework for building the UK’s ability to adapt to a changing climate [E4]. The Act requires a UK-wide climate change risk assessment (CCRA) to take place every five years. The CCRA constructs sector reports describing a wide range of potential risks in each of the following sectors: Agriculture; Biodiversity & Ecosystem Services; Built Environment; Business, Industry & Services; Energy; Floods & Coastal Erosion; Forestry; Health; Marine & Fisheries; Transport; Water. The CCRA facilitates the Climate Change Act-mandated National Adaptation Programme (NAP), produced in 2013 [E6] (covering the period 2013 to 2018) and updated in 2018 [E7]. The NAPs set out the Government’s objectives, proposals and policies for responding to the risks identified in the CCRA, as detailed in [E6, p8]: "In developing the NAP for England, we have taken the highest order risks from the CCRA and working in partnership with businesses, local government and other organisations, have developed objectives, policies and proposals to address them."
The original 2012 CCRA [E4], and thus the 2013 NAP, draws heavily on the uncertainty analysis in UKCP09 [E4, p9]. The impact of that report is still being felt throughout this REF impact period. The second report was published in 2017 [E5]. The CCRA used the UK Climate Projections (UKCP09) for three 30-year periods centred on the 2020s, 2050s and 2080s. The CCRA attempted to monetise the most important risks to the UK, and concluded that the net economic costs to the UK are of the order of tens of billions per year by the 2050s (in current prices). UKCP09 therefore played a key role within the Government's statutory responsibilities for assessing and responding to climate change throughout the current REF period, and the updated UKCP18 is continuing this role. The ministerial foreword to the updated version, NAP 17 [E7], links the various strands of activity as follows.
"Later this year we will be launching a revised set of UK climate projections (‘UKCP18’), replacing the current 2009 set and providing the most up to date and scientifically robust estimations of climate scenarios out to the end of this century. These projections are a key tool to enable everyone to future proof policies and activities to ensure our resilience to possible future climate change. ... so that we take back control of our fisheries and agriculture, restore nature and care for our land, rivers and seas. Using a natural capital approach ensures that we take account of all the many benefits our environment provides, and we will develop policies that ensure our seas and lands are healthy and productive and resilient to climate change."
When the UKCP18 was launched, Environment Secretary Michael Gove said: "This cutting-edge science opens our eyes to the extent of the challenge we face and shows us a future we want to avoid. By having this detailed picture of our changing climate, we can ensure we have the right infrastructure to cope with weather extremes, homes and businesses can adapt, and we can make decisions for the future accordingly.” [ E9]
PUBLIC AND PRIVATE SECTOR IMPACT OF UKCP09, UKCP18
In addition to the statutory role, the Met Office has worked with a wide range of public and private sector organisations to use UK climate impacts to inform decisions on investment amounting to billions of pounds to future-proof projects against climate change.
The UKCP09 website [ E8] contains case studies showing how the UKCP09 products have been used, with topics including: national assessment of river flows, climate change and pollution, strategic planning for flood management, changes in flood damages at a catchment scale, emergency planning, defining land capability for agriculture specifications, assessing potential vulnerabilities to climate change, future-proofing design decisions in the buildings sector, investigating coastal recession & shore profile development, storm surge and sea level rise and assessing impacts of climate change on tourism in South West England.
As early evidence of the ways in which the Met Office are rolling out the effects of UKCP18 to user groups, the document [ E10] details a set of six project leaflets. Each leaflet describes how UKCP09 has been used in the area and how UKCP18 will improve adaptation planning for climate change. The titles show some of the range of areas of application that UKCP18 affects. (i) Assessing climate change risk in Yorkshire; (ii) Future surface water flood hazard risk; (iii) Coastal cliff recession under climate change; (iv) Thermal performance of buildings; (v) Forests for the future; (vi) Water resources and drought planning.
5. Sources to corroborate the impact
E1. Murphy JM, et al., (2009) UK climate projections science report: climate projections Met Office Hadley Centre, Exeter. http://cedadocs.ceda.ac.uk/1320/1/climate_projections_full_report.pdf
E2. J.M. Murphy, et al., (2018) UKCP18 Land Projections: Science Report. Available from https://www.metoffice.gov.uk/pub/data/weather/uk/ukcp18/science-reports/UKCP18-Land-report.pdf
E3. Sexton DMH, et al., (2012) Multivariate probabilistic projections using imperfect climate models part 1: outline of methodology, Climate Dynamics, 38, 2513-2542 https://doi.org/10.1007/s00382-011-1208-9
E4. The UK Climate Change Risk Assessment 2012, Evidence Report. Defra 2012. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/69487/pb13698-climate-risk-assessment.pdf
E5. The UK Climate Change Risk Assessment 2017 (Presented to Parliament pursuant to Section 56 of the Climate Change Act 2008) https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/584281/uk-climate-change-risk-assess-2017.pdf
E6. The National Adaptation Programme, Making the country resilient to a changing climate, July 2013 ( www.gov.uk/defra). https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/727259/pb13942-nap-20130701.pdf
E7. Climate change: second national adaptation programme (2018 to 2023) https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/727252/national-adaptation-programme-2018.pdf
E8. Case studies UKCP09 https://webarchive.nationalarchives.gov.uk/20181204111030/http://ukclimateprojections-ukcp09.metoffice.gov.uk/23081
E9. UK Climate Projections Project Newsletter April 2019 https://www.metoffice.gov.uk/binaries/content/assets/metofficegovuk/pdf/research/ukcp/newsletters/ukcp18-newsletter.april-2019.pdf
E10. UKCP18 Demonstration Projects https://www.metoffice.gov.uk/research/collaboration/ukcp/ukcp18-demonstration-projects
- Submitting institution
- University of Durham
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Political
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Since 2005, Craig’s research around quantification of uncertainty has had a significant impact in the context of European food safety. Using a range of methods that include statistical modelling and reasoning, Craig has contributed to significant change in the way the European Food Safety Authority (EFSA) addresses uncertainty in its scientific assessments for the EU Commission and member states. Quantifying uncertainty in these assessments plays a critical role in risk management decisions that relate to food production, packaging and consumption; ultimately protecting consumers, animals and the environment from food-related risks. Craig’s research has also contributed directly to the treatment of uncertainty in EFSA topic-specific guidance and individual assessments.
2. Underpinning research
Craig is a statistician who works mostly from the Bayesian subjectivist viewpoint on statistics. His research focus is methodology for quantification of uncertainty in real-world problems, based on careful application of statistical principles and methodology combined with in-depth knowledge of applications. Key components are Bayesian random-effects modelling, imprecise probability and expert knowledge elicitation.
For the decade to 2002, Craig worked mostly on statistical methodology for computer models as part of a Durham group specialising in Bayesian emulation of computer models, use of expert knowledge elicitation to quantify expert judgement using probability, and statistical modelling of discrepancies between models and reality. Funding came from EPSRC and oil industry partners. The relevance to this case study is that work’s focus on quantification of uncertainty about the real world, based on deterministic and statistical models of data and on elicitation of expert judgement using probability, laying the groundwork for the research and impact reported here.
Since 2005, Craig’s research has largely been concerned in one way or another with quantification of uncertainty in relation to risk and benefit assessments in the context of food safety. Much of the research has been conducted with or for EFSA, often as a member of an EFSA working group. This has involved a diverse range of activities and applications; the common thread is quantification of uncertainty using statistical modelling and reasoning. The research has been published in academic journals and by EFSA in its own journal. Examples of relevant academic journal articles are [R1] [R2] and [R3].
Publications in the EFSA journal are of several different kinds: scientific assessments in relation to risk (or sometimes benefit) of potential policy changes/actions, commissioned by EU state risk managers; guidance documents developed to influence the conduct of assessments; scientific opinions combining review of the relevant scientific literature with evaluation of how science can be brought to bear on areas of EFSA activity. Any of these may require development of new methodology or novel analysis and interpretation of data, and this may be included in the main text or appendices of the resulting publication.
Craig’s topic-specific research, started in the EFSA context and contributing to impact, includes:
Bayesian statistical modelling of pesticide residues in food, probabilistic modelling of dietary exposure to pesticides and assessment of cumulative risk from multiple pesticides in human diet (see part 4 of this case study).
Statistical methods in ecotoxicological risk assessment, beginning with an EFSA Scientific Opinion for aquatic species [R4], and continuing in academic journal articles (for example [R1] and [R2]) and later EFSA scientific opinions for other species groups.
Statistical analysis and reasoning for dermal exposure to pesticides. The research is contained in [R5], an appendix to the EFSA guidance document, written in collaboration with Dr Guillot of EFSA, and also in the report of an EFSA funded research project investigating the applicability of in silico methods of predicting dermal absorption.
Craig’s research on generic methodology and tools for quantifying uncertainty in risk (or benefit) assessments started with research projects ([R3] was one product) funded by the UK Food Standards Agency (UK FSA) and led to membership of the EFSA working group developing a new guidance document on the assessment of uncertainty to cover all areas of EFSA’s activity. The first EFSA output was a scientific opinion on the underlying science and principles [R6] which led, after a public consultation phase, to a new guidance document (see part 4 of this case study). The opinion and guidance document provide a coherent approach to uncertainty analysis focussed on the actual uncertainty of conclusions provided to EU risk managers and rooted fundamentally in the subjective Bayesian approach to uncertainty and statistical inference, including generalisation to imprecise probabilities. The opinion and guidance document also include a novel method for applying probability bounds analysis to imprecise probabilities describing both uncertainty and variability, intended to allow assessors to provide useful partial probability statements to risk managers based on extremely limited probability judgements obtained by expert knowledge elicitation. The key principle of the guidance is that the analysis should focus on the uncertainty in conclusions/recommendations provided to decision-makers, i.e. uncertainty about reality rather than uncertainty about parameters in models.
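To make the flavour of probability bounds analysis concrete, the following is a minimal illustrative sketch, not the method specified in [R6] or the guidance (which combines uncertainty and variability via probability bounds on full distributions); the events, interval judgements and numbers here are invented for illustration only.

```python
# A toy probability-bounds calculation with imprecise (interval) probabilities.
# Two adverse conditions A and B have only interval judgements elicited from experts.
pA = (0.10, 0.30)   # hypothetical elicited bounds on P(A)
pB = (0.20, 0.40)   # hypothetical elicited bounds on P(B)

# With no assumption about dependence, the Frechet bounds give all that can be
# said about P(A and B):
lower_frechet = max(0.0, pA[0] + pB[0] - 1.0)
upper_frechet = min(pA[1], pB[1])

# If A and B are judged independent, the bounds tighten to the product interval:
lower_indep = pA[0] * pB[0]
upper_indep = pA[1] * pB[1]

print(f"P(A and B), no dependence assumption: [{lower_frechet:.2f}, {upper_frechet:.2f}]")
print(f"P(A and B), assuming independence:    [{lower_indep:.2f}, {upper_indep:.2f}]")
```

The point of the sketch is that a usable bound on a combined probability can be reported even when experts are only willing to give interval judgements, and that stating (or withholding) a dependence assumption visibly changes how tight those bounds are, which is the kind of partial probability statement the guidance aims to pass to risk managers.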
In summary, Craig’s research has focussed on statistical tools for quantification of uncertainty in risk assessments with a particular emphasis on issues relevant to food safety.
3. References to the research
[R1] Craig, P.S., Hickey, G.L., Luttik, R. and Hart, A. (2012), Species non‐exchangeability in probabilistic ecotoxicological risk assessment. Journal of the Royal Statistical Society: Series A (Statistics in Society), 175: 243-262. https://doi.org/10.1111/j.1467-985X.2011.00716.x
[R2] Hickey, G.L., Craig, P.S., Luttik, R. and de Zwart, D. (2012), On the quantification of intertest variability in ecotoxicity data with application to species sensitivity distributions. Environmental Toxicology and Chemistry, 31: 1903-1910. https://doi.org/10.1002/etc.1891
[R3] Boobis A, Flari V, Gosling JP, Hart A, Craig P, Rushton L & Idahosa-Taylor E (2013). Interpretation of the margin of exposure for genotoxic carcinogens – Elicitation of expert knowledge about the form of the dose response curve at human relevant exposures. Food and Chemical Toxicology, 57, pp 106-118. https://doi.org/10.1016/j.fct.2013.03.003
[R4] EFSA (2005). Opinion of the Scientific Panel on Plant health, Plant protection products and their Residues on a request from EFSA related to the assessment of the acute and chronic risk to aquatic organisms with regard to the possibility of lowering the uncertainty factor if additional species were tested. EFSA Journal, 301, pp 1-59. https://doi.org/10.2903/j.efsa.2006.301
[R5] Appendix B — Statistical Analysis of EFSA (European Food Safety Authority), Buist, H, Craig, P, Dewhurst, I, Hougaard Bennekou, S, Kneuer, C, Machera, K, Pieper, C, Court Marques, D, Guillot, G, Ruffo, F and Chiusolo, A, 2017. Guidance on dermal absorption. EFSA Journal 2017; 15(6):4873, 60 pp. https://doi.org/10.2903/j.efsa.2017.4873
[R6] EFSA Scientific Committee, Benford, D, Halldorsson, T, Jeger, MJ, Knutsen, HK, More, S, Naegeli, H, Noteborn, H, Ockleford, C, Ricci, A, Rychen, G, Schlatter, JR, Silano, V, Solecki, R, Turck, D, Younes, M, Craig, P, Hart, A, Von Goetz, N, Koutsoumanis, K, Mortensen, A, Ossendorp, B, Germini, A, Martino, L, Merten, C, Mosbach‐Schulz, O, Smith, A and Hardy, A, 2018. Scientific Opinion on the principles and methods behind EFSA's Guidance on Uncertainty Analysis in Scientific Assessment. EFSA Journal 2018;16(1):5122, 235 pp. https://doi.org/10.2903/j.efsa.2018.5122
Scientific opinion [R4], including the technical appendix, was part of Durham’s RAE 2008 submission. It is known that it was rated at least two star by the RAE panel as all articles in the submission received at least two stars. In a testimonial for RAE 2008, Professor Hardy (then Chair of EFSA’s Plant Protection Products and their Residues Panel) wrote “Dr Craig was an ad hoc expert advising the PPR panel of EFSA. He was sole author of the technical statistical appendix, contributed most of the statistical methodology (including all of section 5 other than 5.2.2) and was substantially involved in writing the main text of the opinion”. The research combined mathematics, statistical modelling and data analysis to show, for risk assessment for the ecotoxicological effects of pesticides on aquatic species, the potential benefit to both risk managers and applicants for pesticide registration of encouraging the testing of more species than required in legislation by adjusting the statistical calculation used. It also showed how to improve the stability of a standard method by using Bayesian hierarchical modelling.
[R1] and [R2] are journal articles expanding on issues in relation to ecotoxicological risk assessment: non-exchangeability of species, identified and partially addressed in [R4], and intertest variability, which had previously been omitted from consideration.
[R5] details data analyses, statistical modelling and statistical predictions intended to support the setting in the guidance document of default values to be used in the absence of in vitro data. The appendix also provides data analysis to support the method proposed in the guidance for addressing uncertainty due to limited sample size for in vitro data.
[R3] exemplifies the use of expert knowledge elicitation in the context of assessing uncertainty about the risk from exposure to genotoxic carcinogens.
[R6] surveys the science required for uncertainty analysis, bringing it together in an approach containing several novel elements: emphasis on overall uncertainty of conclusions and recommendations, use of imprecise probability theory as a key mathematical and practical tool, and probability bounds analysis for simple analysis of uncertainty about outputs of models combining variable quantities. In a testimonial for REF 2021, Professor Hardy (Chair of the EFSA Scientific Committee at the time of developing [R6] and the guidance on uncertainty) confirmed that Craig was the primary provider of input on mathematical and statistical methodology and the principles behind use of probability and imprecise probability to quantify expert judgement, the lead drafting author for quantitative sections of [R6] including appendices, and the sole drafting author for Appendix B.13 Probability Bounds Analysis.
4. Details of the impact
The impact of Craig’s work should be seen in the context of the critical role played by the European Food Safety Authority (EFSA), the European Union (EU) agency established in 2001 by the European Parliament and Council to develop and manage risk assessment policy in the EU regarding food and animal feed safety. EFSA’s scientific advice helps to protect consumers, animals and the environment from food-related risks. EFSA publications are used as the basis for policy making and implementation by the European Commission and Parliament and by individual EU member states. By contributing to a step-change in the way that EFSA addresses uncertainty, Craig’s research is crucial to decisions that have serious consequences for the health of people, animals and the environment.
In food safety risk assessment, scientists must assess, for example, the safety of a new food or pesticide, or the risk posed by food-borne bacteria. As evidence or knowledge is always incomplete, it is important to explain how uncertainty may affect conclusions and the implications for decision-making. In quantifying these uncertainties, Craig’s methodology and tools have significantly improved EFSA’s assessments of risk and benefit in relation to food production, packaging and consumption.
One EFSA role is overall supervision of EU risk assessment for pesticides. In that role, EFSA produces and publishes risk assessment peer reviews of individual pesticides which support decisions made by individual EU member states in response to applications from the pesticide industry for authorisation for use. Risk assessment for pesticides covers the risk to humans from eating food containing pesticides, the risk to humans from exposure via the skin (for example when spraying pesticides or working with plants which have been sprayed) and the risk to other species exposed due to the use of pesticides in agricultural production.
In the period for consideration of impact for REF 2021, the ecotoxicology sections of EFSA peer review reports for five pesticides used the geometric mean approach proposed and justified in [R4]. For example, [E1] refined the acute risk assessment for fish species in this way for the pesticide acrinathrin. The benefit of the geometric mean approach was that applicants for authorisation were encouraged to provide test results for more species than required by legislation. Previously, the test result for the most sensitive species would be used, which was effectively a disincentive to test additional species. The benefit for applicants of the changed approach was knowledge that the geometric mean would be used, thereby making authorisation more likely, while the reliability of the assessment benefited from the greater knowledge provided by additional test results.
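As a purely numerical illustration of why the change matters (the endpoint values below are invented and do not come from any peer review), the geometric-mean approach uses all submitted species tests rather than only the most sensitive one:

```python
# Contrast the "most sensitive species" rule with the geometric-mean approach
# justified in [R4], using hypothetical acute fish toxicity endpoints.
import numpy as np

lc50_ug_per_l = np.array([12.0, 35.0, 60.0, 110.0])    # invented LC50s for four tested species

most_sensitive = lc50_ug_per_l.min()                   # previous practice: take the lowest endpoint
geometric_mean = np.exp(np.log(lc50_ug_per_l).mean())  # [R4]: geometric mean across tested species

print(f"most sensitive species endpoint: {most_sensitive:.1f} ug/L")
print(f"geometric mean endpoint:         {geometric_mean:.1f} ug/L")
# Testing extra species can only keep or lower the minimum, whereas the
# geometric mean uses all of the data, so applicants are not penalised for
# submitting additional test results.
```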
The EFSA 2018 [E5] guidance document on dermal absorption (for pesticides) made two changes to guidance on the basis of research reported in Appendix B: (i) to the calculation to allow for uncertainty about absorption on the basis of limited in vitro data; (ii) to the default values to be used in the absence of in vitro data for the pesticide, to address the uncertainty arising from the absence of the data. The first of these changes means that uncertainty due to limited sample size is now taken properly into account in the assessment, and the second means that the new default values are based on a transparent analysis of a large dataset taking into account uncertainty about absorption for an untested pesticide. In May 2018, the EU Standing Committee on Plants, Animals, Food and Feed recommended [E6] that the guidance be used, from August 2018, by applicants for authorisation and by EU member states in peer reviews of pesticides. The result is improved treatment of uncertainty and greater transparency about the basis for decisions.
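The exact calculation is specified in the guidance itself; purely as a generic illustration of the kind of small-sample uncertainty adjustment involved, a one-sided upper confidence limit on mean dermal absorption from a handful of in vitro measurements might look as follows (hypothetical data, a standard t-based limit, not the EFSA formula):

```python
# Hypothetical in vitro dermal absorption percentages for one pesticide
# formulation; the values and the simple t-based upper limit are illustrative
# only and do not reproduce the calculation in the guidance document.
import numpy as np
from scipy import stats

absorption_pct = np.array([3.1, 4.8, 2.6, 5.5])   # small in vitro sample (n = 4)

n = absorption_pct.size
mean = absorption_pct.mean()
se = absorption_pct.std(ddof=1) / np.sqrt(n)

# one-sided 95% upper confidence limit on the mean absorption
upper95 = mean + stats.t.ppf(0.95, df=n - 1) * se
print(f"sample mean: {mean:.2f}%, 95% upper confidence limit: {upper95:.2f}%")
# With only four measurements the upper limit sits well above the sample mean,
# which is the sense in which limited sample size feeds into the assessment.
```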
Guidance document [E2] provides explicit direction on how to carry out uncertainty analysis in scientific assessments. The guidance was adopted by the EFSA Scientific Committee in December 2017 [E3a] and is applicable to all areas of EFSA’s work [E3b]. The most significant new feature of this guidance is the requirement that assessors should assess the overall impact of uncertainty on conclusions and that they should express the uncertainty using the mathematical language of probability. This new approach is fundamentally rooted in the subjectivist view of probability and associated methodology, areas of Craig’s research expertise. During development of the guidance, Craig was one of two lead drafting authors and had responsibility for the quantitative sections [E3b].
Following the adoption of the guidance, EFSA created a standing cross-cutting Working Group (WG) on Uncertainty [E4] and Craig has been a member since its inception. As well as experts in uncertainty, the cross-cutting WG has members from four of EFSA’s eight Scientific Panels and is mandated to support the Panels in applying the guidance in their outputs. Examples of EFSA outputs following the new guidance and containing substantial uncertainty analysis are:
the scientific opinion of the EFSA Panel on Nutrition, Novel Foods and Food Allergens on dietary reference values for sodium [E7], produced at the request of the European Commission. Sodium is an important element in human diet but also a source of health risk, and both risk and benefit are uncertain. The scientific opinion provides a transparent account and quantification of uncertainties affecting the conclusions as required by [E2].
the EFSA Panel on Plant Health Guidance on quantitative pest risk assessment [E9] which is used routinely by the Panel in subsequent assessments of risk of entry, spread and consequent environmental and economic damage for plant and plant-based pests.
the scientific opinion of the EFSA Panel on Biological Hazards on control options for Campylobacter in broilers at primary production [E10], produced at the request of the European Commission. Campylobacter is a significant contributor to food-poisoning resulting from consumption of contaminated poultry.
EFSA Scientific Reports on cumulative risk from presence of multiple pesticides in human diet, produced by EFSA in the context of the EU regulations requiring that (a) decisions about the maximum levels permitted in food should take into account cumulative effects as and when methods for doing so become available and (b) pesticides should have no harmful effects – including cumulative effects – on humans. The first such report is [E8]. Previously, risk to consumers from the presence of pesticide residues in food was only assessed substance by substance despite the possibility of multiple substances contributing cumulatively to a particular harmful effect.
Uncertainty assessment requires expert training both for assessors and for decision-makers who use assessments. To support implementation of the uncertainty guidance, EFSA commissioned a series of training courses (for example [E11]). Craig was one of two tutors who prepared and delivered the training for four courses on general application of the guidance, delivered to EFSA scientific staff and external members of the ten scientific panels, and three courses each tailored to the needs of a single scientific panel and attended by all panel members and staff from related EFSA units. Each 1.5-2 day course was attended full-time by 25-30 scientists. Craig was a facilitator at a two-day 2017 EFSA workshop reviewing 13 case studies covering a wide range of EFSA activities and conducted during a one-year trial period for the draft guidance. Participants were relevant risk managers (decision-makers) from the European Commission and the scientific experts who worked on each case study.
Further impacts derive from the influence of EFSA outputs on other international groups involved in food safety, e.g. the US Environmental Protection Agency, the Organisation for Economic Cooperation and Development (OECD), the Joint Food and Agriculture Organization of the United Nations (FAO)/World Health Organization (WHO) Meeting on Pesticide Residues, and the WHO. During the period for consideration of impact for REF 2021 and following from his work with EFSA: (i) Craig joined an OECD working group on international harmonisation in relation to dermal absorption of chemicals; (ii) Craig delivered training on uncertainty to the EU Joint Research Centre, the EU Scientific Committee on Health Environment and Emerging Risks, the UK Food Standards Agency and Finland’s food safety agency EVIRA.
5. Sources to corroborate the impact
European Food Safety Authority, 2013. Conclusion on the peer review of the pesticide risk assessment of the active substance acrinathrin. EFSA Journal 2013; 11(12):3469, 82 pp. https://doi.org/10.2903/j.efsa.2013.3469.
EFSA (European Food Safety Authority) Scientific Committee et al, 2018. Guidance on Uncertainty Analysis in Scientific Assessments. EFSA Journal 2018;16(1):5123, 39 pp. https://doi.org/10.2903/j.efsa.2018.5123
Evidence in relation to [E2]: (a) minutes of EFSA Scientific Committee recording adoption of the guidance; (b) testimonial letter from Professor Tony Hardy confirming Craig's role in the development of [E2] and [R6].
EFSA (European Food Safety Authority) et al, 2017. Guidance on dermal absorption. EFSA Journal 2017; 15(6):4873, 60 pp. https://doi.org/10.2903/j.efsa.2017.4873
EFSA (European Food Safety Authority) et al, 2018. Technical report on the outcome of the pesticides peer review meeting on general recurring issues in mammalian toxicology. EFSA supporting publication 2018: 15(9):EN‐1485. 11 pp. https://doi.org/10.2903/sp.efsa.2018.EN-1485
EFSA (European Food Safety Authority) et al, 2019. Scientific Opinion on the dietary reference values for sodium. EFSA Journal 2019;17(9):5778, 191 pp. https://www.efsa.europa.eu/en/efsajournal/pub/5778
EFSA (European Food Safety Authority) et al, 2020. Scientific report on the cumulative dietary risk characterisation of pesticides that have chronic effects on the thyroid. EFSA Journal 2020;18(4):6088, 71 pp. https://efsa.onlinelibrary.wiley.com/doi/10.2903/j.efsa.2020.6088
EFSA (European Food Safety Authority) et al, 2018. Guidance on quantitative pest risk assessment. EFSA Journal 2018;16(8):5350, 86 pp. https://efsa.onlinelibrary.wiley.com/doi/10.2903/j.efsa.2018.5350
EFSA (European Food Safety Authority) et al, 2020. Update and review of control options for Campylobacter in broilers at primary production. EFSA Journal 2020;18(4):6090, 89 pp. https://www.efsa.europa.eu/en/efsajournal/pub/6090
- Submitting institution
- University of Durham
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Technological
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Durham University’s Statistics group have produced cutting-edge models, methods and software for estimating radiation doses in those exposed in a radiological incident. Their work involved developing statistical methodology that goes beyond traditional procedures, specifically methodology for overdispersed count data biomarkers. The procedures developed are referenced in an ISO standard and incorporated into the emergency response plans under RENEB, a European network specialising in radiation dose estimates. The team have developed an Uncertainty Quantification (UQ) framework for the innovative gamma-H2AX protein biomarker – enabling quicker triage compared to other biomarkers, and hence improved preparedness in case of a mass exposure scenario – which has been adopted by Public Health England (PHE) into the standard operating procedures for its commercial biodosimetry unit.
2. Underpinning research
Following a radiation incident involving exposure or suspected exposure of individuals to ionizing radiation, a fast and reliable assessment of the received dose is essential for effective diagnosis and treatment. Such an assessment is possible through biomarkers, which allow the received dose to be deduced from the damage which the radiation has caused inside human blood cells. The ‘gold standard’ for this purpose, based on counts of dicentric chromosomes, currently has a total global capacity of a few thousand samples a week, which would clearly be inadequate in the case of a large-scale nuclear accident. Recently, alternative biomarkers based on proteins (specifically, the phosphorylated gamma-H2AX histone) have been developed. While biomarkers of this new generation are considerably quicker to process and allow for larger throughput, no statistical routines were available (prior to our work) to infer the dose estimate and its uncertainty from the biomarker measurement.
In this context, the contribution of the research underpinning this case study is twofold:
Publications [R1-R3] develop and discuss, in the context of a wide range of cytogenetic and biomolecular radiation biomarkers, modelling strategies for a correct representation of the uncertainty inherent in such biomarkers. Specifically, publications [R1, R3] demonstrate that a commonly made assumption (that of equidispersion, or mean = variance, underlying the Poisson distribution) is wrong or misleading for most biomarkers apart from a few idealized scenarios, and recommend specific alternative models to be used when this assumption is violated. While a misspecification of this distributional assumption has only a minor impact on the dose estimates themselves, it has a strong impact on the uncertainty assessment (a minimal numerical illustration of such a dispersion check is sketched after the second point below). Publication [R2] critically assesses the state of the art in uncertainty quantification for radiation biomarkers; specifically, it shows that ignoring this uncertainty can lead to incorrect conclusions (triage classifications, treatments, etc.), with potentially severe consequences for the individuals concerned.
Our recent work [R4], building on exploratory work in [R3], focuses on the development of statistical methodology for dose estimation and uncertainty quantification for the innovative gamma-H2AX assay, which has so far rarely been used in laboratories due to the lack of available software and agreed standards. The H2AX histone is involved in DNA repair: once a cell is exposed to ionizing radiation and a double-strand break has occurred, it coordinates the repair of the damaged DNA and in this process is phosphorylated, becoming gamma-H2AX. This phosphorylation leads, after addition of fluorophore-labelled antibodies, to fluorescent dots which can be counted under a microscope, and then related to radiation dose via statistical models. The specific challenge in the implementation of the gamma-H2AX biomarker is the presence of multiple types of uncertainty (inter- and intra-individual variation, inter-laboratory variation, dependence on factors such as temperature at irradiation, scorer, shipment, etc.), resulting in a large overall uncertainty in comparison to cytogenetic biomarkers. The work in [R4] develops statistical methodology to take the different layers of uncertainty into account, and comes with a web applet which implements this methodology (http://shinur.unirioja.es/apps/h2axDE/). It is shown that, even if the uncertainty is large but quantifiable, the biomarker is still useful for triage purposes, for instance to distinguish a severely irradiated individual requiring urgent medical treatment from a ‘worried well’ (a simplified sketch of such an inverse dose estimate, with delta-method uncertainty propagation, is given below).
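As a minimal numerical sketch of the dispersion check referred to in the first point above (and not the published analyses in [R1] or [R3], which recommend specific alternatives such as zero-inflated and random-effects models), one can fit a Poisson calibration curve and inspect the Pearson dispersion statistic; the data and the simple quasi-Poisson style correction below are invented for illustration only:

```python
# Illustrative dispersion check for a count-data calibration curve, assuming a
# log-linear Poisson model in statsmodels; all numbers are hypothetical.
import numpy as np
import statsmodels.api as sm

dose = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0, 4.0])       # dose in Gy
counts = np.array([4, 9, 21, 55, 180, 340, 520])             # aberrations observed
cells = np.array([2000, 2000, 1500, 1000, 1000, 500, 250])   # cells scored

X = np.column_stack([np.ones_like(dose), dose, dose**2])
fit = sm.GLM(counts, X, family=sm.families.Poisson(), exposure=cells).fit()

# Under the Poisson (equidispersion) assumption this ratio should be close to 1;
# values well above 1 signal overdispersion.
phi = fit.pearson_chi2 / fit.df_resid
print("dispersion estimate:", round(phi, 2))

# Quasi-Poisson style correction: inflate standard errors by sqrt(phi), widening
# the uncertainty statements while leaving the fitted curve largely unchanged.
print("naive Poisson SEs:       ", fit.bse.round(3))
print("dispersion-adjusted SEs: ", (fit.bse * np.sqrt(max(phi, 1.0))).round(3))
```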
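The following is a simplified sketch of the inverse step, turning an observed gamma-H2AX foci rate into a dose estimate with delta-method uncertainty; it assumes a straight-line calibration curve with hypothetical coefficients and ignores the additional variance layers (inter-individual, inter-laboratory, scorer) that the framework in [R4] and the ‘DoseEstimateH2AX’ applet handle:

```python
# Toy inverse dose estimation from a fitted linear calibration curve, with
# delta-method variance propagation; all numbers are invented.
import numpy as np

# hypothetical fitted calibration: mean foci per cell = a + b * dose (Gy)
a, b = 0.8, 5.2                        # fitted intercept and slope
cov_ab = np.array([[0.04, -0.01],      # covariance matrix of (a, b) from the fit
                   [-0.01, 0.09]])

y_obs, var_y = 9.5, 0.50               # observed mean foci per cell and its variance

# point estimate of dose by inverting the calibration curve
dose_hat = (y_obs - a) / b

# delta method: gradient of dose = (y - a)/b with respect to (y, a, b)
grad = np.array([1.0 / b,              # d(dose)/dy
                 -1.0 / b,             # d(dose)/da
                 -(y_obs - a) / b**2]) # d(dose)/db

# joint covariance of (y, a, b); the new observation is independent of the fit
cov = np.zeros((3, 3))
cov[0, 0] = var_y
cov[1:, 1:] = cov_ab
var_dose = grad @ cov @ grad

print(f"estimated dose: {dose_hat:.2f} Gy")
print(f"approx. 95% interval: {dose_hat - 1.96*np.sqrt(var_dose):.2f} "
      f"to {dose_hat + 1.96*np.sqrt(var_dose):.2f} Gy")
```

Even this stripped-down version makes the key point: the reported dose comes with a quantified uncertainty interval rather than a bare number, which is what enables principled triage decisions.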
3. References to the research
In the publications below, contributions of authors belonging to the Department of Mathematical Sciences at Durham University are given in square brackets.
[R1] Oliveira, María [40%], Einbeck, Jochen [30%], Higueras, Manuel, Ainsbury, Elizabeth, Puig, Pedro & Rothkamm, Kai (2016). Zero-inflated regression models for radiation-induced chromosome aberration data: A comparative study. Biometrical Journal 58: 259-279, http://dro.dur.ac.uk/17160/, https://doi.org/10.1002/bimj.201400233
[R2] Ainsbury, Elizabeth A., Higueras, Manuel, Puig, Pedro, Einbeck, Jochen [10%], Samaga, Daniel, Barquinero, Joan F., Barrios, Lleonard, Brzozowska, Beata, Fattibene, Paola, Gregoire, Eric, Jaworska, Alicija, Lloyd, David, Oestreicher, Ursula, Romm, Horst, Rothkamm, Kai, Roy, Lawrence, Sommer, Sylwester, Terzoudi, Georgia, Thierens, Hubert, Trompier, Francois, Vral, Anne & Woda, Clemens (2017). Uncertainty of fast biological radiation dose assessment for emergency response scenarios. International Journal of Radiation Biology 93: 127-135, http://dro.dur.ac.uk/19789/, https://doi.org/10.1080/09553002.2016.1227106
[R3] Einbeck, Jochen [50%], Ainsbury, Elizabeth, Barnard, Stephen, Oliveira, Maria [10%], Manning, Grainne, Puig, Pere & Badie, Christophe (2017). On the Use of Random Effect Models for Radiation Biodosimetry. In: Extended Abstracts Fall 2015. Ainsbury, E., Calle, M., Cardis, E., Einbeck, J., Gómez, G. & Puig, P., Research Perspectives CRM Barcelona, Springer. 7: 89-94, http://dro.dur.ac.uk/21801/, https://doi.org/10.1007/978-3-319-55639-0_15
[R4] Einbeck, Jochen [40%], Ainsbury, Elizabeth, Sales, Rachel [20%], Barnard, Stephen, Kaestle, Felix, and Higueras, Manuel (2018): A statistical framework for radiation dose estimation and uncertainty quantification from the gamma-H2AX assay. PloS One 13(11): e0207464, http://dro.dur.ac.uk/27000/, https://doi.org/10.1371/journal.pone.0207464
This body of work was instigated by a GBP29,000 NIHR grant ‘Random effects modelling for radiation biodosimetry’ (Principal investigator: Einbeck; postdoctoral researcher: Oliveira; run time: Feb-Dec 2014, Ref. NIHR-RMOFS-2013-03-04; panel feedback: ‘A strong application that brings a talented researcher into the medical field. The project innovates by transferring known models into a new area. The research plan was plausible for the timescale and will result in new collaborations’.) This work resulted in [R1] and led to an invitation to the Centre de Recerca Matematica (Barcelona) in November 2015 as a Visiting Researcher, where [R3] was produced. The work on [R4] was supported by Horizon 2020 COST Action IC1408 ‘Computationally-intensive methods for the robust analysis of non-standard data (CRoNoS)’. All publications involve co-authors from Public Health England; [R2] and [R4] involve co-authors from at least two public health institutions inside and outside the UK. Publications [R1] and [R4] appeared in journals ranked Q1 on Scimago.
4. Details of the impact
Practice and policy impact
The work in [R1] to [R4] shaped guidelines and procedures concerning the choice of models and the quantification of uncertainty for count data biomarkers. In the words of the letter from Public Health England, ‘the methods … have been incorporated into the retrospective dosimetry elements of the EU radiation emergency response plans under the RENEB network’ [E1]. RENEB (Running the European Network of Biological and retrospective Physical dosimetry) is a major network of public health organizations and research institutes/laboratories funded by the 7th EU framework EURATOM Fission Programme [E1, E2], with the mission to provide ‘rapid, comprehensive and standardised methodology for individualised dose estimation in case of large-scale radiological events in Europe and beyond’ (http://www.reneb.net/). As a further activity linked to RENEB, ‘as a collaboration between Universitat Autònoma de Barcelona (UAB), Bundesamt für Strahlenschutz (BfS), Durham University (DU), Institut de Radioprotection et de Sûreté Nucléaire (IRSN), Universidad de la Rioja (UdR), and Public Health England (PHE)’ [E4], dosimetry software for multiple biomarkers, including the code produced for [R1], has been developed into a ‘BioDose Tools’ package [E2]. This tool not only provides an easy-to-use web-applet (available at https://aldomann.shinyapps.io/biodosetools-v3/) for biological dosimetry laboratories worldwide [E8], but also serves to ‘simplify and standardize uncertainty estimation in biological dosimetry’, hence making an important contribution to the ‘international community of biological dosimetry’ [E2].
In ISO standard 20046:2019(en) [E3], Subsection 12.1.3, [R2] is given as the reference for methods to derive confidence limits for overdispersed count data. That subsection discusses how to correctly derive the sampling distribution of the biomarker count under violation of the Poisson assumption, and, exclusively referring to our work, which distributions should be employed by laboratories in this case. Standards play a crucial role in the effort to ‘harmonize the procedure of biological and retrospective physical dose assessments’, and ISO standards are for instance used by RENEB for ‘certification of the laboratory/department/institute’ or ‘accreditation/certification of one or several techniques according to standard ISO’ (Gregoire et al, 2016).
While the technological feasibility of the gamma-H2AX histone as a radiation biomarker was established in the relevant literature about a decade ago, its practical implementation by laboratories had so far been hampered by a lack of methodology to transfer the biomarker measurement into a dose estimate. Our promising initial results [R3] using H2AX biomarker data, which were previously collected by PHE but had been left unpublished (due to lack of an obvious methodology to analyse them), convinced PHE that this was an area requiring further resource and research, and led ‘to a revised assessment of dose estimation techniques and biomarkers’ [E1]. One of the actions arising from this was a research visit of Felix Kaestle from the German Federal Office for Radiation Protection (Bundesamt für Strahlenschutz; BfS) to PHE in the summer of 2017 in order to carry out a substantial experimental study using this biomarker [E1]. These data, along with other existing data sets, were used at DU to devise a new methodological framework for radiation dose estimation from this assay, including quantification of uncertainty [R4]. This methodology is not yet part of the BioDose Tools package, but we have produced an applet ‘DoseEstimateH2AX’ alongside publication [R4] which has been available online since November 2018. The methods are now included in PHE’s standard operating procedures [E1]. The letter [E1] also states that this collaboration had been ‘hugely important and influential’ and of ‘direct benefit’ to PHE.
We are in contact, through LD-RadStatsNet, the BioDose project and RENEB, with essentially every laboratory or public health institution in Europe which is able to carry out gamma-H2AX-foci analyses (many of these co-authored [R2]). The reach of our methodology across Europe is therefore close to 100%. The letter by the Bundesamt für Strahlenschutz [E2] exemplifies the significance of our work to public health institutions, and its explicit mention in a NATO report [E7] demonstrates its visibility.
Societal and economic impact
This body of work has contributed to an increased appreciation in the field that dose estimation from biomarkers requires sophisticated statistical methodology [E1, E5]. Specifically, [R1] has raised awareness that the inferences obtained from cytogenetic radiation biomarkers will depend on the assumed response distribution, and, by extension, that the resulting dose estimates carry uncertainties, which can lead to incorrect triage [R2], hence urgently requiring methods to quantify these uncertainties [E1, E5]. However, towards the middle of the last decade, there was a clearly identifiable skills gap in the field of dosimetry, with ‘the proportion of individuals with formal mathematical and statistical training [in laboratories or public health institutions] relatively low’ [E5]. PHE and DU were involved in several joint activities to address this skills gap, including the organization of a workshop aimed at statisticians (2015, Barcelona) [E5, E6] and the creation of a network (LD-RadStatsNet) [E5, E6], which in turn was also involved in planning training activities [E6]. These activities have contributed to establishing biodosimetry as a statistical discipline, as evidenced for instance by a dedicated session ‘Statistical Methods in Radiation Research’ at the CMStatistics 2018 conference (http://www.cmstatistics.org/CMStatistics2018/), and several students working on (statistical) PhD projects in this field (some based at public health institutions, such as BfS, and some in academia, including DU), thereby producing a generation of skilled researchers in statistical biodosimetry.
According to [E1], the methods developed in the framework of the collaborative work with PHE ‘are now used by PHE for commercial biological dose estimation’ in PHE’s Cytogenetics and Pathology Group, which runs the UK’s commercial Chromosome Dosimetry Service (https://www.phe-protectionservices.org.uk/cds/). This unit has responsibility for emergency preparedness in retrospective biological dosimetry for triage purposes in the case of a large-scale radiation accident or incident.
The impact of this work, especially [R4], on the general public is better preparedness in the case of a mass radiation casualty scenario, by providing methodology for the production of meaningful dose estimates and uncertainty bounds, hence enabling effective triage in a much shorter time than through previously existing methods. While the full impact of this innovation will hopefully never be realised, recent events have shown that emergency response preparedness, and in particular testing capability, is vitally important. This holds both for the biomedical technology on which such diagnostics are built, and for the computational and statistical routines which give meaning to, and allow principled decisions based on, the raw results of the biomarker. The importance of statistical biodosimetry for emergency response preparedness is, for instance, highlighted in [E8] (page 19, 1st column, of the 2018/19 report), referring to potential large-scale irradiation scenarios caused by terror attacks or nuclear weapons.
References
Gregoire E., et al. (2016). The harmonization process to set up and maintain an operational biological and physical retrospective dosimetry network: QA QM applied to the RENEB network, International Journal of Radiation Biology, 93, 81-86, https://www.tandfonline.com/doi/full/10.1080/09553002.2016.1206232
5. Sources to corroborate the impact
E1. Letter from Public Health England (Dr Elizabeth Ainsbury), dated 18/07/2019
E2. Letter from Bundesamt für Strahlenschutz (Dr Ulrike Kulka), dated 12/12/2019
E3. ISO Standard 20046:2019(en), Radiological protection — Performance criteria for laboratories using Fluorescence In Situ Hybridization (FISH) translocation assay for assessment of exposure to ionizing radiation, page 19.
E4. BioDoseTools contributors, Screenshot from https://rdrr.io/github/biodosimetry-uab/biodosetools/f/inst/app/www/contributors_app.md
E5. LDRadStats-2015 final meeting report, including workshop programme, from melodi-online.eu
E6. AIR2 bulletin, including evidence for the creation of a network (page 4), also citing [R1]
E7. North Atlantic Treaty Organization (NATO), STO Technical Report TR-HFM-222, Biological Effects of Ionising Radiation, page 1-9, also citing [R1]
E8. Annual reports of the Bundesamt für Strahlenschutz [in German; 2 merged documents]. Annual report 2017/18 cites [R2] on page 61 (relating to content on page 19, 3rd column, where a reference to the ‘Online-Software Tool’ [E4] is also made); Annual report 2018/19 cites [R4] on page 76 (relating to content on page 72, 1st column)