Impact case study database
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Economic
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Pension funds, life insurers and regulators are concerned about the financial consequences of longevity risk: the financial risk that, in aggregate, people live longer than anticipated.
Novel stochastic models for the assessment of longevity risk in life insurance and pensions (the Cairns-Blake-Dowd, or CBD, family), proposed by Cairns and co-authors in two landmark papers, have been adopted by insurers in the UK, France and the US; actuarial consultancies in the UK and Germany; UK and European insurance regulators; specialist software providers; and in professional education.
The CBD models have played a central role in the transfer of tens of billions of pounds of pension liabilities from pension funds to multinational insurers, including the GBP16,000,000,000 transfer of [text removed for publication] pension liabilities in [text removed for publication].
Use of the models provides an improved assessment of the financial consequences of longevity risk. This has enhanced the security of both pension funds and insurers and, through good risk management and regulation, has reduced the risk of insolvency.
2. Underpinning research
Context: Adverse market conditions and other industry developments at the beginning of the 2000s caused pension funds and insurance companies to focus attention on risk assessment and risk management of both assets and liabilities. For pension funds and annuity providers, longevity risk – the risk that, in aggregate, people live longer than anticipated – was identified as a specific risk that needed to be addressed. This led Cairns and co-authors Blake (City University) and Dowd (Durham) to develop a new family of stochastic models for future mortality (now widely known as the CBD family, after the three authors).
Underpinning research: [3.1] introduced what is popularly known as the CBD model, with application to England & Wales mortality data. This paper recognised that, although mortality improvements at different ages are correlated with each other, they are not perfectly correlated. This led to the proposal of a robust two-factor model (both factors being period, or calendar-year, effects) that is particularly suitable for pension funds and life insurance annuity portfolios. The structural simplicity of the model also helped it to gain popularity.
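In the notation later standardised in [3.2], where the original model is labelled M5, it can be written as

\[
\operatorname{logit} q(t,x) \;=\; \log\frac{q(t,x)}{1-q(t,x)} \;=\; \kappa_t^{(1)} + \kappa_t^{(2)}\,(x - \bar{x}),
\]

where \(q(t,x)\) is the probability that an individual aged \(x\) in year \(t\) dies within the year, \(\bar{x}\) is the mean age in the data, and the two period effects \((\kappa_t^{(1)}, \kappa_t^{(2)})\) are modelled as a bivariate random walk with drift.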
Publication of this paper led to a collaborative project with the international bank [text removed for publication]. This collaboration resulted in several journal articles co-authored with members of the [text removed for publication] longevity team, of which the most important were [3.2] and [3.3]. In [3.2], the original CBD model was generalised to the second generation of CBD models, the most popular of which is known as “M7”. M7 included a third period effect. This paper also confirmed earlier research finding that cohort effects linked to year of birth are significant factors in forecasting mortality. Papers [3.2] and [3.3] further highlighted the significance of model risk – the risk associated with excessive reliance on a single model – and the importance of model robustness. The collaboration also led to the development of the LifeMetrics open source software [3.4], the purpose of which was to help potential users gain confidence in the use of the CBD and other stochastic mortality models.
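For reference, the M7 generalisation adds a quadratic age term and a cohort effect; as commonly written,

\[
\operatorname{logit} q(t,x) \;=\; \kappa_t^{(1)} + \kappa_t^{(2)}(x-\bar{x}) + \kappa_t^{(3)}\big((x-\bar{x})^2 - \hat{\sigma}_x^2\big) + \gamma_{t-x},
\]

where \(\hat{\sigma}_x^2\) is the mean of \((x-\bar{x})^2\) over the ages in the data and \(\gamma_{t-x}\) is the cohort effect for the generation born in year \(t-x\).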
The original work has now been extended to modelling of multiple populations (e.g. stratified by socio-economic group) as exemplified by [3.5] and [3.6].
In each of [3.1, 3.2, 3.3], Cairns led the mathematical developments of the models and quantitative analysis.
Pathway to Impact
In 2005, research collaborator Blake organised the first of a successful series of conferences on longevity risk and capital markets that continues to this day. This annual conference is a key meeting place for researchers and practitioners, and acted as a key conduit for Cairns to promote the merits of the CBD family of models to practitioners. Following the 2006 conference and publication of [3.1], Cairns, Blake and Dowd began work with [text removed for publication] to develop the modelling and risk measurement theme as part of their objective to develop the longevity market. This collaboration led to several papers, the most notable of which was Cairns et al. (2009) [3.2].
[3.1, 3.2] are now amongst the most cited papers in actuarial science.
3. References to the research
[3.1] Cairns, A.J.G., Blake, D., and Dowd, K. (2006). A two-factor model for stochastic mortality with parameter uncertainty: Theory and calibration. Journal of Risk and Insurance, 73: 687-718. DOI: 10.1111/j.1539-6975.2006.00195.x
[3.2] Cairns, A.J.G., Blake, D., Dowd, K., Coughlan, G.D., Epstein, D., Ong, A., and Balevich, I. (2009). A quantitative comparison of stochastic mortality models using data from England & Wales and the United States. North American Actuarial Journal, 13: 1-35. DOI: 10.1080/10920277.2009.10597538
[3.3] Cairns, A.J.G., Blake, D., Dowd, K., Coughlan, G.D., Epstein, D., and Khalaf-Allah, M. (2011) Mortality density forecasts: an analysis of six stochastic mortality models. Insurance: Mathematics and Economics, 48: 355-367. DOI: 10.1016/j.insmatheco.2010.12.005
[3.4] Cairns, A.J.G. (2007+updates) LifeMetrics open source software. Originally www.lifemetrics.com. Now available at www.macs.hw.ac.uk/~andrewc/CBD.html
[3.5] Cairns, A.J.G., Blake, D., Dowd, K., Coughlan, G.D., and Khalaf-Allah, M. (2011) Bayesian Stochastic Mortality Modelling for Two Populations. ASTIN Bulletin 41: 29-59. DOI: 10.2143/AST.41.1.2084385
[3.6] Cairns, A.J.G., Kallestrup-Lamb, M., Rosenskjold, C.P.T., Blake, D., and Dowd, K., (2019) Modelling Socio-Economic Differences in Mortality Using a New Affluence Index. ASTIN Bulletin 49: 555-590. DOI: 10.1017/asb.2019.14
4. Details of the impact
Introduction
CBD models [3.1, 3.2] have been used to assess the potential impact of longevity risk on the future solvency of insurance companies writing annuities and of pension funds in the UK, Europe and the US. Use of the models helps these institutions to (a) assess their capital requirements (including regulatory requirements) more accurately, ensuring a secure future for policyholders, and (b) price longevity risk transfers between institutions more accurately. In turn, this gives policyholders and pensioners greater confidence that their promised benefits will be covered for the next 30, 40 or 50 years.
Members of the CBD family are often referred to by practitioners by their “M” numbers, e.g. M5, M7 (see e.g., [5.1, 5.2, 5.3, 5.4]).
Impacts
Software
Growing interest in the CBD family from insurance companies led the specialist actuarial software provider Longevitas to incorporate the CBD family into its Projections Toolkit [5.1]. This software is used by UK and US insurers (e.g. [text removed for publication] [5.2]), and its embedded suite of models facilitates insurance company compliance with guidance from the UK’s Prudential Regulation Authority (PRA) on the use of stochastic mortality models.
Other stakeholders, including [text removed for publication] [5.5] and the European insurance regulator EIOPA [5.6, p 65], made use of StMoMo [5.4], an open-source R package for stochastic mortality modelling, first released in 2015, that builds on [3.4]. A significant proportion of StMoMo is devoted to the CBD family (see [5.4, pages 1, 3, 6-7, 20, 27-35, 39, 49-50]).
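To illustrate how lightweight calibration of the two-factor structure is, the sketch below regresses the logit of crude death rates on centred age, year by year, and treats the fitted period effects as a bivariate random walk with drift. It uses synthetic data and is written in Python for self-containedness (StMoMo itself is an R package); it mimics neither the StMoMo nor the LifeMetrics implementations.

```python
import numpy as np

rng = np.random.default_rng(0)

ages = np.arange(60, 90)
years = np.arange(1990, 2020)
x_bar = ages.mean()

# Synthetic "crude" death rates generated from a known CBD (M5) surface plus noise
true_k1 = -3.0 - 0.02 * np.arange(len(years))          # improving mortality level
true_k2 = 0.10                                          # age slope
eta = true_k1[:, None] + true_k2 * (ages - x_bar)[None, :]
q = 1.0 / (1.0 + np.exp(-(eta + rng.normal(0.0, 0.02, eta.shape))))

# Calibration: for each year, regress logit(q) on centred age
logit_q = np.log(q / (1.0 - q))
k1, k2 = np.empty(len(years)), np.empty(len(years))
for i in range(len(years)):
    k2[i], k1[i] = np.polyfit(ages - x_bar, logit_q[i], 1)   # slope, intercept

# Forecasting: treat (k1, k2) as a bivariate random walk with drift
K = np.column_stack([k1, k2])
drift = np.diff(K, axis=0).mean(axis=0)
print("estimated annual drift in (k1, k2):", drift.round(4))  # about (-0.02, 0.0)
```

Real applications fit to deaths and exposures data with appropriate error models; the point of the sketch is only the two-factor structure.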
Regulation of European Insurers
Insurers in the EU have been subject to the Solvency II regulatory regime since 2016. This governs how much capital insurers are required to hold to cover their future uncertain liabilities. Required capital consists of the market-consistent value (MCV) of the liabilities plus the Solvency Capital Requirement (SCR). The SCR is an additional amount over the MCV to cover the risk that, alongside other risks, mortality rates fall faster than anticipated. The SCR can be calculated using a simple stress test or a stochastic internal model. Larger insurers often prefer the latter, as it allows a more accurate, company-specific analysis of all major risks.
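In outline (a simplification of the Solvency II calibration, not specific to any cited source), the SCR is the one-year 99.5% Value-at-Risk of own funds, and the standard-formula longevity stress revalues liabilities under a permanent fall in mortality rates:

\[
\Pr\big(L > \mathrm{SCR}\big) \le 0.005, \qquad \mathrm{SCR}_{\text{longevity}} = \mathrm{MCV}\big(0.8\,q(t,x)\big) - \mathrm{MCV}\big(q(t,x)\big),
\]

where \(L\) is the loss in own funds over one year and \(\mathrm{MCV}(q)\) denotes the market-consistent value of the liabilities computed from mortality rates \(q\); the 20% factor is the stress whose suitability EIOPA examined (see below).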
Two distinct but related regulatory impacts can be identified.
For insurers using stochastic internal models, the UK’s Prudential Regulation Authority (PRA) recommends use of the CBD family [5.7, p 8] alongside three other model families (with distinct families used to address model risk) to assess capital requirements for longevity risk. Of these families, [text removed for publication] recommends CBD-M9 as one of only two models suitable for a wide age range [5.8, slide 68].
For insurers using the simpler longevity stress test, the CBD model was used by the EU insurance regulator, EIOPA, alongside one other model, to verify the suitability of the 20% longevity stress [5.6, pages 60-73], using the StMoMo software package [5.4]. Responses to the EIOPA consultation (including from institutions in the Netherlands, France and Ireland, and from pan-European professional bodies) indicate that the CBD model is in widespread use across Europe.
The PRA approach to the assessment of model risk and the range of models employed reflects the influence of Cairns et al. (2009) (see e.g. [5.5]). As a result of the PRA guidance, CBD models and the model-risk framework of Cairns et al. (2009) are widely used by insurers in the UK and elsewhere [5.9, 5.10]. Quoting UK and EU insurers, [3.2] “was cited in the governance of our [text removed for publication] Solvency II internal model and helped to mould our approach to the assessment of longevity risk” [5.9], and “has impacted significantly on our work in three ways: their proposal of a number of new and innovative stochastic mortality models; their finding that model risk can be quite significant; and their approach to model selection using multiple criteria” [5.10].
Derisking of Defined Benefit (DB) Pension Funds
Many DB pension funds are seeking to reduce their exposure to longevity and investment risks. Derisking is often achieved, in part, through longevity hedges such as longevity swaps and bulk buy-outs with insurers in the UK or overseas (such as [text removed for publication] [5.2]). Insurers receiving the risk from pension funds use CBD models (a) to assess a best estimate price, (b) to assess the degree of longevity risk embedded within each portfolio of pensions and then (c) to determine a risk premium to be charged for acceptance of the risk. UK insurers (e.g. [5.9]) taking on such risks are then subject to the Solvency II regulations discussed above.
Longevity Risk Transfers
There has been “approximately $200 billion [USD] of longevity risk transfer since 2014” [5.2]. [text removed for publication] states [5.2]: “Over the period 2014-2017, we used the CBD models for pricing all of our global transactions, covering a total transfer of liabilities of approximately $40 billion [USD]. This included the [text removed for publication] longevity transaction [text removed for publication], covering [text removed for publication] of pension liabilities in the [text removed for publication] pension scheme in [text removed for publication]. … The use of the CBD models was central in establishing confidence in the pricing of all of these transactions.”
Actuarial Consultancies
CBD models and their descendants are used by actuarial consultancies in the UK and elsewhere in their advisory and development work [5.5, 5.3]. [text removed for publication] [5.5] states that [3.2] is the go-to, fundamental reference that has brought order and rigour to what had, at times, been a somewhat naïve longevity risk-transfer market. [5.5] also states that [3.5] and [3.6] are fundamental works in helping the firm and its clients understand and model socio-economic mortality differentials.
Professional Education
Recognising the role of CBD models and model risk in longevity risk management, the German Actuarial Association (DAV) recommends [3.2] as part of the reading for its Chartered Enterprise Risk Actuary (CERA) qualification [5.11].
5. Sources to corroborate the impact
[5.1] Longevitas Projections Toolkit software (updates highlighting inclusion of CBD models in versions 2.7.1/3/7), https://www.longevitas.co.uk/site/news/version2.7.7oftheprojectionstoolkit.html
[5.2] Letter of support [text removed for publication].
[5.3] [text removed for publication]. Presentation at the Longevity 14 Conference, Amsterdam, 2018 by [text removed for publication].
[5.4] StMoMo: An R package for Stochastic Mortality Modelling reference manual, https://cran.r-project.org/web/packages/StMoMo/StMoMo.pdf
[5.5] Letter of Support [text removed for publication].
[5.6] EIOPA (2018) (the EU insurance regulator) EIOPA’s second set of advice to the European Commission on specific items in the Solvency II Delegated Regulation. EIOPA-BoS-18/075, https://www.eiopa.europa.eu/sites/default/files/publications/submissions/eiopa-18-075-eiopa_second_set_of_advice_on_sii_dr_review.pdf
[5.7] Prudential Regulation Authority (PRA; part of the Bank of England and responsible for the regulation of UK insurance companies) guidance, “Reflections on the 2015 Solvency II internal model approval process”, https://www.bankofengland.co.uk/prudential-regulation/letter/2016/sam-woods-reflections-on-2015-solvency-ii-model-approved-process
[5.8] [text removed for publication]. Presentation at the International Mortality and Longevity Symposium, London, September 2016 by [text removed for publication].
[5.9] Letter of support from [text removed for publication].
[5.10] Letter of support from [text removed for publication].
[5.11] European Actuarial Academy, CERA Education seminar in Underwriting Risks in Life and Health Insurance. Block 11, DAV CERA Module B: Taxonomy, Modelling and Mitigation of Risks, 2019 (p. 39). Professional teaching materials not available online; pdf provided.
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Economic
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
A reliable electricity supply underpins all aspects of modern economy and society. This study describes how Maxwell Institute research has supported:
- Design of the GB Electricity Capacity Market, which procures capacity to ensure an appropriate level of supply reliability for the whole country. Specific contributions are: how the volume of capacity to procure is determined given uncertainty in the planning background 4 years ahead, including consideration of low wind power availability; and how energy storage’s finite energy capacity should be treated in capacity auctions;
- Revision of IEEE Standard 859 on terms for electricity transmission reliability data. This underpins the recording of network reliability data, and hence reliability risk assessments, across North America.
2. Underpinning research
Electricity resource adequacy assessment is the assessment of the risk of future shortfalls of electricity supply capacity relative to demand. In GB it is used as the basis for decision making in a capacity market, in which the government procures supply capacity so as to strike an economic balance between up-front procurement cost and possible future unreliability costs.
This work arises from a long-standing collaboration between Dent and Zachary as consultants to National Grid, which began in 2011 when Zachary was a Reader at Heriot-Watt University and Dent was working at Durham University; all work with Zachary as author up until his retirement in September 2015 is thus eligible. In 2016 Dent moved to the School of Mathematics at Edinburgh as a faculty member. Zachary was employed as a Senior Researcher at Edinburgh from January 2017 to April 2019; during that time, where Dent was not an author, Zachary provided the intellectual lead on the specialised work package that created the underpinning research cited here, and was thus an Independent Researcher in REF terms. Finally, Wilson became a faculty member at Edinburgh in September 2019 and has been an Independent Researcher from that date.
This REF2021 impact study is based on three specific lines of research:
Statistical assessment of the probability distribution of (demand minus wind). This guides what other resources, such as coal and gas generation and storage, must provide. Wilson, Zachary and Dent developed a method for assessing the statistical relationship between demand and available wind capacity, using temperature as a proxy for demand because complex patterns complicate direct quantification of the demand-wind relationship [3.1].
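A minimal sketch of the general hindcast idea (illustrative synthetic numbers and parameters, assumptions for this sketch rather than the method of [3.1]): a demand-temperature relationship estimated from recent data is applied to a long historical weather record, giving a joint sample of demand and available wind from which the distribution of (demand minus wind) follows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Recent joint observations: winter-peak demand (GW) against temperature (degC)
temp_recent = rng.normal(4.0, 3.0, 500)
demand_recent = 50.0 - 1.2 * temp_recent + rng.normal(0.0, 1.0, 500)

# Fit the demand-temperature relationship (temperature as demand proxy)
slope, intercept = np.polyfit(temp_recent, demand_recent, 1)

# Long historical weather record of temperature and wind capacity factor;
# any temperature-wind dependence in the weather itself is preserved
temp_hist = rng.normal(4.0, 3.0, 10_000)
wind_cf_hist = np.clip(0.35 + 0.01 * temp_hist + rng.normal(0.0, 0.15, 10_000), 0.0, 1.0)

# Hindcast demand onto the long record, giving a joint (demand, wind) sample
demand_hist = intercept + slope * temp_hist + rng.normal(0.0, 1.0, 10_000)
net_demand = demand_hist - 20.0 * wind_cf_hist     # assumes 20 GW installed wind
print("95th percentile of (demand minus wind), GW:", np.percentile(net_demand, 95).round(2))
```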
Inclusion of storage in adequacy risk assessment. Until 2016, all resources offering into the capacity auction had available capacities determined by mechanical availability (e.g. gas, coal and nuclear energy). This implied that units’ mean available capacities could be used as the product traded in the market. From 2016 it was necessary to develop an approach for including storage, which has finite energy capacity, and this was first applied in the 2018 auction. Dent, Wilson and Zachary demonstrated that if risk indices measuring impact of shortfall events in terms of duration of event are used, as had been used previously in the GB capacity assessment, this would not give appropriate relative credit to storage and conventional units. Instead, an index based on energy demand unserved should be used, and storage should be credited with its marginal value in reducing risk when added to the background of the other units procured [3.2, 3.3].
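The energy-based crediting idea can be made concrete with a toy Monte Carlo calculation (all numbers illustrative; a perfectly reliable 1 GW / 1 GWh battery is assumed). EEU below is expected energy unserved, and the storage’s equivalent firm capacity (EFC) is found as the amount of always-available capacity whose addition reduces EEU by the same amount as adding the battery to the existing background.

```python
import numpy as np

rng = np.random.default_rng(2)
DAYS, HOURS = 20_000, 4            # Monte Carlo days, evening-peak hours
P, E = 1.0, 1.0                    # battery: 1 GW power, 1 GWh energy (a 1-hour device)

# Toy hourly margins: available conventional capacity minus (demand minus wind), GW
margin = rng.normal(3.0, 2.5, (DAYS, HOURS))

def eeu(margin, firm=0.0, battery=False):
    """Expected energy unserved (GWh/day), with optional extra firm capacity
    and optional greedy time-sequential dispatch of the battery."""
    shortfall = np.maximum(-(margin + firm), 0.0)
    if battery:
        energy = np.full(DAYS, E)
        for h in range(HOURS):
            d = np.minimum(np.minimum(shortfall[:, h], P), energy)
            shortfall[:, h] -= d
            energy -= d
    return shortfall.sum(axis=1).mean()

base, with_store = eeu(margin), eeu(margin, battery=True)

# Bisection for the battery's EFC: the firm capacity giving the same EEU
lo, hi = 0.0, P
for _ in range(40):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if eeu(margin, firm=mid) > with_store else (lo, mid)
print(f"EEU base {base:.4f}, with storage {with_store:.4f}, EFC ~ {lo:.3f} GW")
```

Because the battery is energy-limited, its EFC comes out well below its 1 GW power rating, which is the qualitative point here; a duration-based index would credit it quite differently.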
Decision analysis for capacity procurement. “Least worst regret” (LWR) analysis is used by National Grid for decision support on the volume of electricity capacity to procure in GB. Wilson and Zachary developed a hybrid LWR-probabilistic approach, which, while maintaining the same general framework, allows assignment of low probabilities to very extreme scenarios and mitigates the unduly high dependence of standard LWR on just two extreme scenarios [3.4].
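A toy numerical illustration of LWR and of one plausible form of such a hybrid (the cost matrix and weights are invented for illustration, not taken from [3.4]):

```python
import numpy as np

# cost[i, j]: total cost of procurement option i if planning scenario j occurs
cost = np.array([[10.0, 14.0, 30.0],     # procure little: cheap unless extreme scenario
                 [12.0, 13.0, 18.0],     # middle option
                 [16.0, 15.0, 16.0]])    # procure a lot: expensive but safe

regret = cost - cost.min(axis=0)         # regret against the best choice per scenario

lwr_choice = regret.max(axis=1).argmin()               # standard least worst regret

# Hybrid LWR-probabilistic: down-weight the very extreme scenario
weights = np.array([1.0, 1.0, 0.1])
hybrid_choice = (regret * weights).max(axis=1).argmin()

print("standard LWR picks option", lwr_choice, "; hybrid picks option", hybrid_choice)
```

In this toy example the standard and hybrid analyses pick different options, illustrating how strongly standard LWR can depend on a couple of extreme scenarios.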
This work has achieved considerable prominence in the applied and industry community, e.g. invited speaker sessions organised by Dent at the 2018 and 2020 International Conferences on Probabilistic Methods Applied to Power Systems; Dent chairing for two years the IEEE Power and Energy Society Resource Adequacy Working Group (which shares experience between relevant industry studies); Dent giving a keynote tutorial at the 2019 North American Electric Reliability Corporation probabilistic methods conference; and Dent and Zachary leading on industry connections in preparation for the 2019 “Mathematics of Energy Systems” programme at the Isaac Newton Institute. In 2020, Dent was appointed Standards Officer for the IEEE Analytical Methods for Power Systems Committee, overseeing all IEEE standards work relating to power system reliability.
3. References to the research
[3.1] A.L. Wilson, S. Zachary and C.J. Dent, “Use of meteorological data for improved estimation of risk in capacity adequacy studies”, 2018 Conference on Probabilistic Methods Applied to Power Systems, https://doi.org/10.1109/PMAPS.2018.8440216. This contains the content of an April 2015 report to National Grid on which the relevant impact is based.
[3.2] G. Edwards, S. Sheehy, C.J. Dent and M.C.M. Troffaes, “Assessing the contribution of nightly rechargeable grid-scale storage to generation capacity adequacy”, Sustainable Energy, Grids and Networks 12, 69-81 (2017). https://doi.org/10.1016/j.segan.2017.09.005
[3.3] S. Zachary, A.L. Wilson and C.J. Dent, “The integration of variable generation and storage into electricity capacity markets” (2018). In second round of reviews for Sustainable Energy, Grids and Networks. https://arxiv.org/abs/1907.05973
[3.4] S. Zachary and A.L. Wilson, “Determination of electricity capacity-to-secure: Sensitivities and decisions” (2017). Technical report to National Grid, pdf supplied.
4. Details of the impact
Impact on Great Britain Capacity Market
A reliable electric power supply underpins almost all of modern life (see https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/48129/2176-emr-white-paper.pdf), so ensuring an appropriate level of security of supply is critical for society. This societal significance is confirmed by testimonial letters from National Grid, and from a former board member at Ofgem (the GB energy regulator) who was responsible for this area [5.1, 5.2]. The direct cost of capacity procurement (i.e. the cost feeding through to consumer bills) in the main 4-year-ahead capacity auctions has ranged from GBP290,000,000 (for delivery in 2022/23) to GBP1,180,000,000 (for delivery in 2020/21) [5.3].
The Maxwell Institute team of Dent, Wilson and Zachary have been the main academic consultants to National Grid (NG) on risk modelling and decision analysis methodology since 2011, and thus for the entire REF period. This is evidenced by the specific points of impact below. The ongoing broad significance of the decisions based on our research is confirmed by NG’s 2019 Capacity Assessment report [5.4a]: on pages 22-25, this references in broad terms the significance of the Maxwell Institute’s research-based advice (references to unnamed ‘academic consultants’ refer to us).
The reach of the impact is GB-wide. The Maxwell Institute work underpins the decision support provided by NG to the Department for Business, Energy and Industrial Strategy (BEIS) in this nationally vital area of procuring capacity to ensure electricity security of supply.
The specific items of GB impact in this study fall into two strands:
1. Choice of scenarios considered and decision analysis approach used. Our decision analysis work [3.4] was used, as described in National Grid’s 2017 Capacity Assessment Report [5.4b], to confirm the choice of methodology for supporting the decision on the volume of capacity to procure in the 2017/18 auction for delivery in 2021/22, an auction worth GBP423,000,000 [5.3]. The 2019/20 auction for delivery in 2023/24, whose design was similarly based on our work, was worth GBP699,000,000. In particular, pages 17, 39 and 87-97 of the 2017 report [5.4b] describe how our work guided the choice of scenarios (particularly extreme optimistic and pessimistic scenarios) used in decision support for capacity procurement, through the demonstration in [3.4] of how the LWR analysis outcome depends on these extreme scenarios.
BEIS had been concerned about whether the LWR approach took sufficient account of very pessimistic scenarios of planning background, and we were asked to investigate whether including more extreme scenarios with an appropriate weighting scheme could improve decision making. We showed that the outcome was not very sensitive to the addition of further extreme scenarios, and BEIS therefore decided to go forward with their incumbent analytical approach given an appropriate choice of scenarios. The significance to BEIS of this validation of their approach is confirmed by BEIS’s Panel of Technical Experts [5.5, page 46], who note that ‘The analysis for National Grid by Wilson and Zachary formalises the concerns we have previous expressed about the LWR methodology’ and ‘we were grateful that National Grid did conduct a trial run to assess the impact of adopting such a hybrid approach’.
National Grid’s 2016 Electricity Capacity report [5.4c] confirms (page 30) that our work [3.1] on the statistical relationship between electricity demand and available wind capacity formed the basis for inclusion of a ‘low wind sensitivity’ in their decision analysis, because historic data suggest that wind power availability is typically slightly poorer at times of highest demand. This resulted in a need for an additional 0.8 GW of other capacity, equivalent to an extra large gas-fired power station, to give the appropriate level of security of supply; at a clearing price of GBP22.50/kW this gives a direct additional expenditure of about GBP18,000,000. This is an example of where improved analysis can demonstrate the need for increased up-front expenditure in order to give an appropriate level of security of supply.
The scale of this impact on decision support is the whole monetary value of the capacity market, together with the nationally significant issue of procuring generating capacity for security of supply, in that the methodology directly supports the top-level decision on the volume of capacity to procure.
2. Determining how to treat energy storage in the capacity market. In 2016 it was necessary to update, at short notice, the approach to crediting battery storage in capacity auctions. Under the existing rules, batteries that could discharge at maximum output for only up to 1 hour had been credited with 95% of their capacity, by analogy with the nearest existing technology, pumped storage hydro (which can discharge at maximum capacity for much longer, covering the whole duration of the early evening demand peak). Based on our work in [3.2] and [3.3], National Grid and the government changed the risk index used for valuing storage from duration-based to energy-based, so as to make a fair comparison between storage and other resources, as described above. They also made the specific decision to credit emerging technologies such as storage and wind energy with the marginal value of the resource when added to the rest of the portfolio procured.
The executive summary of [5.6] states: “We also acknowledge the benefits of discussions with academic experts from The University of Edinburgh on a number of high level issues, notably in relation to the choice of risk metric and the definition and calculation of the storage EFC [equivalent firm capacity]”, with the Maxwell Institute report based on the research in [3.2] and [3.3] included as Appendix 3. Following implementation of the recommendations in this report, battery units have been credited with as little as 10% of their capacity, versus the previous 96% [5.6, page 29]. The capacity of battery and similar storage awarded contracts in the 2019/20 auction for delivery in 2023/24 was 0.18 GW [5.3, page 3 of that year’s report]. Without the revised capacity credit factors based on our research, this would have been over-credited by more than 0.1 GW, a direct overpayment to storage of approximately GBP2,000,000 at the market price of GBP16/kW/year. The supply capacity available to the system to support reliability would effectively have been lowered by this amount.
More important than these figures for the 2019 auction is the basis our work provides for including storage in the capacity market on an ongoing basis, which in turn supports national plans for increased use of storage to the benefit of consumers while maintaining security of supply. The annual capacity and monetary figures will thus grow considerably as storage becomes more significant on the scale of the whole system.
Verification of GB impact through testimonial letters. The EMR Modelling Manager at National Grid [5.1] commented: “What the research has enabled is the ability to quantify the statistical relationship between wind and demand at times of high demand, improving the robustness of our modelling, which had previously used an assumption of independence.” A former Board Member of Ofgem [5.2] said: “The work of Dent, Wilson and Zachary in this field has been significant in promoting … statistical techniques … and has informed the assessment of capacity requirements in the capacity market and the assessment of capacity adequacy”. The letters also contain their own statements of the importance of reliable electricity supply and of the significance of our research in underpinning analysis supporting the capacity market.
Impact on development of IEEE Standard 859
Dent was chosen to chair the recent revision of IEEE Standard 859, “Standard Terms for Reporting and Analyzing Outage Occurrences and Outage States of Electrical Transmission Facilities”, published as IEEE 859-2018 [5.7]. IEEE 859 is widely used in the North American industry, in particular underpinning the continent-wide Transmission Availability Data System run by the North American Electric Reliability Corporation (NERC), which records reliability data from the majority of utilities; it is also used by other industry organisations internationally.
According to the letter [5.8], based on the data collected under the guidance of IEEE 859, “Industry and regulators use the resulting transmission outage statistics to determine best investments, and societal impacts resulting from transmission performance”. In NERC’s own geographical scope, this supports reliable supply to nearly 400 million people. In the USA alone, the significance of the use of data collected under this standard is shown by annual investment in electricity networks of USD45,000,000,000 in 2016, on an increasing trajectory [5.9], and by the US Government’s estimate that weather-related outages (see point (ii) below) cost between USD18,000,000,000 and USD33,000,000,000 annually in the decade to 2012 [5.10]. The US Energy Information Administration explicitly credits IEEE standards, including 859, as underpinning the relevant statistics [5.9] (“Many of the standards in reporting these metrics were initially developed by the IEEE” – IEEE standards 762 and 859).
As well as his general leadership based on research and industry experience, Dent made a number of specific contributions based on the above research [5.7]:
- (i) Note after 4.2.2.2.2, “Multiple Independent Outages”: clarifies, based on insights from [3.1-3.3] above, the concept of statistical (in)dependence in circumstances where failure rates are elevated (e.g. due to bad weather). This has been a point of confusion in the power system research and industry literature for many years.
- (ii) Note in 6.3, “Weather”, and Note 1 in 6.3.1, “Adverse Weather”: clarify how the classification of different weather states (normal, adverse etc.) in the standard should be used in data recording, and how care should be taken when using indirectly relevant data (again based on insights from [3.1-3.3]).
- (iii) Note in 7.3, “State Probability Indices”: clarifies the important conceptual point that the indices defined are empirical probabilities, i.e. “the proportion of a set of historic trials in which a particular event occurred”. This is closely related to the use of historic wind resource and demand data described in [3.1].
This section of the impact study is verified by the testimonial letter from the Chief Reliability Officer at the North American Electric Reliability Corporation [5.8]. The letter further describes the international significance of IEEE 859 and the contribution of our research in underpinning the revisions in IEEE 859-2018 described above.
5. Sources to corroborate the impact
[5.1] Letter from EMR Modelling Manager at National Grid ESO.
[5.2] Letter from former Board Member at Ofgem.
[5.3] National Grid ESO, Capacity Market Published Round Results, https://www.emrdeliverybody.com/CM/Published-Round-Results.aspx
[5.4] Electricity Capacity Reports from (a) 2019, (b) 2017 and (c) 2016, National Grid ESO.
[5.5] Government Panel of Technical Experts’ 2017 report on the capacity market, https://www.gov.uk/government/publications/electricity-market-reform-panel-of-technical-experts-2017-final-report-on-national-grids-electricity-capacity-report-2017
[5.6] Duration-Limited Storage De-Rating Factor Assessment, National Grid ESO, 2017, https://www.emrdeliverybody.com/Lists/Latest%20News/Attachments/150/Duration%20Limited%20Storage%20De-Rating%20Factor%20Assessment%20-%20Final.pdf.
[5.7] IEEE Standard 859-2018, https://standards.ieee.org/standard/859-2018.html.
[5.8] Letter from the Chief Reliability Officer at the North American Electric Reliability Corporation.
[5.9] Today in Energy, US Energy Information Administration. Annual investment figures from https://www.eia.gov/todayinenergy/detail.php?id=36675 and link therein to transmission investment. Reference to IEEE 859 at https://www.eia.gov/todayinenergy/detail.php?id=35652.
[5.10] “Economic benefits of increasing electric grid resilience to weather outages”, President’s Council of Economic Advisors and US Department of Energy, 2013, https://www.energy.gov/downloads/economic-benefits-increasing-electric-grid-resilience-weather-outages.
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Technological
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Format Solutions is the world’s leading supplier of software for food formulation, a task that requires high-performance optimization software. The solutions obtained using the EMSOL and HiGHS optimization software developed by Hall and colleagues have enabled Format Solutions’ clients to make savings [text removed for publication] of ingredient costs, [text removed for publication]. The reach of this impact is global, and its significance is demonstrated by Format Solutions’ dependence on the University of Edinburgh technology when selling its software [text removed for publication].
2. Underpinning research
The need for high-performance solution of large-scale linear programming (LP) problems is widespread throughout industry, the public sector and the scientific world. The development of computational and algorithmic techniques for solving this optimal decision-making problem has driven Hall’s research for over thirty years. When studying the behaviour of Hall’s simplex solver (EMSOL) on an LP problem provided by Format International, Hall and McKinnon identified the property of hyper-sparsity [3.1] that is exhibited by many important classes of LP problems. Developing techniques to exploit hyper-sparsity led to major improvements in the performance of EMSOL. In the case of Format’s LP problems, the solution time for their larger, more representative test problem (DCP2) improved by a factor of more than five, and the cost of post-optimal analysis was reduced by a factor of more than eight [3.1]. In the early 2000s, one of Format’s clients wanted to solve “pooling problems”, so Grothey and McKinnon developed a sequential linear programming (SLP) solver [3.2] that tackles the underlying nonlinear optimization problem by using EMSOL to solve the resulting sequence of LP problems.
In 2009, Hall recommenced research into high performance implementations of the simplex algorithm. His PhD student, Qi Huangfu, was partially funded by Format to work in the area, leading to the development of serial and parallel implementations of the simplex algorithm that offered significant performance improvements over EMSOL and, academically, represent the state of the art [3.3-3.4]. These solvers, together with LP pre-processing techniques, now form the core of the HiGHS open source linear optimization software. HiGHS solves Format’s indicative test problem DCP2 a little over ten times faster than EMSOL. Independent of advances in machine speed, HiGHS is three orders of magnitude faster than the simplex solver originally written by Format, with post-2000 research and development accounting for a factor of more than 50.
3. References to the research
[3.1] “Hyper-sparsity in the revised simplex method and how to exploit it”, J. A. J. Hall and K. I. M. McKinnon, Computational Optimization and Applications 32(3), 259-283, 2005. https://doi.org/10.1007/s10589-005-4802-0
[3.2] “On the Effectiveness of Sequential Linear Programming for the Pooling Problem”, A. Grothey and K. McKinnon, Submitted to Annals of OR, 2020. arXiv:2002.10899.
[3.3] “Novel update techniques for the revised simplex method”, Q. Huangfu and J. A. J. Hall, Computational Optimization and Applications 60(3), 587-608, 2015. https://doi.org/10.1007/s10589-014-9689-1
[3.4] “Parallelizing the dual revised simplex method”, Q. Huangfu and J. A. J. Hall, Mathematical Programming Computation, 10 (1), 119-142, 2018. https://doi.org/10.1007/s12532-017-0130-5
Articles [3.1], [3.3] and [3.4] each won the respective journal’s best paper prize for the year: see http://users.clas.ufl.edu/hager/coap/Best/ [3.1, 3.3] and https://www.springer.com/journal/12532/updates/17226372 [3.4]
4. Details of the impact
The blending of simple raw materials to manufacture food is an industry worth approximately USD100,000,000,000 per annum, dominated by the petfood market, where small percentage savings are significant. The competitive position of the world’s leading supplier of food formulation software, Format Solutions, depends on techniques and software developed by the University of Edinburgh Optimization group. This software is used to identify raw material blends and production strategies [text removed for publication]. With the solution of optimization problems being the overwhelmingly dominant computational cost of Format Solutions’ software, post-2000 research and development by the Edinburgh group has improved its performance by more than an order of magnitude (independent of the computing platform).
Manufactured food for both farmed animals and domestic pets, as well as some human food, is produced by simple blending and binding of fundamental raw materials. A particular food mill will produce multiple recipes for different animals, with shared use of the same raw materials. The nutritional properties of the resulting food are important, particularly when raising farm animals for optimal yields of meat, milk and offspring. Thus, the amounts of raw materials combined to produce a particular recipe must satisfy many linear constraints on its nutritional properties, and the total amounts of raw materials used must not exceed their availability. Food manufacturers wish to manufacture their products at minimum cost. This extension of the classical single diet problem is an example of the optimal decision-making problem that, mathematically, is referred to as linear programming (LP). The problem size increases with the number of recipes and raw materials, and larger companies wish to consider raw material usage over a number of mills. A secondary, but potentially very profitable, enterprise is to substitute materials when it is more valuable to trade ownership of them, or purchase options on them, on global commodity markets. Obtaining the information for all this manufacturing and economic activity requires the solution of large-scale LP problems.
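A minimal worked instance of this problem class (ingredient costs, nutrient contents, demands and availabilities are all invented for illustration, not Format data), solved via SciPy’s linprog interface, whose "highs" method calls the HiGHS solver discussed in this case study:

```python
import numpy as np
from scipy.optimize import linprog

# Two recipes blended from three shared raw materials
cost     = np.array([200.0, 150.0, 100.0])   # GBP/tonne of each material
protein  = np.array([0.40, 0.20, 0.05])      # protein fraction of each material
demand   = np.array([30.0, 50.0])            # tonnes of each recipe required
min_prot = np.array([0.25, 0.12])            # minimum protein fraction per recipe
avail    = np.array([25.0, 60.0, 100.0])     # tonnes of each material available

n_r, n_m = 2, 3                              # x is flattened as x[r*n_m + m]
c = np.tile(cost, n_r)                       # objective: total ingredient cost

# Equality constraints: each recipe must hit its demanded tonnage
A_eq = np.zeros((n_r, n_r * n_m))
for r in range(n_r):
    A_eq[r, r * n_m:(r + 1) * n_m] = 1.0
b_eq = demand

# Inequalities: protein minimum per recipe, then availability per material
A_ub, b_ub = [], []
for r in range(n_r):
    row = np.zeros(n_r * n_m)
    row[r * n_m:(r + 1) * n_m] = min_prot[r] - protein   # <= 0 enforces the minimum
    A_ub.append(row); b_ub.append(0.0)
for m in range(n_m):
    row = np.zeros(n_r * n_m)
    row[m::n_m] = 1.0                                    # material m across all recipes
    A_ub.append(row); b_ub.append(avail[m])

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=(0, None), method="highs")
print(res.fun, res.x.round(2))               # minimum cost and tonnes per (recipe, material)
```

Industrial instances differ only in scale: hundreds of recipes, materials and mills produce the large, hyper-sparse LP problems described in Section 2.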
To place the post-2013 impact in context, it is necessary to describe the involvement of Hall, McKinnon and Grothey of the University of Edinburgh Optimization group with Format International (as the company was then known), which began in 1997. The company was seeking to improve the quality of its implementation of the simplex algorithm for solving the LP problems generated by its animal feed formulation software. It was immediately clear that Hall’s simplex solver, EMSOL, was far superior: for problems with a large matrix size, EMSOL returned the optimal solution up to 15 times faster than Format’s own solver, so Format bought a copy of EMSOL for incorporation into their software. In the early 2000s, Hall’s development of techniques to exploit hyper-sparsity in the type of problems Format needed to solve provided a further major performance enhancement. Adopting EMSOL had significant impacts on the company’s growth: it was able to win new business with larger clients [text removed for publication].
The SLP solver developed for Format by McKinnon and Grothey gave the company the unique ability to solve pooling problems, in which production requirements dictate that ingredients are used to produce several pre-mixed combinations before final blending into products. Such problems are often found in industries such as pet food and dried milk production. The development therefore gave Format a significant commercial advantage over its competitors, who were simply unable to solve these types of problems. The solutions produced by the new solver enabled Format clients to streamline production and supply chains, resulting in significant savings to them [text removed for publication]. Format was therefore able to open new markets and win new business, particularly in the pet food industry. This included several of the world’s leading pet food manufacturers [text removed for publication].
In 2015, Cargill Inc., one of the world’s leading animal feed manufacturers and a long-time customer of Format, acquired Format International, renaming it Format Solutions. This purchase followed a review of the future direction of Cargill’s formulation software across its worldwide business. The review identified that Format’s own software developments, coupled with its optimization capabilities and the talent in the Format team, were far ahead of Cargill’s own in this area, and that matching them would have taken many years and considerable expense. The purchase of Format International, bringing with it the software, its IP and its solvers, was assessed as a far better option than developing an in-house solution. The Format-Edinburgh relationship has spanned many years of highly rewarding interaction and has resulted in some industry-significant developments. Format’s turnover and profitability during the period grew four-fold [text removed for publication].
Format Solutions is now (2020) the world’s leading supplier of food formulation software, and so depends on high-performance optimization software for the reliability and performance of its products. As the modelling capabilities of its software have developed over the years, so has the size of the underlying optimization problems that must be solved. Continued collaboration with the Edinburgh Optimization Group has given Format Solutions the necessary enhancements to its optimization solvers. The most significant advance in this respect has been the performance improvement offered by HiGHS, which has been used by Format since 2017. In the case of LP problems, Format’s customers are now able to “optimize larger feed formulation problems, and at significantly greater speed” [5.2]. The Format product for pooling problems that requires the SLP solver “enables customers to significantly improve on the cost and quality of their feed-mix products and solve new classes for problems that were not tractable before” [5.2]. This has “contributed to Format’s world market-leading position and allowed them to gain new important clients” [5.2]. A significant illustration is the fact that Format’s software is used to formulate petfood for the world’s top manufacturer [text removed for publication], who chose Format specifically because of this solution capability [5.2]. HiGHS offers Format Solutions a tenfold performance improvement over EMSOL and will contribute significantly to maintaining Format’s world market-leading position in feed formulation software into the future.
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Economic
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
The 2011 UK census revealed issues with the accuracy of the underlying population data that were of considerable concern to the pensions and life insurance sectors. This led directly to the work of Cairns et al. (2016), who developed methods for identifying anomalies in national population and mortality data.
The work has had an impact on institutions in the UK, US and France. Results of the research have enabled insurers to reduce prices for the transfer of pension liabilities, saving UK pension funds between GBP330,000,000 and GBP1,000,000,000. It has persuaded actuaries to revise the mortality tables that they use for pricing and reserving, including changes in the methodology underpinning the UK actuaries’ Continuous Mortality Investigation (CMI) mortality projection tables.
2. Underpinning research
The 2011 census revisions to England & Wales (E&W) population estimates by the Office for National Statistics (ONS) drew attention to the possibility that there are widespread issues in how population data are measured and reported in many countries. Cairns (Maxwell Institute) along with Blake (City University) and Dowd (Durham) were approached by the [text removed for publication] to investigate these issues further and propose how to correct anomalies in population and mortality data.
This collaboration led to the paper by Cairns et al. (2016) [3.1] (hereafter CBDK, after the four authors), which highlighted potential anomalies in E&W population data and in how these data are used in the calculation of mortality rates. A key discovery was that an uneven pattern of births within a given calendar year is a major cause of anomalies in population and exposures data decades later. The term exposures refers to the average population at a given age over the whole of a calendar year; in many countries, including E&W, exposures are approximated by the corresponding mid-year population estimate, and it is this approximation that can lead to significant errors in published death rates (deaths divided by exposures). Different countries or agencies might derive population estimates and exposures in different ways, but, however they do this, the estimated exposures will be subject to potentially significant errors unless they take into account any significant irregular patterns in monthly or quarterly births data.
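In standard demographic notation (generic, not specific to [3.1]), the published central death rate divides deaths by exposures, and the problematic approximation replaces the exposure integral by the mid-year population:

\[
m(t,x) = \frac{D(t,x)}{E(t,x)}, \qquad E(t,x) = \int_0^1 P(t+u,\,x)\,\mathrm{d}u \;\approx\; P\big(t+\tfrac{1}{2},\,x\big),
\]

where \(D(t,x)\) counts deaths at age \(x\) in year \(t\), \(E(t,x)\) is the central exposure (person-years lived), and \(P(s,x)\) is the population aged \(x\) at time \(s\). The approximation is adequate only when the population evolves roughly linearly through the year; a concentrated pattern of births decades earlier breaks this assumption and propagates errors into \(m(t,x)\).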
As one example, in the competitive life annuity market, profit margins tend to be quite small. Consequently, a 1% error, say, in the price of an annuity due to errors in the exposures data could lead to a 25% error in anticipated profits (for instance, if the profit margin is 4% of the price, a 1% price error amounts to a quarter of the margin). [3.1] identified two cohorts in E&W with errors that exceed 1%.
The researchers developed a range of methodologies to help identify specific errors in population, exposures and deaths data:
- graphical diagnostics providing a powerful model-free toolkit for identifying anomalies in the form of signature plots;
- a Bayesian framework enabling the size of these anomalies to be quantified and corrected;
- two-dimensional diagnostics enabling the detection of small systematic errors in exposures and deaths of less than 1%.
Using these methodologies, [3.1] showed that ONS population data do contain significant anomalies of up to 9% in both mid-year population estimates and exposures that need correction.
Additionally, the researchers [3.1] developed the cohort-births-deaths exposures methodology, which can be used to explain many of the bigger errors. Its first component is the convexity adjustment ratio, which explains how persistent cohort-related errors arise when exposures are equated to mid-year population estimates. The second component is a methodology for deriving improved mid-year population estimates in census years from census data. Other anomalies were identified but only partially explained by the cohort-births-deaths methodology. Significantly, the paper demonstrated that postcensal population estimates magnify anomalies (e.g. phantoms) arising in census years.
In addition, anomalies were identified that are due to potential small biases in the reported age at death, and use of the Kannisto–Thatcher high-age methodology, resulting in a discontinuity at age 90 years.
Finally, [3.1] developed a holistic approach, not reliant on births data, to estimate and correct errors in population data.
Collectively, these errors can make substantial differences, particularly in respect of cohorts that are still sufficiently large to have a significant financial effect, e.g. the impact of anomalies in the 1947 cohort on pension liabilities.
The same types of errors – with possible variants – are now known to apply to other countries, some of which, like E&W, derive their population data from periodic (typically decadal) censuses. As one example, data for France (see [5.8]) share similar characteristics and reveal similar anomalies to those for E&W.
Cairns led the mathematical and statistical work in the paper.
3. References to the research
[3.1] Cairns, A.J.G., Blake, D., Dowd, K., and Kessler, A.R. (2016) Phantoms Never Die: Living with Unreliable Population Data. Journal of the Royal Statistical Society, Series A, 179: 975-1005. DOI: 10.1111/rssa.12159
4. Details of the impact
Beneficiaries and Types of Impact
The results of this research are being used by or have had an impact on companies and institutions in the UK, US and France: [text removed for publication]; the Continuous Mortality Investigation of the Institute and Faculty of Actuaries (CMI); actuarial consultancies advising insurers and pension funds; UK pension funds; insurers and reinsurers; the Office for National Statistics (ONS). CBDK [3.1] has impacted on: professional practice; published mortality tables; pricing of billions of pounds of longevity transactions, with a documented/verifiable pathway to impact in each case.
Pathway to impact
The authors’ original and longer report was completed in December 2013, and was followed by a series of presentations by Cairns and co-authors to key stakeholders to kickstart generation of impact from the research. Presentations included US practitioners (New York), the US Social Security Administration (Baltimore), United Nations Population Directorate (New York), members of the Continuous Mortality Investigation (London), mortality experts at the Office for National Statistics (Titchfield), actuaries’ Life Conference (Birmingham), and the International Mortality and Longevity Symposium (Birmingham).
[text removed for publication]
The impetus for CBDK originated from [text removed for publication], who were concerned about the accuracy of ONS population data. The results in CBDK gave [text removed for publication] the confidence to revise their mortality tables and reduce prices. Since publication of the paper, [text removed for publication] have used the results in all of their UK transactions. The [text removed for publication] supporting letter [5.1] states: “As a direct result of this study, my team revised [their] mortality tables … [and] have reduced the price charged by [text removed for publication] by nearly 1%”. She continues to note that [text removed for publication] have executed transactions worth [text removed for publication] since 2016, resulting in savings to UK pension funds and insurers of approximately [text removed for publication], adding: “Moreover, we believe that the entire market has taken these revised tables into account … [with] estimated savings … nearing £1 billion.”
Office for National Statistics
The ONS conducted a methodological review in 2016 of official high-age population estimates. One of the key drivers for the review was CBDK [3.1]. (See ONS, 2016, page 4; [5.2].)
Continuous Mortality Investigation (CMI)
The CMI conducts mortality investigations on behalf of the UK actuarial profession and produces mortality tables and forecasts of mortality improvements that are extensively used by UK pension funds and insurers when they value their liabilities.
Methods and results in CBDK have been used extensively in two key work streams (mortality projections and high-age mortality) of the CMI that are clearly documented in the CMI working paper (WP) series.
CMI Working Paper 91 (WP-91) [5.3] (pages 4, 5, 11, 56, 57) and its successors concern the CMI’s approach to projecting mortality improvements, which requires a table of historical mortality rates. The results of CBDK convinced the CMI that anomalies could have a material impact on mortality projections, so, inspired by CBDK, they developed a simplified method to adjust for anomalies. The CMI’s flagship projection tool [5.4] gives users the choice of using the adjusted or unadjusted population data, with the adjusted data as the default.
A further impact of CBDK on the work of the CMI is evidenced in WP-100 [5.5] (pages 3, 13, 15, 18, 19, 39-49, 73-78, including references to “CBDK” diagnostics). This documents the CMI’s work on single-age mortality rates at very high ages, and makes extensive use of the CBDK diagnostic tools 1, 2A and 2B. [5.5] builds on the conclusion of CBDK that the ONS data have a small discontinuity at age 90, and develops the CMI’s own methodology to correct this.
The CMI projections model, including its high-age methodology, is used extensively by life insurance and pensions actuaries in setting best-estimate mortality improvement assumptions (see, e.g., the USS 2018 Valuation [5.6], page 12). These assumptions affect the valuation of trillions of pounds of insurance and pension liabilities: a decrease of only 0.1% per annum (quite common in recent years) in the mortality improvement rate in CMI_2018 takes tens of billions of pounds off these liabilities.
[text removed for publication]
[text removed for publication] is among the world's largest providers of actuarial services. Their supporting letter [5.7] [text removed for publication] discusses the impact CBDK has had internally and on their clients:
- The work of CBDK “initiated more than four years of research in [text removed for publication]”;
- Work with a major client, [text removed for publication], led the client “to update its own mortality tables, … leading to a change in [their] capital calculation modelling”;
- CBDK more generally has had a strong impact on consulting, including increased awareness of data quality issues, and, through CBDK-inspired adjustments to data, has had a significant impact on the calibration of stochastic mortality models: calibrations that affect regulatory capital requirements for insurers.
The supporting letter [5.7] also emphasizes the impact of CBDK on the Human Mortality Database (HMD) (a much-used source of mortality data by multinational insurers) which revised its exposures methodology in 2018.
5. Sources to corroborate the impact
[5.1] Letter of support [text removed for publication].
[5.2] Office for National Statistics (2016) Accuracy of official high-age population estimates, in England and Wales: an evaluation. https://www.ons.gov.uk/peoplepopulationandcommunity/birthsdeathsandmarriages/ageing/methodologies/accuracyofofficialhighagepopulationestimatesinenglandandwalesanevaluation
[5.3] UK Continuous Mortality Investigation (CMI), CMI Working Paper 91 (2016) CMI Mortality Projections Model consultation – technical paper, Mortality Projections Committee. https://www.actuaries.org.uk/system/files/field/document/CMI%20WP091%20v01%202016-08-31%20-%20CMI%20Model%20consultation%20technical%20paper_0.pdf
[5.4] CMI_2019 The CMI Mortality Projections Model; an Excel-based toolkit. Information describing the toolkit is available at https://www.actuaries.org.uk/learn-and-develop/continuous-mortality-investigation/cmi-working-papers/mortality-projections/cmi-working-paper-129.
[5.5] CMI Working Paper 100 (2017) A second report on high age mortality, High Age Mortality Working Party, with additional Supplementary Technical Paper. https://www.actuaries.org.uk/learn-and-develop/continuous-mortality-investigation/cmi-working-papers/other/cmi-working-paper-100
[5.6] Universities Superannuation Scheme (USS) Scheme funding report of the actuarial valuation of the universities superannuation scheme as at 31 March 2018 (Page 12). www.uss.co.uk/about-us/valuation-and-funding/2018-valuation
[5.7] Letter of Support [text removed for publication].
[5.8] [text removed for publication]
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Legal
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Our judicial system increasingly relies on quantifying the value of evidence presented in court. As a result, advanced statistical methods have a strong impact on the administration of justice. Research by Aitken, Wilson (both Maxwell Institute, MI) and collaborators has applied Bayesian statistics to develop methodology for quantifying judicial evidence. They proposed and implemented procedures for evaluating forensic evidence from (i) multivariate hierarchical data and (ii) autocorrelated data. The procedures developed are now routinely used in forensic laboratories worldwide; the methods have been recommended in international guidelines for forensic scientists and have been used to support the accreditation of a UK laboratory. The underpinning research has been cited in expert witness reports in court cases worldwide. Therefore, the beneficiaries of the research include both forensic scientists and the justice system.
2. Underpinning research
The key research insight is the recognition that the Bayesian framework provides the tools needed for the interpretation of forensic evidence. This has led to the development of increasingly sophisticated statistical analyses driven by new measuring equipment for the examination of trace evidence and by the increase in computing power that enables the lengthy calculations required to be performed efficiently. Papers published from 2004 have contributed to this development and tackled two important problems: the treatment of multivariate, hierarchical evidence data and the evaluation of evidence in the form of autocorrelated data.
(i) Likelihood ratios for multivariate hierarchical data. When samples of material obtained from a crime scene are compared with those obtained from a suspect, it is necessary to quantify the support for the proposition that they come from the same source. In many cases the data characterising the material are multivariate, continuous and hierarchical. Examples include the composition of glass taken from fragments of windows. The hierarchical nature then arises because within-source and between-source variations differ (variation of glass composition within a single windowpane versus variation between different panes). Research in the MI developed a Bayesian methodology to quantify the value of the evidence derived from such multivariate and hierarchical data. This overcame the drawbacks of earlier methodologies (which often incorrectly assumed the independence of the different variables) by providing a likelihood ratio (LR) that can be combined with other forms of evidence in an integrated analysis and that leads to readily interpretable conclusions. The initial work [3.1], considering a two-level hierarchy of data, was extended to a three-level hierarchy in [3.2]. The paper [3.3] also developed an implementation based on graphical modelling techniques adapted to multivariate data. The methodology is described in [3.4]; a third edition was published in 2021.
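In outline (the notation here is ours and simplified; the full treatment is given in [3.1] and [3.4]), the value of the evidence is the likelihood ratio

\[
\mathrm{LR} \;=\; \frac{f(\mathbf{y}_c, \mathbf{y}_s \mid H_p)}{f(\mathbf{y}_c \mid H_d)\, f(\mathbf{y}_s \mid H_d)},
\]

where \(\mathbf{y}_c\) and \(\mathbf{y}_s\) are the multivariate measurements on the crime-scene and suspect materials, \(H_p\) is the proposition that they share a common source and \(H_d\) the proposition that the sources differ. Under the two-level model the measurements vary about an unobserved source mean with a within-source covariance, and the source means themselves vary with a between-source covariance, so both numerator and denominator are marginal densities obtained by integrating over the unobserved source means.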
(ii) Likelihood ratios for autocorrelated data. This research was motivated by the need for methods to quantify the value of evidence relating to drugs on banknotes. Banknotes can be seized from crime scenes as evidence of suspected association with illicit drug dealing. Tandem mass spectrometry data are available from banknotes seized in criminal investigations, as well as from banknotes in general circulation. The aim of this research was to evaluate the support provided by the data gathered in a criminal investigation for the proposition that the banknotes were associated with criminal activity related to cocaine, in contrast to the proposition that the banknotes came from general circulation. Previous methods for assessing the relative support for these propositions considered only the percentage of banknotes contaminated, or assumed independence of the quantities measured on adjacent banknotes. The research developed new methodologies for evaluating this support using the LR. These methods accounted for autocorrelation in the data caused by transfer of cocaine between banknotes and also modelled differences in contamination between different bundles of notes [3.5]. It has been argued in court that the datasets used for evaluating the evidence are inappropriate because there may be variability across the country in levels of cocaine on banknotes. To enable the methods to be implemented in practice, further research [3.6] showed that there is no meaningful difference in quantities of cocaine on banknotes in different regions of Great Britain and hence no need to tailor the datasets to the region of the crime.
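Schematically, and again in our simplified notation, the same likelihood-ratio construction applies, with the independence assumption of earlier methods replaced by a dependence structure along the sequence of notes. A first-order factorisation of the kind

\[
f(z_1,\ldots,z_n \mid H) \;=\; f(z_1 \mid H) \prod_{k=2}^{n} f(z_k \mid z_{k-1}, H)
\]

captures transfer of cocaine between adjacent banknotes, where \(z_k\) denotes the (log) measured quantity on note \(k\) and \(H\) is either proposition; [3.5] additionally models differences in contamination between bundles of notes. This factorised form is shown purely for illustration of where the autocorrelation enters.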
Additional research on the communication and interpretation of statistical evidence in the administration of criminal justice resulted in an interdisciplinary collaboration [3.7] designed to bring Bayesian ideas of the likelihood ratio and Bayesian networks to the attention of judges, lawyers, forensic scientists and expert witnesses.
3. References to the research
[3.1] Aitken, C.G.G. and Lucy, D., Evaluation of trace evidence in the form of multivariate data. Applied Statistics, 53, 109-122 (2004). https://doi.org/10.1046/j.0035-9254.2003.05271.x
[3.2] Aitken, C.G.G., Lucy, D., Zadora, G. and Curran, J.M., Evaluation of transfer evidence for three-level multivariate data with the use of graphical models, Computational Statistics and Data Analysis, 50, 2571-2588 (2005). http://dx.doi.org/10.1016/j.csda.2005.04.005
[3.3] Aitken, C.G.G., Zadora, G. and Lucy, D., A two-level model for evidence evaluation. Journal of Forensic Sciences, 52, 412-419 (2007). http://dx.doi.org/10.1111/j.1556-4029.2006.00358.x
[3.4] Aitken, C.G.G. and Taroni, F., Statistics and the evaluation of evidence for forensic scientists, John Wiley and Sons Ltd (2004), 2nd edition, http://dx.doi.org/10.1002/0470011238 and (2021), 3rd edition, http://dx.doi.org/10.1002/9781119245438
[3.5] Wilson, A., Aitken, C.G.G., Sleeman, R. and Carter, J., The evaluation of evidence for auto-correlated data in relation to traces of cocaine on banknotes. Applied Statistics, 64, 275-298 (2014). http://dx.doi.org/10.1111/rssc.12073
[3.6] Aitken, C.G.G., Wilson, A., Sleeman, R., Morgan, B., Huish, J. Distribution of cocaine on banknotes in general circulation in England and Wales. Forensic Science International, 270, 261-266 (2016). http://dx.doi.org/10.1016/j.forsciint.2016.10.017
[3.7] Four Practitioner Guides for interpreting statistical evidence published by the Royal Statistical Society on “Communicating and Interpreting Statistical Evidence in the Administration of Criminal Justice” (2010-2015)
4. Details of the impact
The research has had an impact on the administration of justice, leading to a better use of evidence and accompanying judicial and economic benefits. This is split into three main areas:
(i) The procedures developed in [3.1-3.3] are now routinely used for forensic casework internationally.
This is confirmed by various forensic experts from across Europe: “The work of Aitken and Lucy … is the basis of our method for e.g. calculating LRs in glass and is still used routinely” [5.1]; “His [Aitken’s] research papers and books are used as teaching material, [and] reference material to justify approaches in practical works such as forensic reports” [5.2]; “Results of Aitken’s research … are used in my daily forensic practice as a forensic expert in microtraces, i.e. in cases involving the analysis of glass fragments. I use it (… LR model) for the evaluation of the evidential value of results of glass analysis… The method is used in about seven cases per year…, about one hundred cases in total to date” [5.3].
An R package known as ‘comparison’, implementing the methodology of [3.1], was developed by Lucy; it has been downloaded around 25,000 times since 2014, indicating the acceptance of the methods by the applied community.
(ii) Research on cocaine traces on banknotes in [3.5-3.6] has been used to support the accreditation of a UK forensic science laboratory and to support expert evidence delivered in UK court cases.
The Scientific Director, Mass Spec Analytical Ltd (MSA), confirms this [5.4]: “The work … is routinely referred to in the supporting material [in] every court statement sent out by [MSA]…. As such, it is frequently subject to cross examination in Court, and the expert witnesses make reference to the peer reviewed papers regularly. It is difficult to quantify, but the experience of the court attending witnesses is that ensuring that our offering is on a sound scientific footing has greatly reduced the ‘attacks’ on our methodology.” and “the work demonstrating that the general circulation background samples do not vary greatly across the country has proved to be enormously useful following adverse criticism of the sampling strategy in the trial of Rashid et al in Sheffield Crown Court in 2015. This work is referred to in talks given to police officers for marketing purposes, and is included in the “In-house method validation” documentation provided to UKAS [the United Kingdom Accreditation Service] at Accreditation Assessment visits (the most recent being February 2020).”
More broadly, the research in [3.5] has changed the way in which forensic scientists at MSA think about the presentation of their forensic evidence, resulting in improvements in the administration of justice [5.4]. MSA provided evidence for around 200 cases per year in the UK (most of the UK cases featuring this evidence type). Key Forensics purchased the banknotes part of the business in January 2020 and has continued using the research as described above. In addition, Aitken has given evidence under oath as an expert witness, based on the research in [3.6], in two trials: R. v. Hussain and others (Snaresbrook Crown Court) and R. v. Parry and others (Liverpool Crown Court).
(iii) The research has influenced the framework in which scientific evidence is presented in court and is widely used to train forensic scientists and lawyers.
The book [3.4] is a well-cited authority on the role of statistics in the evaluation of evidence in forensic science. One of the main methodologies set out in the book is that of [3.1]. The influence of the book on forensic casework is illustrated in [5.1]: “The Bayesian framework explained in the book is the basis for evidence interpretation and evaluation in the casework of the NFI” and [5.6]: “The well-known text book… is the fundamental literature in this field”. This is further supported by the inclusion of the methodology set out in the book (e.g. paragraph 2.4 on p6) in the European Network of Forensic Science Institutes (ENFSI) guidelines [5.7], which set out a framework for reporting statistical evidence in forensic science. The book and the methodology it contains are used consistently by laboratories in ENFSI to train forensic scientists and lawyers [5.1, 5.2, 5.3, 5.6].
The practitioner guides [3.7] concern the communication and interpretation of statistical evidence in the administration of criminal justice. They set out the likelihood-ratio methodology of [3.1-3.4] for legal practitioners, e.g. Sections 2.17-2.19 in the first guide and Section 2.21 in the fourth guide. The influence of [3.1], [3.4] and [3.7] on forensic casework is corroborated in [5.3]: “The general ideas on application of LR approach in forensic sciences as expressed in [3.1] and [3.4] (and disseminated in the practitioner guides of the Royal Statistical Society and the ENFSI guidelines) are used routinely by me in the evaluation of evidential value of results of blood pattern analysis.” The practitioner guides have been used in court cases (e.g. Ivey v. Commonwealth in the Kentucky Supreme Court, 2014), indicating that they add value to the presentation of expert evidence in court (also see [5.4]). The Royal Statistical Society and the Inns of Court College of Advocacy published an introductory guide to statistics for barristers and advocates [5.5] describing the general approach to evidence evaluation set out in [3.4] and [3.7] (e.g. Section 1.7 and pp 66-68). As Chair, Aitken led the contribution from the Statistics and Law Section of the Society, and Wilson contributed as a committee member. This guide was recognised as a useful resource for training lawyers and judges in a 2019 House of Lords Science and Technology Committee inquiry, “Forensic science and the criminal justice system: a blueprint for change” [5.8, paragraph 132].
As evidence of the overall impact of the research in the forensic and legal communities, Aitken was awarded the Howard Medal of the Royal Statistical Society in 2018 [5.9] for work that is an “outstanding example of how a statistician can integrate with those in a substantive area”. Reasons for the award included the research on cocaine on banknotes [3.5, 3.6]. Further to this, [5.2] states: “I can testify that Professor Aitken’s research in general has deeply influenced the way a scientist approaches the evaluation of evidence and the way he/she presents evidence in a written report or during a testimony in front of a Court of Justice.”
5. Sources to corroborate the impact
[5.1] Letter of support from a Forensic Statistics expert at The Netherlands Forensic Institute in The Hague and Professor of Forensic Statistics (by special appointment) in the Institute of Mathematics at The University of Amsterdam.
[5.2] Letter of support from the Professor of Forensic Statistics at the Institute of Criminal Sciences at the University of Lausanne, Switzerland, the world's premier research institute in forensic science.
[5.3] Letter of support from Professor at the Institute of Forensic Research, Krakow and University of Silesia at Katowice, Poland
[5.4] Letter of support from the Scientific Director, Mass Spec Analytical Ltd., Bristol, UK
[5.5] “Statistics and probability for advocates: Understanding the use of statistical evidence in courts and tribunals” produced by the Royal Statistical Society and the Inns of Court College of Advocacy (2017)
[5.6] Letter of support from a Forensic Specialist in Statistics at the National Forensic Centre and Reader in Statistics at the University of Linköping, Sweden.
[5.7] European Network of Forensic Science Institutes Guideline for Evaluative Reporting in Forensic Science (2014), https://enfsi.eu/wp-content/uploads/2016/09/m1_guideline.pdf
[5.8] House of Lords Science and Technology Committee, 3rd report of session 2017-19, “Forensic science and the criminal justice system: a blueprint for change”. https://publications.parliament.uk/pa/ld201719/ldselect/ldsctech/333/333.pdf
[5.9] Royal Statistical Society Howard Medal awarded to Colin Aitken in 2018, https://rss.org.uk/news-publication/news-publications/2018/general-news/rss-announces-recipients-of-2018-honours/
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Environmental
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Mathematical modelling research by Prof White at the Maxwell Institute has been used to direct policy and practice to conserve red squirrels – a protected species in the UK – from invasive grey squirrels carrying squirrelpox virus. The research demonstrated that the red squirrel conservation policy used up to 2015 would not be sufficient to contain squirrelpox in the UK. Working with stakeholders, Prof White used this research to trigger a radical change in policy to protect red squirrels in priority areas in the UK, directly informing the ‘Scottish Strategy for Red Squirrel Conservation (2015)’ and the Red Squirrel Survival Trust policy to protect red squirrels on Anglesey.
Prof White’s research has also evaluated red squirrel population viability under different forest management scenarios. This research underpins commissioned reports produced by Prof White for the Forestry Commission that have had a direct impact on current and future forest management practices in the UK, balancing timber production and red squirrel conservation.
2. Underpinning research
Since its introduction into the UK, the grey squirrel has replaced the native red squirrel throughout most of England and Wales, and in parts of Scotland and Ireland. There are now only certain regions in which the red squirrel survives and maintaining these populations is a conservation priority. As such the red squirrel is a protected species in the UK (Wildlife and Countryside Act 1981) and was one of the first species identified for conservation under the UK Biodiversity Action Plan (Conserving Biodiversity – The UK Approach, DEFRA 2007). In addition to the threat grey squirrels cause to red squirrel conservation, they also negatively impact the health of the UK’s trees and woods through bark stripping, with greys causing significantly more damage than red squirrels. This has an impact on forestry and the UK’s policy to expand woodland cover. A European Squirrel Initiative report conservatively estimates a ‘GBP40 000 000 per annum timber loss from grey squirrels’ in England alone.
Prof White leads a research group based at the Maxwell Institute for Mathematical Sciences that works with ecological, conservation and forest management partners to develop spatial, deterministic and stochastic mathematical frameworks that can be used to address key issues related to red squirrel conservation. The mathematical modelling research provided evidence that squirrelpox infection was a key driver of the rapid replacement of red squirrels by greys in the UK (3.1). This study was the first to show that squirrelpox accelerates the process of replacement and therefore that disease was a causative factor in the decline of red squirrels in the UK. The deterministic model of competition and disease for the UK squirrel system (3.1) underpins a spatial, stochastic model that includes a representation of the real habitat composition taken from up-to-date digital habitat inventory maps (3.2-3.5). The models were developed in collaboration with conservation agencies and facilitated through a NERC Innovations Project Grant (NE/MO21319/1), allowing the modelling research to inform key policy and management decisions to protect red squirrels. This research work has been able to predict the spread of squirrelpox throughout Scotland and advise how grey squirrel control practice can be modified and targeted to specific locations to prevent squirrelpox expansion (3.2). The model system was also used to show that the exclusion of grey squirrels from priority regions for red squirrel conservation would be sufficient to protect reds from replacement (3.3). These novel results also showed that squirrelpox can spread from greys outside to reds inside a priority region leading to periodic outbreaks of infection (but not endemic persistence) in the red populations and that the red squirrel population density could return to pre-infection levels following an outbreak of infection (3.3).
The model complexity was increased to represent seasonal demographics for red squirrels and to include tree seed crop dynamics, and the model was used to assess red squirrel population viability under different forest management (felling and restocking) scenarios for red squirrel reserves in England (3.4). Results highlighted how red squirrel survival could be significantly increased by adjusting felling and restocking plans and by improving connectivity between adjacent forests. The research provided an exemplar of how modelling can help managers objectively balance the differing pressures of multipurpose forestry. The model was further enhanced by comparing results to field data from Anglesey, Wales, where grey squirrels were removed and red squirrels reintroduced (3.5). This allowed the model to include a validated representation of grey squirrel population control, with results interpretable in terms of ‘trapping effort hours’, the practical resource unit used in grey squirrel control management. This development led to commissioned research to assess the level of control required to protect red squirrels in key locations in the UK.
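Purely as an illustration of this modelling approach, the sketch below integrates a deterministic competition-plus-infection system of the general kind described in (3.1). The compartment structure is simplified and all parameter values are invented for the example; they are not the fitted values of the published work.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters only -- not the fitted values of (3.1).
a_r, a_g = 1.0, 1.2        # per-capita birth rates (red, grey)
b = 0.4                    # natural death rate
K_r, K_g = 60.0, 80.0      # carrying capacities
c_gr, c_rg = 1.65, 0.6     # competitive effect of greys on reds, and vice versa
beta = 0.2                 # squirrelpox transmission coefficient
alpha = 26.0               # disease-induced mortality in reds
gamma = 13.0               # recovery rate in greys (greys carry but survive)

def rhs(t, y):
    Sr, Ir, Sg, Ig, Rg = y                       # S/I reds; S/I/recovered greys
    Nr, Ng = Sr + Ir, Sg + Ig + Rg
    crowd_r = max(1 - (Nr + c_gr * Ng) / K_r, 0.0)   # competition felt by reds
    crowd_g = max(1 - (Ng + c_rg * Nr) / K_g, 0.0)   # competition felt by greys
    foi = beta * (Ir + Ig)                       # force of infection
    return [a_r * Nr * crowd_r - b * Sr - foi * Sr,
            foi * Sr - (b + alpha) * Ir,         # infection is usually lethal to reds
            a_g * Ng * crowd_g - b * Sg - foi * Sg,
            foi * Sg - (b + gamma) * Ig,
            gamma * Ig - b * Rg]                 # greys recover with immunity

# 25 years after a small infected grey population arrives in a red stronghold
sol = solve_ivp(rhs, (0, 25), [50.0, 0.0, 1.0, 0.1, 0.0], method="LSODA")
print("red density after 25 years: %.2f" % (sol.y[0, -1] + sol.y[1, -1]))
```

The asymmetry in the disease terms (reds die from infection, greys recover and act as a reservoir) is what allows squirrelpox to accelerate the replacement of reds by greys, the key mechanism identified in (3.1).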
3. References to the research
(3.1): Tompkins, D. M., White, A. R. & Boots, M. (2003). Ecological replacement of native red squirrels by invasive greys driven by disease. Ecology Letters. 6: 189-196. (doi.org/10.1046/j.1461-0248.2003.00417.x, Journal Impact factor (JIF) - 8.7).
(3.2): White, A., Lurz, P. W. W., Bryce, J., Tonkin, M., Ramoo, K., Bamforth, L., Jarrott, A. & Boots, M. 2016. Modelling disease spread in real landscapes: Squirrelpox spread in Southern Scotland as a case study. Hystrix, the Italian Journal of Mammalogy 27: 1. (doi.org/10.4404/hystrix-27.1-11657, JIF - 1.5).
(3.3): White, A., Bell, S.S., Lurz, P.W.W. and Boots, M. 2014. Conservation management within strongholds in the face of disease-mediated invasions: red and grey squirrels as a case study. Journal of Applied Ecology. 51: 1631-1642. (doi.org/10.1111/1365-2664.12274, JIF - 4.6).
(3.4): Jones, H. E. M., White, A., Geddes, N., Clavey, P., Farries, J., Dearnley, T., Boots, M. & Lurz, P. W. W. 2016. Modelling the impact of forest design plans on an endangered mammal species: the Eurasian red squirrel. Hystrix, the Italian Journal of Mammalogy 27: 1. (doi.org/10.4404/hystrix-27.1-11673, JIF - 1.5).
(3.5): Jones, H.E.M, White, A., Lurz, P.W.W., & Shuttleworth, C.M. 2017. Mathematical models for invasive species management: Grey squirrel control on Anglesey. Ecological Modelling. 359: 276-284. (doi.org/10.1016/j.ecolmodel.2017.05.020, JIF - 2.5).
4. Details of the impact
Impact on Red Squirrel Conservation Policy in Scotland
The mathematical research in (3.2, 3.3) led to commissioned reports [5.1, 5.2] undertaken by Prof White for Scottish Natural Heritage and the Forestry Commission Scotland which delivered management advice for the protection of red squirrel populations exposed to squirrelpox virus. Prior to this work, conservation policy had focussed on control in regions where grey squirrels tested positive for squirrelpox. This had reduced grey numbers in local areas but had not prevented disease spread or increased red squirrel numbers. The work showed that it is extremely challenging to prevent the spread of squirrelpox into areas where grey squirrels are already well-established and, importantly, that squirrelpox will not persist in red squirrel (only) populations. Saving Scotland’s Red Squirrels (SSRS), who are responsible for red squirrel conservation in Scotland, state [5.3] that Prof White’s work “had impact both on policy through inclusion in the Scottish Strategy for Red Squirrel Conservation (2015) [5.4] and practice as SSRS recognised the difficulty in containing squirrelpox as the trigger for a radical change of strategy that protects remaining red squirrel populations in Priority Areas for Red Squirrel Conservation (PARCs) in southern Scotland”. Furthermore, SSRS indicate that our “mathematical modelling work adds a scientific underpinning and rationale for the use of grey squirrel control to protect red squirrel population”, control which is coordinated by local communities in and around PARCs and has led to an increase in red squirrel distribution and abundance [5.3]. The modelling work continues to direct red squirrel conservation policy in Scotland through commissioned research undertaken by Prof White: for Saving Scotland’s Red Squirrels in 2017 [5.5], to determine the level and location of grey squirrel control needed to prevent further expansion of the grey squirrel distribution beyond the Highland protection line; and for the Forestry Commission Scotland in 2018 [5.6], to examine the level of grey control required to protect red squirrels in southern Scotland. The modelling research (3.2, 3.3) [5.1, 5.2, 5.8] was promoted by stakeholders and led to extensive coverage on BBC radio, BBC online and in national newspapers (The Times, The Herald, The Scotsman) [5.7].
Impact on Forest Management Practice in Northern England
Prof White’s modelling research (3.2, 3.3) was extended to answer specific red squirrel conservation issues faced by forest management practitioners. Working with stakeholders in the Forestry Commission (FC), future felling and restocking scenarios for Kidland and Uswayford forests (2,962 ha of forest in north-east England) were used to drive the mathematical model, leading to predictions of red squirrel population viability (3.4). Quoting the Planning and Environment Manager for the North England Forest District of the Forestry Commission [5.8]: “The findings in the report (3.4) highlighted how the (original) proposed forest design plans could lead to a risk of red squirrel extinction (particularly in Uswayford). As a result alternate felling and restocking regimes were developed by FC and tested in the model. The model results had a direct impact on the final forest design plans for the Uswayford and Kidland forest (for 2017 - 2041) which are outlined in the Cheviot Forest Plan (FC 2015) and which represent the optimum in terms of silviculture and red squirrel viability. The combined FC and Maxwell Institute modelling study therefore played a key role in the red squirrel conservation strategy in these forests.”
Impact on Grey Squirrel Control Strategy on Anglesey
Prof White further developed the model to include a realistic representation of grey squirrel control effort by fitting the model to field observations for the removal of grey squirrels and reintroduction of red squirrels on the Isle of Anglesey, Wales (3.5). The model showed that the most effective strategy to remove grey squirrels would focus the initial trapping effort on high-density regions and then spread the trapping effort to all regions where greys are present. The model also showed that the best strategy to prevent grey squirrel re-invasion of Anglesey would focus trapping efforts on the mainland side of the Britannia bridge and use high levels of targeted trapping on Anglesey when grey squirrel sightings are reported. The research findings were adopted in 2017 by the Red Squirrel Survival Trust (RSST) as the strategy to protect red squirrels on Anglesey; the RSST states [5.9] that “the modelling work continues to be used to determine the optimal strategy to prevent the re-invasion of Anglesey by grey squirrels and has therefore had an impact on the RSST conservation practice to protect red squirrels on Anglesey – which are still thriving. The modelling work also played a key role in providing evidence-based research to convince the public of the ongoing need to control grey squirrels”. The applicability of our mathematical research is emphasised by Dr Craig Shuttleworth of the Red Squirrel Survival Trust, who states [5.9] that “the collaboration between ecologists, conservation practitioners and mathematical modelling provided a template for how interdisciplinary research can provide solutions to real world problems for the protection of endangered species and therefore the maintenance of biodiversity in the UK.”
Impact on Forest Management Practice in Scotland
Forestry and Land Scotland (formerly the Forestry Commission Scotland) commissioned Prof White to adapt the model (3.4, 3.5) to determine the viability of 19 designated red squirrel strongholds in Scotland [5.10, 5.11]. The model compared red squirrel viability under stronghold forest management (SHM) plans (where tree species composition is altered to discourage grey squirrels) and UK Forest Standard (UKFS) management plans. The research showed that some designated strongholds were unsuitable for red squirrel protection, that other regions may provide natural strongholds (where reds are protected under UKFS plans) and that, north of the Highland red squirrel protection line, there was no benefit to red squirrel abundance under SHM compared to UKFS. Forestry and Land Scotland state [5.12] that “reconciling the SHM policy with other management objectives, fluctuating timber markets and wind-blow events, requires significant additional management input and can affect income from timber sales. Furthermore, the single-species focus of the SHM policy has consequences for other environmental work that aims to increase ecosystem diversity. The results of the research by Professor White are therefore important for Forestry and Land Scotland and will have a direct impact on management across large parts of the national forest estate. The research findings will allow a future focus on natural strongholds under the UKFS approach and this will afford more flexibility to conserve red squirrel populations whilst simultaneously delivering other forest management objectives.”
5. Sources to corroborate the impact
[5.1] White, A. and Lurz, P.W.W. 2014. A modelling assessment of control strategies to prevent/reduce Squirrelpox spread. Scottish Natural Heritage Commissioned Report No. 627 (see summary pp i-ii). https://www.nls.uk/e-monographs/2014/627.pdf
[5.2] Lurz, P.W.W., White, A., Meredith, A., McInnes, C. and Boots, M. 2015. Living with pox project: Forest management for areas affected by squirrelpox virus. Forestry Commission Scotland Commissioned Report (see Exec. Summary, pp 3-6). http://www.macs.hw.ac.uk/~awhite/LivingWithPoxReport.pdf
[5.3] Letter of support from the Project Manager for Saving Scotland’s Red Squirrels and the Manager of the Wildlife Management Team for Scottish Natural Heritage.
[5.4] Scottish Strategy for Red Squirrels. 2015. Scottish Squirrel Group (see Section 2.2) https://www.nature.scot/scottish-strategy-red-squirrel-conservation-june-2015
[5.5] White, A., Lurz, P.W.W and Boots, M. 2017. Grey squirrel control along the highland line. Report for Scottish Natural Heritage and the Scottish Wildlife Trust (see Exec Summary, p 3). http://www.macs.hw.ac.uk/~awhite/SWT_SNH_HighlandLine_FinalReport.pdf
[5.6] White, A. and Lurz, P.W.W 2018. Grey squirrel control in southern Scotland: A model analysis. Forestry Commission Scotland Commissioned Report (see Exec. Summary, p 3). http://www.macs.hw.ac.uk/~awhite/FES_SouthScotlandReport.pdf
[5.7] Media coverage. Weblinks at http://www.macs.hw.ac.uk/~awhite/squirrels/media.html
[5.8] Letter of support from the Planning and Environment Manager for the North England Forest District, Forestry Commission.
[5.9] Letter of support from the Conservation Advisor to the Red Squirrel Survival Trust.
[5.10] Slade, A., White, A. and Lurz, P.W.W. 2019. Evaluation of Red Squirrel Stronghold Forest Management. Forestry and Land Scotland Commissioned Report (see Exec. Summary, pp 1-2). http://www.macs.hw.ac.uk/~awhite/FLS_Stronghold_Evaluation_Report_Part1.pdf
[5.11] Slade, A., White, A. and Lurz, P.W.W. 2019. Identification of Natural Red Squirrel Strongholds. Forestry and Land Scotland Commissioned Report (see Exec. Summary, pp 1-2). http://www.macs.hw.ac.uk/~awhite/FLS_Stronghold_Evaluation_Report_Part2.pdf
[5.12] Letter of support from the Wildlife Ecologist for Forest and Land Scotland.
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Technological
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
Methods for the simultaneous estimation of epidemic dynamics and pathogen evolution developed by researchers at Heriot-Watt University were incorporated into a user-friendly computer package, BORIS, by scientists from the University of Melbourne. Since July 2018, the BORIS package has been used as an analytic tool within the New Zealand Ministry for Primary Industries (MPI) eradication programme for Mycoplasma bovis, a bacterial disease that affects dairy and beef cattle; the programme is estimated to cost a total of NZD886,000,000. Specifically, BORIS has been used to identify potential times and sources for observed infections so that risk factors for transmission, or potential failures in biosecurity, can be identified. The eradication programme has been highly effective, with only 4 premises out of more than 20,000 having active disease in July 2020 and infection having been successfully cleared from 246 premises by that time.
2. Underpinning research
The research leading to the impacts described concerns methods of inference for individual-based spatio-temporal models of epidemics and, in particular, methods that enable the estimation of unobserved transmission graphs (who infected whom) and times of infection events. Such insights can be key to understanding the factors underlying disease transmission when controlling emerging epidemics.
Building on their earlier work on fitting and testing spatio-temporal models for partially observed epidemics [3.1], Gibson, Streftaris and Marion (Biomathematics and Statistics Scotland), with PhD student Lau, tackled the problem of simultaneously inferring pathogen evolution and spatio-temporal epidemic transmission in settings where genetic sequence data on pathogens are available in addition to case data – a major current challenge in epidemic modelling. Motivated by pathogens such as foot and mouth disease (FMD), they extended the spatio-temporal Susceptible-Exposed-Infective-Removed (SEIR) model for epidemic transmission with a two-parameter Kimura model for pathogen evolution and introduced a novel Bayesian method for fitting the extended model to partial observations of an emerging epidemic [3.2]. By explicitly representing sequences transmitted during infection events as latent variables, and imputing sequences transmitted to unsequenced infections, they were able to dispense with the requirement that pathogen sequences be available for all infected host units – where a unit, in the context of diseases such as FMD, is typically a farm – making the methods applicable in real-world settings. Moreover, the methods can accommodate multiple introductions of an epidemic into a population, allowing imputation of a general infection graph rather than the infection tree implied by a single introduction. These enhancements were achieved using data-augmentation methods coupled with Markov chain sampling approaches. An important innovation in the method was the formulation of efficient proposal distributions for jointly updating unobserved times of, or sources for, infection events and the genetic sequence transmitted during the corresponding event. A key finding of [3.2] was that the methods were capable of identifying infection sources with high accuracy when epidemic data were complemented by even modest amounts of genomic data on pathogens.
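In schematic form (our notation, suppressing many details of [3.2]), the method targets an augmented posterior over the model parameters \(\theta\), the transmission graph \(\mathcal{G}\) (who infected whom), the unobserved exposure and infection times \(\mathbf{t}\), and the latent sequences \(\mathbf{s}\) transmitted at infection events:

\[
\pi(\theta, \mathcal{G}, \mathbf{t}, \mathbf{s} \mid \mathbf{y}) \;\propto\; L_{\mathrm{epi}}(\mathbf{y}_{\mathrm{epi}}, \mathbf{t}, \mathcal{G} \mid \theta)\; L_{\mathrm{gen}}(\mathbf{y}_{\mathrm{gen}}, \mathbf{s} \mid \mathcal{G}, \mathbf{t}, \theta)\; \pi(\theta),
\]

where \(\mathbf{y}\) collects the observed case data \(\mathbf{y}_{\mathrm{epi}}\) and the sampled pathogen sequences \(\mathbf{y}_{\mathrm{gen}}\); the epidemic term comes from the spatio-temporal SEIR model and the genetic term from the Kimura substitution model applied along the transmission graph. Markov chain Monte Carlo then jointly updates \(\theta\) and \((\mathcal{G}, \mathbf{t}, \mathbf{s})\), which is what allows farms without sequence data to be imputed rather than excluded.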
The methods of [3.2] were assessed in comparison with the main competing approaches in an independent study by a team led from the Division of Veterinary and Agricultural Sciences at the University of Melbourne and were found to be the most robust for reconstructing transmission trees from partial observations of disease incidence and pathogen genomics [5.1, 5.2]. The Melbourne-led team (with input from Lau) subsequently incorporated the algorithm presented in [3.2] – enhanced with the capacity to include contact data and farm-level covariates – into a user-friendly computer package [5.3], facilitating its use by groups working on the control of ongoing epidemics. Of particular relevance to this case study is the application of the methods to Mycoplasma bovis in New Zealand.
3. References to the research
[3.1] Lau, M. S. Y., Marion, G., Streftaris, G. & Gibson, G. J. (2014) New model diagnostics for spatio-temporal systems in epidemiology and ecology, J. R. Soc. Interface, 11: 20131093.
[3.2] Lau, M. S. Y., Marion, G., Streftaris, G. & Gibson, G. J. (2015) A Systematic Bayesian Integration of Epidemiological and Genetic Data, PLOS Computational Biology, 11(11): e1004633. https://doi.org/10.1371/journal.pcbi.1004633
4. Details of the impact
Mycoplasma bovis is a bacterial disease that affects dairy and beef cattle, causing severe illness with a major impact on production. It is a disease of major significance to the New Zealand farming industry, which represents the country’s second largest export sector. The New Zealand epidemic was first detected in 2017, following which a major eradication programme was announced in May 2018 [5.4]. The total cost of the programme was estimated to be NZD886,000,000 (05-2018) [around GBP470,000,000], including NZD16,000,000 in lost production and NZD870,000,000 of response costs needed to fight the cattle disease over a 10-year period [5.4]. According to [5.4], had the epidemic been allowed to proceed unchecked, the cost would have been approximately NZD1,300,000,000 (05-2018) [around GBP689,000,000] over 10 years, with ongoing productivity losses across the New Zealand farming sector.
As part of a wider programme of epidemiological work that included support (approximately AUD70,000) from New Zealand’s Ministry for Primary Industries (MPI), Firestone and colleagues developed a computer package, BORIS (Bayesian Outbreak Reconstruction Inference and Simulation) [5.2, 5.3], which incorporated the methods of [3.2] with the additional capacity to accommodate contact data and farm-level covariates. According to the team who developed BORIS, “The main benefits provided by your algorithm derive from its capacity to fit epidemic dynamics and pathogen phylodynamics within an integrated framework, to represent multiple introductions, and to cope with partial sampling scenarios – essentially allowing the state of unobserved nodes in a network to be inferred” [5.5].
The BORIS package, code for which is now freely available from GitHub [5.3], has been applied within the M. bovis eradication programme since July 2018 [5.2, 5.6], as one of a suite of analytic tools, to analyse epidemiological and genomic data in order to inform the government of the timing of the likely introduction of M. bovis into New Zealand, whether there were multiple introductions, and the extent to which the case network could be determined. Specifically, BORIS is used to infer the timing of infections and infectious periods for farms, and to identify likely transmission routes (who-infected-whom) for the ongoing epidemic. Such insights serve to identify possible infector-infectee links, and corresponding time periods, that should be scrutinised in order to understand the potential risk factors, or failures in biosecurity, that may lead to transmission of the disease. Unlike competing genomic methods, BORIS has the capacity to impute chains of infection that include farms for which genetic data on the pathogen may not be available. Results of these analyses are key to providing confidence to MPI in their understanding of the outbreak [5.5, 5.6]. Since the eradication programme was initiated, the epidemic has been successfully controlled to the extent that, as of July 2020 (resp. November 2020), there were only 4 (resp. 6) properties (out of more than 20,000 dairy and beef farms) [5.5] in New Zealand where the disease was known to be active; by that time infection had been successfully cleared from 246 premises. BORIS continues to be used as a decision-support tool for guiding surveillance teams within the programme as it moves towards eradication of the disease [5.6].
More widely, the techniques of [3.2] have been promoted through a training programme on the use of the BORIS package, delivered by Firestone to Australian government veterinarians at the Department of Agriculture, Water and the Environment (Canberra, November 2019). To date 20 scientists have received training in its use [5.2]. BORIS has also helped improve understanding of the spread of foot-and-mouth disease in Japan [5.7], for example by demonstrating the high transmissibility of the disease from farms holding predominantly pigs.
5. Sources to corroborate the impact
[5.1] Firestone, S. M., Hayama, Y., Bradhurst, R. et al. Reconstructing foot-and-mouth disease outbreaks: a methods comparison of transmission network models. Sci Rep 9, 4809 (2019). https://doi.org/10.1038/s41598-019-41103-6 Provides a critical assessment of methods of [3.2] in comparison to competing methods.
[5.2] Letter of support from Senior Lecturer, Faculty of Veterinary and Agricultural Sciences, University of Melbourne. Corroborates details relating to the implementation of the techniques of [3.2] within the BORIS package, the benefits they bring, how they have been applied to M bovis and FMD, and training given in the use of the methods.
[5.3] Firestone S. M., Lau M. S. Y., Kodikara S., Demirhan H., Hayama Y., Yamamoto T., et al. (2019) BORIS: R package for Bayesian Outbreak Reconstruction Inference and Simulation. GitHub repository, https://github.com/sfires/BORIS
[5.4] Press release, New Zealand Government, 28/5/2018, “Plan to eradicate Mycoplasma bovis”, https://www.beehive.govt.nz/release/plan-eradicate-mycoplasma-bovis
Announcement of M bovis eradication plan including estimated costs and scale given in Section 4.
[5.5] Press release, New Zealand Government, 22/7/2020, “M. bovis eradication makes gains three years on from detection”, https://www.beehive.govt.nz/release/mbovis-eradication-makes-gains-three-years-detection.
Provides details of status of epidemic in July 2020 including number of properties where M bovis was active.
[5.6] Letter of support from Chief Departmental Science Advisor, Ministry for Primary Industries (MPI), New Zealand. Corroborates use of BORIS within the ongoing M Bovis eradication programme and effectiveness of the programme.
[5.7] Firestone SM, Hayama Y, Lau MSY, Yamamoto T, Nishi T, Bradhurst RA, et al. (2020) Transmission network reconstruction for foot-and mouth disease outbreaks incorporating farm-level covariates. PLoS ONE 15(7): e0235660. https://doi.org/10.1371/journal.pone.0235660
Describes use of BORIS package for the analysis for reconstructing transmission networks for foot and mouth disease in Japan.
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Societal
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
STACK is online assessment software for mathematics and other STEM disciplines. As of December 2020, STACK is used across UK HE within over 1300 registered learning management systems, including at the Open University. Translated into 20 languages, STACK is used by over 30 universities in Germany, by every university in Finland, and in Japan, Israel, Kenya and the USA. STACK is used in international development and in commercial school textbooks. Through all of these instances it has improved the learning of hundreds of thousands of students, has supported institutions in pivoting to online learning during the COVID-19 pandemic, and saves tens of thousands of hours of human marking annually. The underpinning research has directly influenced the development of many similar/competitor products.
2. Underpinning research
Assessment and feedback are essential for effective learning. Bloom's work on "Mastery learning" in the 1980s found that the gap between tutors and traditional large-group teaching could be closed by regular testing with carefully designed mastery tests. Using online assessment provides an opportunity to implement Bloom's ideas; however, online assessment has predominantly relied on multiple-choice tests. Multiple-choice questions are particularly problematic in mathematics because, in addition to elimination and guessing, students can reverse-engineer many questions. Research with STACK found that, when faced with a reversible mathematical process, students solve a multiple-choice version by verifying the answers presented to them by the direct method, not by undertaking the actual inverse calculation (for example, checking candidate antiderivatives by differentiating each option rather than integrating); see [3.1].
Development of STACK is based on the observation that authentic and valid assessment requires the student to provide an answer which contains substantial content, rather than using a multiple choice (or similar) question. Students must input mathematical expressions, and complete mathematical arguments; the software establishes objective properties of those arguments and provides outcomes including feedback and statistics. This technology is essential in developing fully online courses, thereby changing teaching structures, which move away from large groups to enable students to progress at an individual pace.
Professor Sangwin initiated the STACK project and platform from scratch, and subsequently worked in partnership with other institutions, such as the Open University and Loughborough University, to develop different aspects of STACK. His leadership has enabled the research from other universities to have a global reach. Since 2015, the University of Edinburgh has been the home of the STACK project; we hosted the second international STACK conference in April 2019.
STACK focuses on using computer algebra to provide teachers with very carefully designed tools to enable online assessment. Key research at the Maxwell Institute includes [3.2], which expanded the computer algebra features to include dimensional numerical quantities, based on SI units and fine-tuned to the needs of assessment. Further work carried out at the Maxwell Institute [3.3] expanded the functionality to include algebraic line-by-line reasoning. Reasoning by equivalence is where an equation is manipulated to generate a new and equivalent equation, typically ending when the equation is solved. Professor Sangwin’s research [3.4] found that approximately a third of the method marks in final high school mathematics examination questions are awarded for reasoning by equivalence. There are many other forms of reasoning, but reasoning by equivalence forms a stepping stone to advanced proof [3.5]. In STACK a student's whole argument then becomes a single mathematical object (just as an equation can be treated as a single object) which can be subjected to formal verification. The research reported in [3.2, 3.3, 3.4] led directly to significantly extended functionality.
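STACK itself is built on the Maxima computer algebra system; purely as an illustration of the property-based marking idea, the sketch below uses Python with SymPy instead, and the function names and returned properties are our own invention rather than STACK's API.

```python
import sympy as sp

x = sp.symbols('x')

def assess(student_str, model_answer):
    """Illustrative property checks of the kind STACK performs via Maxima."""
    try:
        student = sp.sympify(student_str)            # parse the student's input
    except sp.SympifyError:
        return {"valid": False}                      # invalid syntax: ask to re-enter
    equivalent = sp.simplify(student - model_answer) == 0
    factored = student == sp.factor(student)         # is the answer already factored?
    return {"valid": True, "equivalent": equivalent, "factored": factored}

model = sp.factor(x**2 - 5*x + 6)                    # model answer: (x - 3)*(x - 2)
print(assess("(x - 2)*(x - 3)", model))              # equivalent and factored
print(assess("x**2 - 5*x + 6", model))               # equivalent but not factored
```

Separating the objective properties of an answer (validity, equivalence, algebraic form) from the marks and feedback attached to each outcome reflects the design described above: the computer algebra system establishes the properties, and the teacher decides what each property is worth.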
3. References to the research
[3.1] C. J. Sangwin and I. Jones. Asymmetry in student achievement on multiple choice and constructed response items in reversible mathematics processes. Educational Studies in Mathematics, 94:205–222, 2016. https://doi.org/10.1007/s10649-016-9725-4
[3.2] C. J. Sangwin and M. Harjula. Online assessment of dimensional numerical answers using STACK in science. European Journal of Physics, 38, 2017. https://doi.org/10.1088/1361-6404/aa5e9d
[3.3] C. J. Sangwin. Reasoning by Equivalence: The Potential Contribution of an Automatic Proof Checker. In: Hanna G., Reid D., de Villiers M. (eds) Proof Technology in Mathematics Research and Teaching. Mathematics Education in the Digital Era, vol 14. Springer, Cham. 2019. https://doi.org/10.1007/978-3-030-28483-1_15
[3.4] C. J. Sangwin and N. Kocher. Automation of mathematics examinations. Computers and Education, 94:215–227, 2015. https://doi.org/10.1016/j.compedu.2015.11.014
[3.5] C. J. Sangwin and Bickerton, R. Practical Online Assessment of Mathematical Proof. Accepted, International Journal of Mathematical Education in Science and Technology, (2020) http://arxiv.org/abs/2006.01581
4. Details of the impact
STACK is contemporary assessment software for mathematics. It is installed on the learning management systems of over 1300 registered websites, with groups of up to 1500 students [5.1]. Extrapolating, we estimate 1.2 million students worldwide are benefitting from this methodology. Having been translated into 20 languages, STACK is used in over 30 universities in Germany, every university in Finland and many other countries including Japan, Israel, Kenya and the USA [5.2, 5.3, 5.4]. The Abacus consortium, established to share STACK and other mathematics assessments, currently has 38 partners [5.5], further extending the reach of STACK.
In comparison with other online assessment software, STACK offers unique additional functionality that changes the way both learners and teachers engage with learning and assessment. Students enter mathematical expressions and STACK assesses the properties of these expressions; teachers author their own questions; STACK can generate questions with random variables, to reduce copying of answers; and STACK allows teachers to give partial marks and tailored feedback depending on the different mathematical properties of the students’ answers (see the sketch below). STACK supports multipart questions, enabling teachers to write step-by-step questions, and automates the “question testing” process, enabling robust questions with long-term support. This means that there is almost double the assessment per student using half the staff time, increasing the efficiency of staff while improving student learning [5.3]. The research described in [3.4] has led directly to uptake for physics textbooks by the commercial textbook publisher Physics Curriculum & Instruction, and more generally to wider use in STEM subjects. The publisher writes: “Instructors using our system have stated that students have greater success with solving physics problems, and gain deeper understanding into problem-solving strategies, when compared to traditional homework environments of working alone without computer aided assistance” [5.6].
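As a toy illustration of random variants combined with tailored partial credit (again in Python/SymPy rather than STACK's own Maxima-based question language; all names, mark values and feedback strings here are invented):

```python
import random
import sympy as sp

x = sp.symbols('x')

def make_question(seed):
    """Each student gets a different variant, as with STACK's random variables."""
    rng = random.Random(seed)
    a, b = rng.randint(1, 5), rng.randint(1, 5)
    prompt = f"Expand (x + {a})*(x + {b})."
    return prompt, sp.expand((x + a) * (x + b)), a, b

def grade(student_str, answer, a, b):
    """A miniature response tree: return a mark plus feedback tailored to the error."""
    student = sp.sympify(student_str)
    if sp.simplify(student - answer) == 0:
        if student == sp.expand(student):
            return 1.0, "Correct."
        return 0.7, "Equivalent, but expand your answer fully."
    if sp.simplify(student - sp.expand((x + a) * (x - b))) == 0:
        return 0.3, "Check the sign in your second bracket."   # targeted feedback
    return 0.0, "Incorrect: try multiplying out term by term."

prompt, answer, a, b = make_question(seed=42)
print(prompt)
print(grade(f"x**2 + {a + b}*x + {a * b}", answer, a, b))       # full marks
```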
Combining online assessment with human marking saves institutions thousands of hours of work each year: the University of Edinburgh estimates that the School of Mathematics saves over 8,300 hours of work annually, across 7,500 accounts, equating to over GBP200,000 of savings year on year. This is replicated for STACK users across the world, for instance: “STACK is used for both summative and formative assessment. Over 1.2 million STACK questions were answered by Open University students in the 2019/20 academic year. This saves a significant amount of staff time each year. Marking this volume of questions, even if only taking 5 seconds per question would take approximately 1 year of staff time each year” [5.7].
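As a rough check of the arithmetic in the quotation above:

\[
1.2\times10^{6}\ \text{questions} \times 5\ \text{s} \;=\; 6\times10^{6}\ \text{s} \;\approx\; 1{,}667\ \text{hours},
\]

which is indeed of the order of one full-time working year.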
As a result, STACK enables institutions to redeploy human resource and maintain quality despite growing student numbers. The University of Edinburgh consolidated online assessment in most year 1 and 2 mathematics and general science courses with the support from a dedicated learning technologist [5.8]. The University of Durham established a similar post in 2020 [5.9], and University College London appointed two people to 3-year posts to create STACK questions. This has changed the way courses in mathematics and science are delivered across these institutions. The Head of Department, Mathematics, UCL writes: “The impact of this change of assessment on our year 1 students, most of whom will end up studying the whole year remotely, is immense. They get much more detailed feedback than they would have had in a normal year; and this is vital when we're exactly at the time when the support structures (such as informal contact with tutors and peers) that would have formed the academic safety net have been damaged by the pandemic” [5.10].
The novel work described in [3.3] extends assessment capabilities to entire mathematical arguments in a well-defined class, by assessing the students’ working itself. Feedback is provided to increase students’ understanding of mathematics, improve students’ experience, and raise their competence. The Open University comments: “Whilst it is difficult to isolate the effect of STACK … modules within Mathematics and Statistics enjoy some of the highest student satisfaction ratings within the University. The use of STACK questions within modules is often praised by students in unsolicited comments within end-of-module feedback” [5.7].
In 2020, to mitigate the effects of COVID-19 on teaching, the Maxwell Institute ran 12 practical workshops on question authoring, attended by over 350 academic staff from across the world, and presented 15 online talks about STACK, the underlying research, and how to use this technology in teaching [5.11]. These “seminars and conference talks have guided and challenged colleagues’ thoughts on the nature of assessment and how to assess different mathematical topics, particularly for distance-learning students. The most recent example of this was Chris Sangwin’s online workshop on the assessment of mathematical proof, given in September 2020” [5.7]. As a further response to COVID-19, STACK was used in December 2020 to replace university examinations; in Edinburgh this reduced the marking time in one module alone from 35 person-days to 22 person-days, through a combination of automatically marked STACK assessments and human marking of more complex proofs.
In addition, the research underpinning STACK has influenced the design of other successful online assessment systems, particularly the open source NUMBAS system and “Learning Algebra” from Haese Mathematics, [5.12].
5. Sources to corroborate the impact
[5.1] Data on number of sites using STACK (increasing from 134 in August 2013 to 1352 in December 2020), number of languages into which STACK has been translated, and numbers of student users https://moodle.org/plugins/stats.php?plugin=qtype_stack.
[5.2] 11 published case-studies of worldwide STACK use – https://www.maths.ed.ac.uk/~csangwin/stack/2019-cate-case-studies.pdf or https://stack-assessment.org/CaseStudies/.
[5.3] An overview of the STACK assessment system can be found at https://stack-assessment.org/. The STACK demonstration site can be viewed at https://stack-demo.maths.ed.ac.uk/demo/ and the underlying source code at https://github.com/maths.
[5.4] Letter from Director, IDEMS International Community Interest Company.
[5.5] Collective of 35 international universities devoted to using STACK in teaching https://abacus.aalto.fi and letter of support from the Abacus coordinator.
[5.6] Commercial use in School Physics textbooks, e.g. https://stack-assessment.org/CaseStudies/2019/PhysicsCurriculum/ and letter of support from the publisher, Physics Curriculum and Instruction.
[5.7] Letter from Director of Teaching, at the School of Mathematics and Statistics and the Head of School of Physical Sciences at the Open University outlining the centrality of STACK in their mathematics provision.
[5.8] C. J. Sangwin and K. Zerva. Developing online learning materials to support undergraduate education at the University of Edinburgh. Mathematics Today, 212-215, 2020.
[5.9] Letter from Head of Department of Mathematical Sciences, Durham University detailing their use of STACK for e-assessment.
[5.10] Letter from Head of Department of Mathematics, UCL, indicating the impact of STACK in teaching, particularly during the pandemic.
[5.11] Agenda from meetings and conferences (e.g. international STACK users conference, Germany, Nov 2018).
[5.12] Letter from Haese Mathematics publishers describing the significant influence published research about STACK has had on the design of their proprietary “Learning Algebra” software.
- Submitting institution
- Heriot-Watt University, University of Edinburgh (joint submission)
- Unit of assessment
- 10 - Mathematical Sciences
- Summary impact type
- Economic
- Is this case study continued from a case study submitted in 2014?
- No
1. Summary of the impact
The research was carried out with Aberdeen Standard Investments (ASI) to design and implement new diversification algorithms within an innovative framework, called Dimensionality, which creates multi-asset portfolios with better performance under adverse market conditions. As a result, the Macro Systematic Dimensions Fund was launched in July 2019 and had USD33,608,897 under management as at 31 December 2019 [5.1]. According to ASI, funds using Dimensionality performed better in the financial turmoil of early 2020 than those using other common approaches, such as Risk Parity or Minimum Variance [5.4]. Direct beneficiaries of this work include ASI, with the launch of a new product (fund), and their clients.
With ASI, the investment arm of the UK’s second-largest asset management firm, Standard Life Aberdeen, rolling out this product while simultaneously expanding education among institutional investors [5.6], the UK’s asset management sector has increased its global resilience to economic shock.
2. Underpinning research
The global financial crisis of 2008 led to heavy losses for most asset portfolios held by institutional investors, prompting investors to question their portfolio construction methodologies and their understanding of the level of diversification that can be achieved. A common belief is that, in a well-diversified portfolio, risk should not be concentrated in only a few risk factors and the tail risk of the portfolio should be controlled.
The research identified a new framework for portfolio diversification which goes beyond the classical mean-variance approach and portfolio allocation strategies such as risk parity [3.1]. It is based on a novel concept, devised during the collaboration with ASI, called portfolio dimensionality, which connects diversification to the unpredictability and non-standard nature of portfolio returns and can typically be defined in terms of a ratio of risk measures which are homogeneous functions of equal degree. Maximising portfolio dimensionality leads to highly non-trivial optimization problems with objective functions which are typically non-convex and potentially have multiple local optima. Two complementary global optimization algorithms were developed [3.1].
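Schematically (this is our simplified formulation, not the precise definition used in [3.1]), the problem has the form

\[
\max_{w \in \Delta^{n-1}} \; D(w) \;=\; \frac{\rho_1(w)}{\rho_2(w)}, \qquad
\Delta^{n-1} \;=\; \Big\{ w \in \mathbb{R}^n : w_i \ge 0,\ \textstyle\sum_{i=1}^n w_i = 1 \Big\},
\]

where \(w\) is the vector of portfolio weights and \(\rho_1, \rho_2\) are risk measures that are positively homogeneous of the same degree, so that the objective is scale-invariant (\(D(\lambda w) = D(w)\) for \(\lambda > 0\)) but typically non-convex in \(w\), with multiple local optima.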
- For problems of moderate size/dimension, a deterministic branch-and-bound algorithm was developed (see the first sketch after this list). Branch and bound is the work-horse of discrete optimization, although it is rarely used for solving continuous optimization problems. One reason is that the efficiency of the algorithm depends crucially on the derivation of tight upper and lower bounds on the objective function over subsets of the solution space, which are often difficult to obtain for non-convex problems, as well as on a suitable approach to subdividing the solution space. As the initial solution space is given as a simplex, simplicial decomposition was chosen for the latter. Concerning the former, new continuous and combinatorial bounds were developed for the problem; these turned out to be considerably tighter than the ones previously obtained and resulted in a significant reduction in the number of iterations needed to solve the problem to optimality (within a given error bound).
- For problems of larger size/dimension, a stochastic global optimization algorithm based on Gradient Langevin Dynamics was developed (see the second sketch after this list). This relies on recent state-of-the-art work on Stochastic Gradient Langevin Dynamics [3.2] and on variants of the Unadjusted Langevin Algorithm [3.3] within the framework of nonconvex stochastic optimization. The theoretical underpinnings of these algorithms rely on the analysis of the Langevin stochastic differential equation and the properties of its numerical schemes [3.4].
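A minimal sketch of a simplicial Branch and Bound loop is given below. The `upper_bound` function is a hypothetical placeholder: the tight continuous and combinatorial bounds derived in [3.1] are where such a function would come from, and they are not reproduced here.

```python
import heapq
import numpy as np

def branch_and_bound(f, upper_bound, vertices, tol=1e-4, max_iter=1000):
    """Maximise f over the simplex spanned by `vertices` (an (n+1, d) array).

    `f` is evaluated at candidate points to update the incumbent (a lower
    bound on the maximum); `upper_bound` is a hypothetical user-supplied
    function returning an upper bound on f over a sub-simplex.
    """
    best_x = vertices.mean(axis=0)            # centroid as first incumbent
    best_val = f(best_x)
    counter = 0                               # tie-breaker for the heap
    heap = [(-upper_bound(vertices), counter, vertices)]
    for _ in range(max_iter):
        if not heap:
            break                             # nothing left to explore
        neg_ub, _, simplex = heapq.heappop(heap)
        if -neg_ub <= best_val + tol:         # best remaining bound: done
            break
        # subdivide by bisecting the longest edge of this sub-simplex
        n = len(simplex)
        i, j = max(((a, b) for a in range(n) for b in range(a + 1, n)),
                   key=lambda e: np.linalg.norm(simplex[e[0]] - simplex[e[1]]))
        mid = 0.5 * (simplex[i] + simplex[j])
        for k in (i, j):
            child = simplex.copy()
            child[k] = mid
            cand = child.mean(axis=0)
            val = f(cand)
            if val > best_val:                # update incumbent
                best_val, best_x = val, cand
            ub = upper_bound(child)
            if ub > best_val + tol:           # prune unpromising children
                counter += 1
                heapq.heappush(heap, (-ub, counter, child))
    return best_x, best_val
```

How quickly the loop terminates is governed by the tightness of `upper_bound`, which is precisely what the new continuous and combinatorial bounds of [3.1] improve.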
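Below is a minimal sketch of the basic gradient Langevin iteration of the kind analysed in [3.2, 3.3]. The step size, the inverse temperature `beta` and the toy double-well objective are illustrative assumptions, not values from the research; a maximisation problem would be handled by minimising the negated objective.

```python
import numpy as np

def sgld(grad_estimate, theta0, step=1e-3, beta=1e8, n_steps=10_000, seed=0):
    """Stochastic Gradient Langevin Dynamics for minimising a (possibly
    non-convex) objective U.  Each iteration applies

        theta <- theta - step * g(theta) + sqrt(2 * step / beta) * xi,

    where g is an (unbiased) estimate of grad U and xi is standard Gaussian
    noise.  Large beta concentrates the stationary law near minimisers of U,
    while the injected noise lets the iterates escape local minima.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(n_steps):
        noise = rng.standard_normal(theta.shape)
        theta -= step * grad_estimate(theta)
        theta += np.sqrt(2.0 * step / beta) * noise
    return theta

# Toy usage on the non-convex double-well U(x) = (x**2 - 1)**2:
grad = lambda x: 4.0 * x * (x ** 2 - 1.0)
print(sgld(grad, theta0=[2.0]))   # ends near one of the minimisers +/-1
```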
3. References to the research
[3.1] Barkhagen, M, Fleming, B, Quiles, SG, Gondzio, J, Kalcsics, J, Kroeske, J, Sabanis, S & Staal, A 2019, 'Optimising portfolio diversification and dimensionality', arXiv preprint. https://arxiv.org/abs/1906.00920
[3.2] Chau, NH, Moulines, É, Rásonyi, M, Sabanis, S & Zhang, Y 2019, 'On stochastic gradient Langevin dynamics with dependent data streams: the fully non-convex case', arXiv preprint. https://arxiv.org/abs/1905.13142
[3.3] Brosse, N, Durmus, A, Moulines, É & Sabanis, S 2019, 'The Tamed Unadjusted Langevin Algorithm', Stochastic Processes and their Applications, vol. 129, no. 10, pp. 3638-3663. https://doi.org/10.1016/j.spa.2018.10.002
[3.4] Sabanis, S 2016, 'Euler approximations with varying coefficients: the case of superlinearly growing diffusion coefficients', Annals of Applied Probability, vol. 26, no. 4, pp. 2083-2105. https://doi.org/10.1214/15-AAP1140
4. Details of the impact
Standard Life Aberdeen PLC is a multinational investment company. It operates in asset management through its subsidiary ASI, offering equities, multi-asset, fixed income, real estate, and private market funds. Based on the outcomes of this research, and developed as a joint research effort of ASI and the Maxwell Institute (MI) team co-funded by ASI, a fund was launched by ASI in July 2019 under the name 'Standard Life Investments Global SICAV – Macro Systematic Dimensions Fund' (Bloomberg ticker: SLMSDDU LX Equity). The fund had USD33,608,897 under management as at 31 December 2019 [5.1].
The holdings of the Fund are not selected with reference to a benchmark index; instead, its performance is compared with the Secured Overnight Financing Rate ("SOFR") [5.2].
The investment team applies a systematic approach to portfolio construction. Their primary focus is to use evidence-based, data-driven quantitative approaches to identify investment ideas within four categories: equity and credit; interest and inflation rates; relative value; and others, such as the volatility of investment markets. Such portfolios receive substantial investment from institutional players, including large pension funds, as a way of achieving healthy growth in their balance sheets. For these investors, holding well-diversified (and thus resilient) portfolios that can cope in times of financial turbulence is of utmost priority. According to the Head of Macro Systematic Strategies Research at ASI, “The Multi-Asset Solutions team within Aberdeen Standard Investments considers this methodology to be a unique selling point, both as a diagnostic tool as well as a differentiated way to think about portfolio construction and diversification. The feedback from our clients has been very positive” [5.3]. This underlines the significance of the research for the company and its clients.
Moreover, according to ASI, the research has helped the company to develop an investment philosophy which underpins other products (funds, segregated mandates and sub-exposures in existing funds) and, importantly, to differentiate itself from the competition. ASI confirmed: “The experience in early 2020 when financial markets experienced some of the most extreme and unusual behaviour since the great depression and the financial crisis of 2008, has re-enforced the interest in risk management, portfolio diversification and risk mitigation. For those funds that deployed the methodology developed during the collaborative research agreement, we have retrospectively compared the portfolio construction methodology with other common approaches, such as Risk Parity or Minimum Variance and found the Dimensionality approach to be superior” [5.3].
The joint research article [3.1] was co-presented by ASI and Sabanis at one of the most prestigious conferences for industry and academia in Financial Mathematics, the SIAM Conference on Financial Mathematics & Engineering (FM19), which took place in Toronto [5.4]. The joint research efforts were also highlighted by ASI at the Cboe (Chicago Board Options Exchange) Risk Management Conference, which took place in Munich [5.5].
The fund was awarded Fund Launch of The Year at the 5th annual EQDerivatives Awards (the award event itself was postponed due to the global health emergency). According to the award committee, “Not only is The Macro Systematic Dimensions Fund unique compared to peers when you take into consideration the innovative portfolio construction methodology and strategy, it is also extremely relevant now as global institutional investors we have spoken to are increasingly seeking systematic strategies that integrate macro views given the backdrop of an evolving market landscape. What was also a key differentiating factor has been your own contribution to expanding education among institutional investors across the globe, not only in a practical context, but in academia also. What has also been very positive is the feedback we have received on the 'Dimensionality' methodology - it truly is unique and how it taps into four unconnected return streams” [5.6]. Thus, the significance of this research is recognised by independent experts in both academia and industry.
5. Sources to corroborate the impact
[5.1] 2019 Standard Life Investments Global SICAV Audited Annual Report and Accounts, issued on 26 March 2020. Pages 276-285 refer to funds under management at 31 December 2019, https://www.aberdeenstandard.com/docs?editionId=3a93d563-06a6-4881-ba0c-5642a17fdc58
[5.2] ASI Key Investor Information for the Standard Life Investments Global SICAV Macro Systematic Dimensions Fund, https://www.aberdeenstandard.com/docs?editionId=c6ee80f7-a1c0-4204-9a4c-8a333d5f1e01
[5.3] Statement from Head of Macro Systematic Strategies Research at Aberdeen Standard Investments.
[5.4] UoE/ASI joint presentation https://meetings.siam.org/sess/dsp_programsess.cfm?SESSIONCODE=67304 at the SIAM Conference on Financial Mathematics & Engineering (FM19) https://www.siam.org/conferences/cm/conference/fm19
[5.5] ASI presentation (https://assets.website-files.com/5ab57ef41738f46d15802683/5d7fa010bda9c63725d1459f_Day%201%20Session%204%20Jens%20Kroeske_final.pptx) at the Cboe (Chicago Board Options Exchange) Risk Management Conference, September 2019, https://www.cboermceurope.com/
[5.6] Email from EQDerivatives to the quantitative investment director within the multi-asset macro systematic strategy & risk team at ASI, confirming that the fund was awarded Fund Launch of The Year at the 5th annual EQDerivatives Awards, https://armanios.co.uk/dev/eqd/events/5th-annual-eqderivatives-awards