Simulation-based medical education (SBME) is often resource-intensive, yet rigorous evaluation of the costs and benefits of simulation programmes or activities is rare. Lin et al.1 raise the important issue of economic evaluation of SBME and offer guidance on its implementation by exploring challenges and opportunities. We offer support for discipline in the economic evaluation of simulation while urging caution in what and how we ascribe value as integral to the economic evaluation process.

Simulation educators mostly work in resource-constrained environments, which encourages reflection on the value of their work. Of course, we’re not the only ones to find this challenging. The concept of value has long been an area of interest to social scientists, economists and philosophers. A shared challenge is that value is contextual and varies within individuals and over time.

Maloney and Haines describe economic evaluation as a comparison of value, which can be paraphrased as an assessment of what is being obtained, what is given up to get it, and how it compares with the next best alternative.2 Lin et al.1 state that economic evaluation “must examine both costs and consequences of at least two alternatives”. This sounds reasonable, so why don’t we do it? Economic evaluations of educational programmes are rarely reported in the research literature. Key reasons may include their absence from professional-entry medical and health care programmes (unless they surface in public health-focused areas) and from graduate programmes in health professions education.
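As a minimal illustration of the comparison Lin et al.1 describe, the arithmetic can be sketched as an incremental cost-effectiveness ratio between two alternatives. The figures and names below are entirely hypothetical, chosen only to show the shape of the calculation, not drawn from any published evaluation.

```python
# Sketch of an incremental cost-effectiveness comparison between two
# (hypothetical) teaching alternatives: what extra is obtained, and at
# what extra cost, relative to the next best alternative.

def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental cost-effectiveness ratio of alternative A over B:
    the additional cost per additional unit of effect."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# Hypothetical per-cohort figures: delivery cost and number of
# learners reaching a defined competence standard.
simulation = {"cost": 12000.0, "competent_learners": 24}
lecture = {"cost": 3000.0, "competent_learners": 15}

extra_cost_per_competent_learner = icer(
    simulation["cost"], simulation["competent_learners"],
    lecture["cost"], lecture["competent_learners"],
)
print(extra_cost_per_competent_learner)  # 1000.0
```

The ratio alone decides nothing; it only makes explicit what is given up to obtain the extra effect, leaving the judgement of whether that trade is worthwhile to stakeholders.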

Economic evaluation is not a specified competence of clinical educators, nor is it included within professional frameworks for simulation practitioners.3 Reporting guidelines developed by Cheng et al.4 for health care simulation research do not include documenting costs (admittedly, only a small part of economic evaluation). Finally, the skills required for economic evaluation are highly specialised. Maloney and Haines also explain that economic evaluations are framed quantitatively, whereas much health professions education research is qualitative.2 They offer additional reasons for the limited uptake or dissemination of economic evaluation: diverse stakeholder involvement complicates evaluation; frameworks for economic evaluation in education are nascent; and simulation centre staff may feel threatened if economic evaluations are unfavourable.2

Although these explanations may be understandable, competing demands for health care resources, including education, mean that they may no longer be justifiable. A challenge for medical education is to achieve high-value, low-cost educational processes.2 Transparency of cost and value, alongside measures of effectiveness, is required for true accountability. Informed choices about educational processes can only be achieved by determining the actual cost alongside the value of each pedagogy, and by considering all stakeholders. This is especially important for publicly funded universities and other higher education and training organisations. The global consensus on the social accountability of medical schools,5 together with professional association awards for its attainment,6 means that identifying and measuring cost and value are likely to assume increasingly important roles despite their complexity.

Cost and value analyses have the potential to determine the feasibility and sustainability of SBME. There is a noted lack of methodological rigour and consistency in such analyses in health professions education.7-10 With the need (why) for economic evaluation in SBME established, we join Lin et al. in focusing on the detail (how) and consider the value of SBME as our departure point for discussion.

Firstly, we tend to measure what we can, rather than what we should.

Examples of tightly defined skills and outcome measures in SBME are reported and frequently cited as evidence of the benefits of SBME, often with attendant economic evaluation.

This process is much harder with the combination of elements that contribute to the wider arc of safe, timely, accessible and efficient patient care. Placing greater emphasis on economic evaluation using methods that privilege quantitative measures increases the risk that these outcomes are pursued at the expense of less quantifiable ones.

This brings us to the question of how we put a value on some of the outcomes.

Time-based targets and infection rates are easy to measure, but it’s difficult to put a dollar value on better communication with patients and colleagues, improved team culture or a calmer resuscitation room. Experienced clinicians know these have value, despite the lack of a “price tag”.

Downstream, longitudinal outcomes are even harder to measure; they include the passing of knowledge and skills on to others and the application of skills acquired in one context in many others.

Further, what are the costs (or values) of unintended outcomes?

Learning in complex environments has surfaced as an area of investigation for health professions educators. For example, Fenwick and Dahlgren thoughtfully précis complexity theory, acknowledging its many traditions and diverse perspectives: “…most would agree that complexity theory examines how living phenomena (learning, for example) emerge in a web of relations that form among things, including both social and material things, such as bodies, instruments, desires, politics, settings and protocols. Such things do not come together in a linear cause-effect trajectory, as so many aspects of our curricula seem to presume, nor are they ordered together through top-down authority. Instead, they become combined through myriads of non-linear interactions that continually present novel possibilities and exercise multiple causal influences on what emerges.”11 Hence, SBME intervention, or its cessation, may have multiple outcomes, and not all are expected.

Finally, we consider whose responsibility it is to do this work.

Although many stakeholders would benefit from economic evaluation work in SBME (programme directors, policymakers, curriculum designers), few of those involved in SBME will have the necessary skills. As Lin et al.1 propose, economic evaluation is probably best conducted as a collaborative effort between educators and health economists, and, we would suggest, undertaken only when the likely benefit justifies the effort. A worst-case scenario would arise if an economic evaluation were to cost more than its value to the quality of medical education.

We’ve touched on why not, why, how, what and who relative to economic evaluation. It is an exciting time to be working in SBME and thinking critically about our practice. Economic evaluation is an important avenue, and its key challenge lies in identifying and enumerating value.

This article was written by Nestel D, Brazil V and Hay M, and published as: “You can’t put a value on that… Or can you? Economic evaluation in simulation-based medical education”. Med Educ 2018;52(2):139–41. DOI: 10.1111/medu.13505. PMID: 29356084.

References:

1 Lin Y, Cheng A, Hecker K, Grant V, Currie GR. Implementing economic evaluation in simulation-based medical education: challenges and opportunities. Med Educ 2018;52(2):150–60.

2 Maloney S, Haines T. Issues of cost-benefit and cost-effectiveness for simulation in health professions education. Adv Simul 2016;1:13.

3 Society for Simulation in Healthcare. SSH certification programs. http://ssih.org/certification. [Accessed 30 November 2017.]

4 Cheng A, Kessler D, Mackinnon R et al. Reporting guidelines for health care simulation research: extensions to the CONSORT and STROBE statements. Adv Simul 2016;1:25.

5 Global Consensus for Social Accountability of Medical Schools [homepage]. http://healthsocialaccountability.org/. [Accessed 21 October 2017.]

6 Association for Medical Education in Europe. ASPIRE-to-Excellence Awards. https://www.aspire-to-excellence.org/. [Accessed 21 October 2017.]

7 Walsh K, Levin H, Jaye P, Gazzard J. Cost analyses approaches in medical education: there are no simple solutions. Med Educ 2013;47(10):962–8.

8 Walsh K. Ethics of cost analyses in medical education. Perspect Med Educ 2013;2(5–6):317–20.

9 Zendejas B, Wang AT, Brydges R, Hamstra SJ, Cook DA. Cost: the missing outcome in simulation-based medical education research: a systematic review. Surgery 2013;153(2):160–76.

10 Brown CA, Belfield CR, Field SJ. Cost-effectiveness of continuing professional development in health care: a critical review of the evidence. BMJ 2002;324(7338):652–5.

11 Fenwick T, Abrandt Dahlgren M. Towards socio-material approaches in simulation-based education: lessons from complexity theory. Med Educ 2015;49(4):359–67.