Briggs, A.H.; Weinstein, M.C.; Fenwick, E.A.L.; Karnon, J.; Sculpher, M.J.; Paltiel, A.D. (2012)
Publisher: Sage
Languages: English
Types: Article
A model's purpose is to inform medical decisions and health care resource allocation. Modelers employ quantitative methods to structure the clinical, epidemiological, and economic evidence base and to gain qualitative insight that assists decision makers. From a policy perspective, the value of a model-based analysis lies not simply in its ability to generate a precise point estimate for a specific outcome but also in the systematic examination and responsible reporting of the uncertainty surrounding this outcome and the ultimate decision being addressed. Different concepts relating to uncertainty in decision modeling are explored. Stochastic (first-order) uncertainty is distinguished both from parameter (second-order) uncertainty and from heterogeneity, with structural uncertainty relating to the model itself forming a further level of uncertainty to consider. The article argues that the estimation of point estimates and uncertainty in parameters is part of a single process, and traces the link from parameter uncertainty through to decision uncertainty and its relationship to value-of-information analysis. The article also makes extensive recommendations on the reporting of uncertainty, covering both deterministic sensitivity analysis techniques and probabilistic methods. Expected value of perfect information is argued to be the most appropriate presentational technique, alongside cost-effectiveness acceptability curves, for representing decision uncertainty from probabilistic analysis.
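The probabilistic quantities the abstract recommends for reporting decision uncertainty can be illustrated with a minimal Monte Carlo sketch. This is not the article's model: the two-strategy comparison, the parameter distributions, and the willingness-to-pay threshold below are all hypothetical assumptions standing in for a real decision model. The sketch shows how parameter (second-order) uncertainty propagates to a cost-effectiveness acceptability curve (CEAC) point and to the expected value of perfect information (EVPI).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # parameter (second-order) samples

# Hypothetical two-strategy model: per-patient costs and QALYs are drawn
# from assumed distributions standing in for a probabilistic decision model.
cost_a = rng.gamma(shape=100.0, scale=10.0, size=n)   # mean cost ~ 1000
qaly_a = rng.beta(70, 30, size=n) * 10                # mean QALYs ~ 7.0
cost_b = rng.gamma(shape=100.0, scale=15.0, size=n)   # mean cost ~ 1500
qaly_b = rng.beta(75, 25, size=n) * 10                # mean QALYs ~ 7.5

wtp = 200.0  # hypothetical willingness to pay per QALY

# Net monetary benefit of each strategy for every parameter sample
nmb = np.column_stack([wtp * qaly_a - cost_a,
                       wtp * qaly_b - cost_b])

# CEAC point at this threshold: probability each strategy is optimal
ceac = np.bincount(nmb.argmax(axis=1), minlength=2) / n

# EVPI: expected gain from resolving parameter uncertainty before deciding,
# i.e. E[max NMB] minus the max of the expected NMBs (always >= 0)
evpi = nmb.max(axis=1).mean() - nmb.mean(axis=0).max()

print(f"P(optimal) at WTP={wtp:.0f}: {ceac}")
print(f"EVPI per patient: {evpi:.2f}")
```

Sweeping `wtp` over a grid and recording `ceac` at each threshold would trace out the full acceptability curve the article discusses; EVPI plotted over the same grid gives the decision-uncertainty summary it argues for.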
