
Stated-Preference Survey Design and Testing in Health Applications

  • Practical Application
  • The Patient - Patient-Centered Outcomes Research

Abstract

Following the conceptualization of a well-formulated and relevant research question, selection of an appropriate stated-preference method, and consideration of related methodological issues, researchers are tasked with developing a survey instrument. A major goal of designing a stated-preference survey for health applications is to elicit high-quality data that reflect thoughtful responses from well-informed respondents. Achieving this goal requires researchers to design engaging surveys that maximize response rates, minimize hypothetical bias, and collect all the information needed to answer the research question. Designing such a survey requires researchers to make numerous interrelated decisions that build upon the decision context, selection of attributes, and experimental design. These decisions include the setting(s) and study population in which the survey will be administered, the format and mode of administration, and the types of contextual information to collect. Development of a survey is an iterative process in which feedback from respondents should be collected and documented through qualitative pre-test interviews and pilot testing. This paper describes important issues to consider across all major steps required to design and test a stated-choice survey to elicit patient preferences for health preference research.
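The abstract's reference to attributes, levels, and an experimental design can be illustrated with a small sketch. The Python snippet below is a minimal, hypothetical illustration only: the attributes, levels, and random pairing of profiles into two-alternative choice tasks are assumptions introduced here, not the article's recommended approach, which relies on formal experimental-design methods.

    import itertools
    import random

    # Hypothetical attributes and levels for a stated-choice survey;
    # purely illustrative, not drawn from the article.
    ATTRIBUTES = {
        "effectiveness": ["50% symptom reduction", "70% symptom reduction"],
        "risk of side effects": ["1 in 1,000", "1 in 100"],
        "mode of administration": ["oral tablet", "monthly injection"],
    }

    # Full factorial: every combination of attribute levels defines one profile.
    profiles = [
        dict(zip(ATTRIBUTES.keys(), combination))
        for combination in itertools.product(*ATTRIBUTES.values())
    ]

    def build_choice_tasks(profiles, n_tasks, seed=1):
        """Randomly pair two distinct profiles into each choice task.

        A real study would typically replace this random pairing with an
        orthogonal or statistically efficient experimental design.
        """
        rng = random.Random(seed)
        return [
            dict(zip(("Option A", "Option B"), rng.sample(profiles, 2)))
            for _ in range(n_tasks)
        ]

    if __name__ == "__main__":
        for i, task in enumerate(build_choice_tasks(profiles, n_tasks=3), start=1):
            print(f"Choice task {i}")
            for label, profile in task.items():
                print(f"  {label}: {profile}")

In practice, tasks generated this way would then be refined through the qualitative pre-test interviews and pilot testing the abstract describes before the survey is fielded.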



Author information


Corresponding author

Correspondence to Ellen M. Janssen.

Ethics declarations

Funding

No financial support was received for this manuscript.

Conflict of interest

Dr. Janssen is an employee of Johnson & Johnson and holds stock in Johnson & Johnson. Dr. Reed maintains a detailed listing of research funding and financial disclosures at: https://scholars.duke.edu/person/shelby.reed. Dr. Marshall reports personal fees from Analytica and Novartis and expenses from Illumina. Dr. Veldwijk has nothing to declare.

Author contributions

All authors were fully involved in the study design; the collection, analysis, and interpretation of data; and the writing of this article. All authors approved the manuscript and agreed with its submission.

Data availability statement

Not applicable.

Supplementary Information

Below are the links to the electronic supplementary material.

Supplementary file 1 (DOCX 24 KB)

Supplementary file 2 (DOCX 19 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Marshall, D.A., Veldwijk, J., Janssen, E.M. et al. Stated-Preference Survey Design and Testing in Health Applications. Patient (2024). https://doi.org/10.1007/s40271-023-00671-6

