Software for administering the National Cancer Institute's patient-reported outcomes version of the common terminology criteria for adverse events: Usability study
Publication Type: Journal Article


Authors: Schoen, M. W.; Basch, E.; Hudson, L. L.; Chung, A. E.; Mendoza, T. R.; Mitchell, S. A.; St. Germain, D.; Baumgartner, P.; Sit, L.; Rogak, L. J.; Shouery, M.; Shalley, E.; Reeve, B. B.; Fawzy, M. R.; Bhavsar, N. A.; Cleeland, C.; Schrag, D.; Dueck, A. C.; Abernethy, A. P.
Article Title: Software for administering the National Cancer Institute's patient-reported outcomes version of the common terminology criteria for adverse events: Usability study
Abstract: Background: The US National Cancer Institute (NCI) developed software to gather symptomatic adverse events directly from patients participating in clinical trials. The software administers surveys to patients using items from the Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE) through Web-based or automated telephone interfaces and facilitates the management of survey administration and the resultant data by professionals (clinicians and research associates). Objective: The purpose of this study was to iteratively evaluate and improve the usability of the PRO-CTCAE software. Methods: Heuristic evaluation of the software functionality was followed by semiscripted, think-aloud protocols in two consecutive rounds of usability testing among patients with cancer, clinicians, and research associates at 3 cancer centers. We conducted testing with patients both in clinics and at home (remotely) for both Web-based and telephone interfaces. Furthermore, we refined the software between rounds and retested. Results: Heuristic evaluation identified deviations from best practices across 10 standardized categories, which informed initial software improvement. Subsequently, we conducted user-based testing among 169 patients and 47 professionals. Software modifications between rounds addressed identified issues, including difficulty using radio buttons, absence of survey progress indicators, and login problems (for patients) as well as scheduling of patient surveys (for professionals). The System Usability Scale (SUS) score for the patient Web-based interface was 86 before and 82 after modifications (P=.22), whereas the mean task completion score improved from 4.47 to 4.58 (P=.39). Following modifications for professional users, the SUS scores improved from 71 to 75 (P=.47), and the mean task performance improved significantly from 4.02 to 4.40 (P=.001).
Conclusions: Software modifications, informed by rigorous assessment, rendered a usable system, which is currently used in multiple NCI-sponsored multicenter cancer clinical trials. © Martin W Schoen, Ethan Basch, Lori L Hudson, Arlene E Chung, Tito R Mendoza, Sandra A Mitchell, Diane St. Germain, Paul Baumgartner, Laura Sit, Lauren J Rogak, Marwan Shouery, Eve Shalley, Bryce B Reeve, Maria R Fawzy, Nrupen A Bhavsar, Charles Cleeland, Deborah Schrag, Amylou C Dueck, Amy P Abernethy.
Keywords: patient-reported outcomes; symptoms; adverse events; pro-ctcae; cancer clinical trials; usability
Journal Title: JMIR Human Factors
Volume: 5
Issue: 3
ISSN: 2292-9495
Publisher: JMIR Publications, Inc  
Date Published: 2018-07-01
Start Page: e10070
Language: English
DOI: 10.2196/10070
PROVIDER: scopus
PMCID: PMC6066634
PUBMED: 30012546
Notes: Conference Paper -- Export Date: 1 October 2018 -- Source: Scopus
MSK Authors:
  1. Ethan Martin Basch
  2. Laura S Sit
  3. Lauren Jayne Rogak
  4. Marwan Shouery