Abstract
As an increasing number of patients seek medical information online, it is crucial to bring expert-vetted information to them. Multiple expert-vetted online resources are available and valued for the accuracy and authority of their medical content. However, it is not clear which resource best meets patients' information needs. Using a collection of questions posted by patients with diabetes on an online forum, we manually evaluated three widely used expert-vetted online resources (WebMD, MedlinePlus, and UpToDate) in terms of content coverage and the time required to find answers. The results indicated that WebMD had slightly better content coverage and required less time to find answers. Leveraging Natural Language Processing (NLP) techniques and a clinical terminology resource, the Unified Medical Language System (UMLS), we further demonstrated that WebMD had a higher mapping rate of clinical concepts compared to the other resources.
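The abstract does not specify the NLP pipeline used for concept mapping. As one plausible illustration only, the sketch below assumes QuickUMLS as the UMLS concept extractor and assumes that "mapping rate" is the fraction of concepts found in patient questions that are also covered by a resource's content; the data path and example texts are hypothetical.

```python
# Minimal sketch of UMLS concept extraction and an assumed "mapping rate" metric.
# Assumptions (not stated in the abstract): QuickUMLS as the extractor, and
# mapping rate = fraction of question concepts also present in resource content.
from quickumls import QuickUMLS

# Path to a local QuickUMLS installation built from a licensed UMLS release
# (hypothetical path).
matcher = QuickUMLS("/path/to/quickumls_data", threshold=0.8)

def extract_cuis(text: str) -> set[str]:
    """Return the set of UMLS Concept Unique Identifiers (CUIs) found in text."""
    matches = matcher.match(text, best_match=True, ignore_syntax=False)
    return {candidate["cui"] for group in matches for candidate in group}

def mapping_rate(question_texts: list[str], resource_text: str) -> float:
    """Fraction of question concepts that also appear in the resource content."""
    question_cuis = set().union(*(extract_cuis(q) for q in question_texts))
    resource_cuis = extract_cuis(resource_text)
    if not question_cuis:
        return 0.0
    return len(question_cuis & resource_cuis) / len(question_cuis)

# Illustrative toy inputs only.
questions = ["What is a normal A1C level for type 2 diabetes?",
             "Can metformin cause hypoglycemia?"]
resource_page = "Metformin rarely causes hypoglycemia; a typical A1C target is below 7%."
print(f"Mapping rate: {mapping_rate(questions, resource_page):.2f}")
```

In practice, the same comparison would be repeated over each resource's retrieved content (WebMD, MedlinePlus, UpToDate) so the resulting rates can be compared across resources.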