This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
Over the last two decades, patient review websites have emerged as essential online platforms for doctor ratings and reviews. Recent studies have suggested the significance of such websites as a data source for patients choosing doctors, for healthcare providers learning and improving from patient feedback, and for fostering a culture of trust and transparency between patients and healthcare providers. However, compared with other medical specialties, studies of online patient reviews that focus on dentists in the United States remain absent.
This study sought to understand to what extent online patient reviews can provide performance feedback that reflects dental care quality and patient experience.
Using mixed informatics methods incorporating statistics, natural language processing, and domain expert evaluation, we analyzed the online patient reviews of 204,751 dentists extracted from HealthGrades with two specific aims. First, we examined the associations between patient ratings and a variety of dentist characteristics. Second, we identified topics from patient reviews that can be mapped to the national assessment of dental patient experience measured by the Patient Experience Measures from the Consumer Assessment of Healthcare Providers and Systems (CAHPS) Dental Plan Survey.
Higher ratings were associated with female dentists (
These findings suggest that online patient reviews could be used as a data source for understanding the patient experience and healthcare quality in dentistry.
Over the last few years, patient review websites have gained increasing interest among health consumers, academic communities, and healthcare providers [
Despite debates on whether patient-generated reviews would be useful to improve healthcare quality [
Online reviews are also rich in data on the patient experience, a critical measure of healthcare quality [
Online patient reviews can inform better decision-making among patients and can be used to improve healthcare quality. As such, many studies on online patient reviews in a variety of medical specialties have been reported [
Our second aim was to understand to what extent online reviews can inform assessments of the patient experience by identifying semantic mentions of the patient experience in patient reviews. Dental patient experience is traditionally assessed with the CAHPS (Consumer Assessment of Healthcare Providers and Systems) Dental Plan Survey, administered by CMS [
Publicly accessible online review data were obtained from HealthGrades, a well-known patient review website in the United States. Among the many available sites, we focused on HealthGrades for two reasons. First, HealthGrades is widely used by patients who receive healthcare services across the full range of medical specialties in the United States. Second, HealthGrades provides a well-organized sitemap structure that facilitates data extraction. We analyzed data from a single review site because the data structures and measures of dentist demographics and performance vary across sites, which would hinder data consolidation. In addition, data from multiple patient review sites would have little impact on the representation and generalizability of this study because an active dentist typically has profiles on all popular patient review sites.
Online reviews for 204,751 dentists were extracted. This census approximates but does not fully match the workforce statistics (199,486 working dentists as of 2018) reported by the American Dental Association (ADA) [
There were 41 dental service listings in HealthGrades. We categorized these services into the 10 dental specialties defined by the ADA, general dentistry, and others (ie, unidentified and miscellaneous), resulting in 8 specialties (ie, dental anesthesiology, endodontics, oral and maxillofacial pathology, oral and maxillofacial surgery, orthodontics and dentofacial orthopedics, pediatric dentistry, periodontics, and prosthodontics), general dentistry, and others for downstream analyses. Public health dentistry and oral and maxillofacial radiology were excluded because each had at most one entry. The data extraction was completed in September 2019. The study was identified as a nonhuman study by the Institutional Review Board of the University of South Carolina.
We employed statistical analyses to assess the associations between ratings and dentists’ characteristics using the R Project for Statistical Computing. We used descriptive statistics to calculate proportions and mean distributions. Based on reported studies of online reviews in general medical specialties, we hypothesized that (1) female dentists, (2) young dentists, and (3) short wait times would be associated with higher overall ratings. We also hypothesized that specialties are associated with overall ratings. An independent-sample t test was used to test whether ratings differ by gender. Analysis of variance (ANOVA) was used to test whether ratings differ by specialty, age, and wait time, respectively. We used Hedges g to measure effect sizes.
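As a minimal sketch of the effect-size computation (the study itself used R; the two rating samples below are hypothetical, not HealthGrades data), Hedges g is a standardized mean difference with a small-sample bias correction:

```python
import math
from statistics import mean, variance

def hedges_g(a, b):
    """Hedges g: standardized mean difference with small-sample correction."""
    na, nb = len(a), len(b)
    # pooled standard deviation (variance() uses the n-1 denominator)
    sp = math.sqrt(((na - 1) * variance(a) + (nb - 1) * variance(b))
                   / (na + nb - 2))
    d = (mean(a) - mean(b)) / sp          # Cohen d
    j = 1 - 3 / (4 * (na + nb) - 9)       # bias-correction factor
    return d * j

# hypothetical 1-5 star rating samples for two dentist groups
group_a = [5, 5, 4, 5, 4, 5, 4, 4]
group_b = [4, 5, 4, 3, 4, 4, 5, 3]
print(round(hedges_g(group_a, group_b), 3))  # → 0.722
```

By convention, a g near 0.2 is considered a small effect, which is the scale of the gender difference reported in the Results.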
Semi-automated natural language processing was used to identify concepts related to patient experience, requiring only limited human labor.
We then calculated bigram and trigram collocations. Collocations are habitual expressions of multiple words. In this study, we ranked collocations using the “likelihood ratio” method [
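The likelihood ratio ranking can be illustrated with a dependency-free Python sketch of Dunning’s log-likelihood ratio for bigrams (NLTK’s collocation finders provide the same scorer; the toy token list below is illustrative, not actual review text):

```python
import math
from collections import Counter

def _ll(k, n, p):
    # binomial log-likelihood, with clamping to avoid log(0)
    p = min(max(p, 1e-12), 1 - 1e-12)
    return k * math.log(p) + (n - k) * math.log(1 - p)

def likelihood_ratio(c_ab, c_a, c_b, n):
    """Dunning log-likelihood ratio for the bigram (a, b)."""
    p = c_b / n                      # P(b) under independence
    p1 = c_ab / c_a                  # P(b | a)
    p2 = (c_b - c_ab) / (n - c_a)    # P(b | not a)
    return 2 * (_ll(c_ab, c_a, p1) + _ll(c_b - c_ab, n - c_a, p2)
                - _ll(c_ab, c_a, p) - _ll(c_b - c_ab, n - c_a, p))

def rank_bigrams(tokens, top=3):
    """Score every adjacent word pair and return the top collocations."""
    n = len(tokens)
    uni = Counter(tokens)
    bi = Counter(zip(tokens, tokens[1:]))
    scores = {ab: likelihood_ratio(c, uni[ab[0]], uni[ab[1]], n)
              for ab, c in bi.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top]

# toy review text (illustrative, not actual HealthGrades data)
tokens = ("front desk staff was rude but the front desk called back and the "
          "front desk staff apologized").split()
print(rank_bigrams(tokens))  # ('front', 'desk') ranks first
```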
We observed a number of collocations that were irrelevant to concepts of dental care and patient experience (eg, “phone call” and “many years ago”) but were still ranked in the top 200 by likelihood ratio. Therefore, two raters (YL and CL) independently picked collocations related to patient experience using a 4-point Likert scale (“definitely relevant,” “somewhat relevant,” “somewhat irrelevant,” “definitely irrelevant”). Inter-rater reliability was assessed using Cohen kappa. The two raters then discussed the collocations that received contrary opinions (relevant vs irrelevant) until a consensus was reached.
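Agreement between two raters of this kind can be quantified with Cohen kappa. A minimal unweighted sketch follows (the study reports an equally weighted kappa; the relevance labels below are hypothetical):

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Unweighted Cohen kappa for two raters' categorical labels."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n             # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[lab] * c2[lab] for lab in c1 | c2) / n ** 2  # chance agreement
    return (po - pe) / (1 - pe)

# hypothetical relevance judgments: R = relevant, I = irrelevant
rater1 = ["R", "R", "I", "R", "I", "R", "R", "I", "R", "R"]
rater2 = ["R", "R", "I", "R", "I", "R", "I", "I", "R", "R"]
print(round(cohen_kappa(rater1, rater2), 2))  # → 0.78
```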
Next, we mapped the patient experience-related concepts onto the 17 composite measures in the Patient Experience Measures from the CAHPS Dental Plan Survey.
Diagram of text mining procedures.
Composite measures from the Patient Experience Measures from the CAHPS Dental Plan Survey.
Item | Composite measure

Q6 | How often did your regular dentist explain things in a way that was easy to understand?
Q7 | How often did your regular dentist listen carefully to you?
Q8 | How often did your regular dentist treat you with courtesy and respect?
Q9 | How often did your regular dentist spend enough time with you?
Q11 | How often did the dentists or dental staff do everything they could to help you feel as comfortable as possible during your dental work?
Q12 | How often did the dentists or dental staff explain what they were doing while treating you?

Q13 | How often were your dental appointments as soon as you wanted?
Q15 | If you tried to get an appointment for yourself with a dentist who specializes in a particular type of dental care (such as root canals or gum disease) in the last 12 months, how often did you get an appointment as soon as you wanted?
Q16 | How often did you have to spend more than 15 minutes in the waiting room before you saw someone for your appointment?
Q17 | If you had to spend more than 15 minutes in the waiting room before you saw someone for your appointment, how often did someone tell you why there was a delay or how long the delay would be?
Q14 | If you needed to see a dentist right away because of a dental emergency in the last 12 months, did you get to see a dentist as soon as you wanted?

Q19 | How often did your dental plan cover all of the services you thought were covered?
Q22 | How often did the 800 number, written materials, or website provide the information you wanted?
Q27 | How often did your dental plan’s customer service give you the information or help you needed?
Q28 | How often did your dental plan’s customer service staff treat you with courtesy and respect?
Q20 | Did your dental plan cover what you and your family needed to get done?
Q24 | Did this information (from your dental plan) help you find a dentist you were happy with?
Dentist demographics.
Characteristic | Count | Proportion (%)

Gender
Female | 58,309 | 28.48
Male | 146,044 | 71.33
Unknown | 398 | 0.19

Age (years)
Under 30 | 1585 | 0.77
30-39 | 28,736 | 14.03
40-49 | 36,715 | 17.93
50-59 | 29,006 | 14.17
60-69 | 29,585 | 14.45
70-79 | 11,716 | 5.72
80-89 | 1826 | 0.89
Over 89 | 157 | 0.08
Unknown | 65,425 | 31.95

Specialty
Dental Anesthesiology | 338 | 0.17
Endodontics | 7803 | 3.81
General Dentistry | 160,831 | 78.55
Oral and Maxillofacial Pathology | 166 | 0.08
Oral and Maxillofacial Radiology | 1 | 0.00
Oral and Maxillofacial Surgery | 975 | 0.48
Orthodontics and Dentofacial Orthopedics | 18,891 | 9.23
Pediatric Dentistry | 9743 | 4.76
Periodontics | 1541 | 0.75
Prosthodontics | 4249 | 2.08
Other | 213 | 0.10
Unknown | 0 | 0

Language
Spanish | 3972 | 1.94
Hindi | 510 | 0.25
Arabic | 471 | 0.23
French | 467 | 0.23
Chinese | 440 | 0.21
Russian | 429 | 0.21
Farsi | 338 | 0.17
Vietnamese | 305 | 0.15
Korean | 304 | 0.15
Portuguese | 272 | 0.13
Unknown | 188,025 | 91.83

Overall rating
1-1.9 | 1344 | 0.66
2-2.9 | 2768 | 1.35
3-3.9 | 16,431 | 8.02
4-4.9 | 61,520 | 30.05
5 | 72,620 | 35.47
Unknown | 50,068 | 24.45

Wait time
Under 10 minutes | 98,104 | 47.91
10-15 minutes | 46,347 | 22.64
16-30 minutes | 4395 | 2.15
31-45 minutes | 880 | 0.43
Over 45 minutes | 335 | 0.16
Unknown | 54,690 | 26.71
ANOVA showed no significant effect of specialties on ratings (
We also found a significant effect of age on ratings (
There was also a significant effect of wait time on ratings (F(4, 150055)=10417.77,
Overall ratings by gender.
Overall ratings by age group.
Overall ratings by wait time.
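The one-way ANOVA behind these group comparisons (run in R in the study) can be sketched in Python; the rating groups below are hypothetical, not the actual wait-time data:

```python
from statistics import mean

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    means = [mean(g) for g in groups]
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical ratings for three wait-time groups (short, medium, long)
groups = [[4, 5, 5, 4], [3, 4, 3, 4], [2, 3, 2, 3]]
print(one_way_anova_f(groups))  # F(2, 9) = 12.0
```

A significant F statistic only says that at least one group mean differs; a post hoc test such as Tukey’s HSD is then needed to identify which pairs differ.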
Cohen kappa (equally weighted) was 0.95 between the two raters who independently identified patient experience-related words and phrases from the 2000 automatically extracted collocations. After discussion, they identified 29 words and phrases; two other reviewers, both dental experts, then independently mapped these 29 words and phrases onto the 17 composite measures in the Patient Experience Measures from the CAHPS Dental Plan Survey.
Mapping of Patient Experience Measures from the CAHPS Dental Plan Survey and the words and phrases extracted from patient reviews.
Over the last few years, researchers have begun a systematic analysis of online patient reviews. In the United States, several empirical studies investigating online reviews of general healthcare services and specialties are well documented, but such studies have not been performed in dentistry [
We found several factors associated with overall dentist ratings. In particular, female dentists were rated slightly higher than their male counterparts, although this should be interpreted with the understanding that the effect size is small (
The proliferation of patient review websites represents a wealth of patient-experience data, but these data remain understudied. In this study, we identified unstructured descriptions of patient experience using a method integrating quantitative text mining and qualitative human evaluation. Our method recognizes the role of automated textual data analytics in harnessing information from online reviews, consistent with other recent studies [
Our study has the following limitations. First, data from HealthGrades, like data from any other patient review website, may be incomplete and biased. Not all dentist profiles have been claimed. Although authentication is required for dentist profile information, inaccuracies may still exist. The overall ratings may be biased because patients who are happy with health services are more likely to leave ratings and reviews [
Despite these limitations, this study analyzed an extensive dataset and found associations between dentist characteristics and online patient ratings. The thematic analysis also identified themes of patient experience similar to those of CAHPS, suggesting that online patient reviews can inform quality improvement in dental care.
This study demonstrated that online patient reviews are a potential data source that can supply rich performance data from the patient perspective, from which assessments of dental care quality and the patient experience are feasible. We also identified several factors associated with dentists’ overall ratings, which could inform dental care quality improvement.
ADA: American Dental Association
ANOVA: Analysis of Variance
CAHPS: Consumer Assessment of Healthcare Providers and Systems
CMS: Centers for Medicare and Medicaid Services
HSD: Tukey’s Honestly Significant Difference
We thank Dr Jenny S Tjahjono for the contribution to the expert evaluation. This study is supported by a seed grant from the University of South Carolina.
None declared.