Despite the importance of high-quality and patient-centered substance use disorder treatment, there are no standardized ratings of specialized drug treatment facilities and their services. Online platforms offer insights into potential drivers of high and low patient experience.
We sought to analyze publicly available online review content of specialized drug treatment facilities and identify themes within high and low ratings.
This was a retrospective analysis of online ratings and reviews of specialized drug treatment facilities in Pennsylvania listed within the 2016 National Directory of Drug and Alcohol Abuse Treatment Facilities. Latent Dirichlet Allocation, a machine learning approach to narrative text, was used to identify themes within reviews. Differential Language Analysis was then used to measure correlations between themes and star ratings.
Online reviews of Pennsylvania’s specialized drug treatment facilities posted to Google and Yelp (July 2010–August 2018).
A total of 7823 online ratings were posted over 8 years. The distribution was bimodal (43% 5-star and 34% 1-star). The average weighted rating of a facility was 3.3 stars. Online themes correlated with 5-star ratings were the following: focus on recovery (r = 0.53), helpfulness of staff (r = 0.43), compassionate care (r = 0.37), experienced a life-changing moment (r = 0.32), and staff professionalism (r = 0.29). Themes correlated with a 1-star rating were waiting time (r = 0.41), poor accommodations (r = 0.26), poor phone communication (r = 0.24), medications given (r = 0.24), and appointment availability (r = 0.23). Themes derived from review content were similar to 9 of the 14 facility-level services highlighted by the Substance Abuse and Mental Health Services Administration’s National Survey of Substance Abuse Treatment Services.
Individuals are sharing their ratings and reviews of specialized drug treatment facilities on online platforms. Organically derived reviews of the patient experience, captured by online platforms, reveal potential drivers of high and low ratings. These represent additional areas of focus which can inform patient-centered quality metrics for specialized drug treatment facilities.
Over 20 million US adults (8.4% of the population) have substance use disorder (SUD), and drug overdose has overtaken motor vehicle crashes as the leading cause of accidental death.1 While there are many evidence-based SUD programs,2 the overall quality of treatment within facilities treating SUD in the USA has not been well defined or measured. Developing effective quality measures requires understanding the drivers of positive and negative patient experiences within SUD programs. These qualitative narratives can provide an important, patient-centered component in the development of quality metrics for SUD programs.
Digital health platforms offer new opportunities to capture and share these experiences.3 Patients, and their support networks, may begin an initial search for treatment options via the web. When researching a treatment facility’s location or services, they often will encounter an online review or rating provided by common platforms such as Google or Yelp.4, 5
In health domains, these online reviews correlate strongly with reviews obtained through more systematic surveys, such as the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS).6 Star-based ratings can also drive consumer choice for healthcare. For example, after a star-based rating system of nursing homes was released, 1-star facilities typically lost 8% of their market share and 5-star facilities gained more than 6% of their market share.7,8,9 Online star ratings exist for specialized drug treatment facilities, but little is known about the distribution, content, and implications of these reviews. Online reviews may provide a unique opportunity to help guide agencies by identifying common themes and areas of focus for SUD programs.
We sought to describe and analyze online ratings and reviews of specialized drug treatment facilities (SDTFs). SDTFs are facilities that provide coordinated and specialized care, inpatient or outpatient, for individuals with substance use disorder. We focused on online reviews of SDTFs in Pennsylvania, which has been a focal point for addressing opioid use disorder (OUD) and overdose-related deaths. In 2016, Pennsylvania’s death rate was 18.5 deaths per 100,000 persons, compared with the national rate of 13.3 deaths per 100,000; the state remains in the top 25% for overall death rates.10 Moreover, among the nation’s 44 counties with at least 1 million residents, Philadelphia County and Allegheny County (which includes Pittsburgh) had the two highest rates of overdose deaths.11 We used machine learning to automate the identification of patient-centered themes within online ratings. We compared these themes from online reviews with existing surveys assessing services provided by individual facilities, collected by the Substance Abuse and Mental Health Services Administration (SAMHSA).
This was a retrospective analysis of all online reviews and ratings posted to Google and Yelp of SDTFs within Pennsylvania. No reviews were excluded. All Pennsylvania SDTFs were identified using the 2016 National Directory of Drug and Alcohol Abuse Treatment Facility Record published by SAMHSA.12 If a facility exists in Pennsylvania but is not formally registered with SAMHSA, then this facility was not included in the analysis. This study was considered exempt by the University of Pennsylvania Institutional Review Board.
Identification of Review Themes
We identified the initial set of Pennsylvania’s SDTFs from the 2016 SAMHSA National Directory of Drug and Alcohol Abuse Treatment Facilities and then identified the Yelp and Google ratings and reviews posted for each facility. Yelp is a dedicated rating and review platform; we used Yelp’s Application Program Interface (API) to identify Yelp13 reviews of SDTFs across Pennsylvania from July 2010 to August 2018. Google, primarily a search engine, incorporates star ratings and reviews into the search results users see. Google users may enter a rating and review directly through the website, where they are prompted to provide a star rating and an optional narrative review. We manually searched the Google Places pages of Pennsylvania SDTFs to identify Google ratings and reviews, when available, from July 2010 to August 2018.
Yelp and Google each use a proprietary algorithm that considers measures of quality, reliability, and activity to estimate the authenticity of each review.14 For example, such an algorithm would prevent a user from leaving multiple reviews on a business page under different usernames. Additionally, each platform states that businesses that buy advertising space on the site are unable to influence which reviews are recommended.
We used Latent Dirichlet Allocation (LDA) to identify the themes expressed within the online reviews. LDA is an automated machine learning method that identifies co-occurrences of words in narrative text to generate topics.15 It has been used in several studies examining online reviews of hospitals3, 4, 16 and provides a computer-based way to analyze and identify themes in large data sets of narrative text, such as online reviews or Twitter posts. In this study, we focused on the extremes of the rating scale (1-star and 5-star) to maximize the possibility of identifying potential drivers of high or low ratings. We then applied LDA to identify topics: the process analyzes low- or high-star reviews, searches for words that co-occur within a review and recur across multiple reviews, and produces an a priori number of word groups that can then be labeled by the research team. Three study team members (AA, VW, RM) independently labeled topics that were found to be similar. Of note, as the number of groups becomes smaller, granularity is lost; as the number becomes too large, distinctions across groups blur. The implementation of LDA was provided by the MALLET package.15
Association of Review Themes and Review Star Ratings
We used Differential Language Analysis (DLA)17 to determine which themes were most correlated (Pearson’s r) with the online ratings. DLA is an open-vocabulary method that calculates the topic distribution of each review and determines which topics are most correlated with review ratings. It builds on the topics generated by LDA: LDA provides the topics identified within 1-star and 5-star reviews as described above, and DLA then determines which of those topics are correlated with 1- or 5-star reviews, producing for each topic a correlation coefficient between −1 and 1. Due to the bimodal nature of online reviews,13, 14 we first calculated correlations between themes and a binary variable indicating whether the review was a 1-star review, and then calculated correlations between themes and a binary variable indicating whether the review was a 5-star review.
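A minimal sketch of this correlation step follows, assuming hypothetical per-review topic proportions and star ratings (invented numbers, not study data): each review's usage of a topic is correlated against a binary 5-star indicator.

```python
# Sketch of correlating a topic's usage with a binary star indicator.
# Topic proportions and star ratings below are invented for illustration.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-review proportions of one topic (e.g., "focus on recovery")
topic_usage = np.array([0.80, 0.70, 0.10, 0.05, 0.60, 0.15])
stars       = np.array([5,    5,    1,    1,    5,    1])

# Binary indicator: is this a 5-star review?
is_five_star = (stars == 5).astype(float)

r, p = pearsonr(topic_usage, is_five_star)
print(f"r = {r:.2f}")  # positive r: the topic appears more in 5-star reviews
```

Repeating this for a 1-star indicator yields the negative-experience correlations; topics are then ranked by r, as in Table 1.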
Comparing Online Review Themes with Systematic Survey Data
We calculated correlation coefficients between the themes derived from high or low online reviews and service data from the 2017 National Survey of Substance Abuse Treatment Services (NSSATS), administered annually by SAMHSA. The NSSATS assesses individual US facilities in 3 domains: (1) characteristics of individual facilities and treatment types provided; (2) client counts; and (3) information such as certification or accreditation. The objective of this analysis was to identify if and how patient-generated themes from online reviews correlated with the current standard approach of assessing facilities’ capabilities through the NSSATS survey. Prior work has demonstrated gaps between the focus of standardized survey domains (e.g., the Hospital Consumer Assessment of Healthcare Providers and Systems) and themes elucidated from patient review data on platforms such as Yelp or Google.3, 14 Given that there is no current standard for SDTFs, we chose to compare online patient themes with the NSSATS.
There are 539 SDTFs in Pennsylvania listed within the SAMHSA directory; 485 of these facilities (90%) had a rating or review posted during the study period. These 485 sites had 7635 Google ratings and 188 Yelp ratings; 5312 (68%) had narrative reviews. The mean number of ratings per facility was 15.3 (SD 24.0, median 7, IQR [3–16]) and the mean word count within a review was 72.8 words (SD 82.28, median 42, IQR [2–58]).
The distribution of ratings was bimodal: 43% 5-star and 34% 1-star (Fig. 1). The simple mean star rating was 3.24 stars (median 4.0) across all facilities in Pennsylvania. When weighting for varying numbers of reviews per facility, using a Bayesian expected average, the estimated star rating at the facility level was 3.28 stars (median 3.29). Figure 2 displays the distribution of weighted facility ratings. Of the 7823 ratings, 68% (5313) had an accompanying review. The average star rating of those with text reviews was 2.9 stars (SD 1.8, median 4). The average star rating of the 32% (2512) without text reviews was 3.8 stars (SD 1.5, median 5). The count of ratings varied across facilities, and facilities with higher or lower absolute counts of reviews displayed a range of average star ratings. Figure 3 displays groups of facilities with similar counts of ratings and their weighted average ratings; as the number of reviews per facility increases, average star ratings converge toward 3 stars.
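The Bayesian expected average used to weight facility ratings can be sketched as follows; the paper does not specify its prior, so the prior mean and prior weight here are illustrative assumptions.

```python
# Sketch of a Bayesian expected average for facility ratings.
# The prior mean and prior weight are illustrative assumptions,
# not the study's actual parameters.
def bayesian_average(ratings, prior_mean=3.24, prior_weight=10):
    """Shrink a facility's mean rating toward the overall mean.

    Facilities with few reviews are pulled strongly toward the prior;
    heavily reviewed facilities are dominated by their own data.
    """
    n = len(ratings)
    return (prior_weight * prior_mean + sum(ratings)) / (prior_weight + n)

# A facility with two 5-star ratings stays near the overall mean...
few = bayesian_average([5, 5])
# ...while one with many 5-star ratings approaches 5 stars.
many = bayesian_average([5] * 200)
print(round(few, 2), round(many, 2))
```

This kind of shrinkage explains why, in Figure 3, average ratings converge as the count of reviews per facility grows.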
Online Review Ratings and Themes
We identified five distinct themes correlated with positive and negative narrative reviews. Themes most positively correlated with 5-star ratings were focus on recovery (r = 0.53), helpfulness of staff (r = 0.43), compassionate care (r = 0.37), experienced a life-changing moment (r = 0.32), and professionalism (r = 0.29). Themes most positively correlated with 1-star reviews were wait time (r = 0.41), poor accommodations (r = 0.26), poor phone communication (r = 0.24), medications offered (r = 0.24), and appointment availability (r = 0.23). Table 1 displays themes, correlation coefficients, and example narrative text from online reviews.
Comparing Themes from Online Reviews with SAMHSA Services
Themes revealed from online reviews and service codes defined by SAMHSA revealed little overlap. While some of the SAMHSA service codes were identified within online reviews, the majority of themes discovered through online reviews were not included within the NSSATS or within SAMHSA’s service codes. Of the 162 listed services, we found an average of 49 services offered per facility (median 46 services).
Of the 10 themes most correlated with an online review, 3 aligned with the 14 SAMHSA facility-level categories and 12 service codes (7% of 162). The 7 other online themes are noted in Table 1. In addition, we compared 1-star and 5-star Google reviews with NSSATS survey data. Table 2 shows correlations between services provided at the facility level across Pennsylvania. The three services most correlated with 1-star ratings were psychiatric emergency walk-in services (r = − 0.17), state-funded SDTFs (r = − 0.16), and integrated primary care service (r = − 0.15). Five-star correlations included outpatient methadone/buprenorphine or naltrexone (r = 0.15); cash or self-pay SDTFs (r = 0.12); and lesbian, gay, bisexual, or transgender (LGBT) clients (r = 0.12). The unique online themes were also more strongly correlated with low or high online ratings than were the existing NSSATS service codes.
Online star ratings exist for specialized drug treatment facilities. This study provides a state-level analysis of the distribution, content, and implications of these reviews. Online reviews provide a patient-centered opportunity to identify themes of focus for quality metrics for SUD programs.
This study has three main findings. First, individuals are posting online reviews and contributing to online ratings of specialized drug treatment facilities using multiple platforms. Over an 8-year period, we identified 7823 online reviews covering over 90% of Pennsylvania’s SDTFs. This study is the first to investigate a large volume of online reviews of SDTFs in order to understand the distribution of ratings and the content within text reviews. The initial analysis reveals a bimodal distribution of low and high star ratings, a range of ratings per facility and differences in weighted averages of star ratings. This machine learning approach provides a means of analyzing large amounts of rich qualitative data for both the inpatient and outpatient settings which see high volumes of patients and often struggle to capture high rates of patient feedback.4, 16, 18
Second, the narrative component of online reviews reveals themes driving ratings of SDTFs. Online reviews and ratings of healthcare services continue to grow and provide insights on gaps in quality metrics not captured by standardized surveys.4, 14, 19 Our analysis reveals themes not otherwise identified using the current approach. Online reviews reveal patient priorities, with themes such as “focus on recovery” and “compassionate care.” Over two-thirds of ratings had an accompanying narrative text review, and these were lower on average than ratings with no additional text. This suggests comment-based reviews offer a trove of information for facilities to learn directly from the patient experiences that may be driving lower overall ratings. The positive and negative themes derived from patient reviews identify components of care important to patients but missed by standard approaches, including “poor phone communication” and “life-changing experiences.” As the opioid epidemic continues to surge across the country, the number of individuals seeking treatment will likely increase. Patients, and their support systems, may lean on advice from their primary care providers to guide them. Themes elucidated from review data may aid primary care providers and provide additional insights to help tailor referrals for future patients.
Third, this study suggests an approach to guide a portion of the development of patient-centered quality metrics for SDTFs. Given that online patient review platforms reveal patient priorities organically, these themes reflect patient-centered concerns.3, 14, 20 These patient-centered concerns should not be the only measures of SDTF quality, but they should sit alongside important measures that are less visible to patients or less evaluable by patients, including structural measures such as staffing, process measures such as fidelity to best evidence, and outcome measures such as risk-adjusted success rates. Policy makers and regulators could incorporate themes from these reviews to help guide the creation of quality metrics for SUD treatment. Indeed, as SUD grows across the country, the need for adherence to evidence-based practice and high-quality programs remains important. This focus has become even more evident as advocacy groups, both public and private groups, such as Shatterproof,21 attempt to create and emphasize quality metrics.22
Both this approach and this study are subject to limitations. Organically derived reviews are unstructured in who completes them and in what is contributed. A typical criticism is that these reviews reflect the opinions of those who have something to say, rather than the broader overall population. Even so, organically derived reviews have the advantage of revealing themes without the pre-direction that structured responses create. Even if they may not be probabilistically representative, they can expand perspectives and, in this case, give voice to a patient population that often has little. That voice can inform the development of systematic surveys, revealing that organic and systematic assessments of quality are more complementary than in conflict. In addition, the opportunity to quickly analyze large amounts of quantitative and qualitative data may help support more in-depth interviews with individuals and focus groups. Unlike systematic surveys, online reviews are not standardized and do not offer protection from fraudulent responses. And while both online and systematic reviews are likely to favor those who most want to share their opinions, online reviews cater to those who seek out the opportunity to comment. Also, because these are open, public forums, any individual can post comments and reviews on Yelp or Google; thus, a facility may have reviews posted by users who were never actually cared for by the SDTF.
Indeed, we found that some facilities had few or no reviews, making them invisible to organic online processes, a challenge in evaluating online reviews of individual providers.5, 17, 18 Facilities with reviews and those without reviews may be fundamentally different, limiting the ability to extrapolate from available data. Furthermore, what constitutes a meaningful volume of reviews and ratings remains to be determined. More systematic review processes often oversample to correct for or anticipate sparse evaluations.
This study also has strengths. The use of machine learning approaches both automates and, in some ways, objectifies the search for meaning in unstructured narrative. The combination allows the inexpensive analysis of enormous amounts of textual data. The identified themes have immediate face validity, even if they were not the themes anticipated. Indeed, the ability to identify themes that only in retrospect seem relevant is a key strength of these non-hypothesis-driven approaches.
The nation’s management of SUD requires effective treatment and that, in turn, requires effective quality management. In lieu of quality ratings created and presented more systematically, current patients may use ratings contributed organically to online platforms such as Google and Yelp. While these rating platforms reflect biases, so do more systematic rating systems. This study reveals how data gathered from online review platforms can be aggregated and analyzed to identify themes relevant to patient satisfaction, and may provide a patient-centered approach to building components of quality metrics for specialized drug treatment facilities moving forward.
Rudd RA, Seth P, David F, Scholl L. Increases in drug and opioid-involved overdose deaths - United States, 2010–2015. MMWR Morb Mortal Wkly Rep. 2016;65(5051):1445–1452. doi:https://doi.org/10.15585/mmwr.mm655051e1
Volkow ND, Frieden TR, Hyde PS, Cha SS. Medication-assisted therapies--tackling the opioid-overdose epidemic. N Engl J Med. 2014;370(22):2063–2066. doi:https://doi.org/10.1056/NEJMp1402780
Ranard BL, Werner RM, Antanavicius T, et al. Yelp reviews of hospital care can supplement and inform traditional surveys of the patient experience of care. Health Aff Proj Hope. 2016;35(4):697–705. doi:https://doi.org/10.1377/hlthaff.2015.1030
Agarwal AK, Mahoney K, Lanza AL, et al. Online ratings of the patient experience: emergency departments versus urgent care centers. Ann Emerg Med. November 2018. doi:https://doi.org/10.1016/j.annemergmed.2018.09.029
Lee V. Transparency and trust — online patient reviews of physicians. N Engl J Med. 2017;376(3):197–199. doi:https://doi.org/10.1056/NEJMp1610136
Glover M, Khalilzadeh O, Choy G, Prabhakar AM, Pandharipande PV, Gazelle GS. Hospital evaluations by social media: a comparative analysis of Facebook ratings among performance outliers. J Gen Intern Med. 2015;30(10):1440–1446. doi:https://doi.org/10.1007/s11606-015-3236-3
Kellogg C, Zhu Y, Cardenas V, et al. What consumers say about nursing homes in online reviews. The Gerontologist. 2018;58(4):e273-e280. doi:https://doi.org/10.1093/geront/gny025
Werner RM, Konetzka RT, Polsky D. Changes in consumer demand following public reporting of summary quality ratings: an evaluation in nursing homes. Health Serv Res. 2016;51:1291–1309. doi:https://doi.org/10.1111/1475-6773.12459
Perraillon MC, Konetzka RT, He D, Werner RM. Consumer response to composite ratings of nursing home quality. Am J Health Econ. December 2017:1–36. doi:https://doi.org/10.1162/ajhe_a_00115
Understanding the Epidemic | Drug Overdose | CDC Injury Center. https://www.cdc.gov/drugoverdose/epidemic/index.html. Published August 30, 2017. Accessed September 19, 2019.
Scholl L. Drug and opioid-involved overdose deaths — United States, 2013–2017. MMWR Morb Mortal Wkly Rep. 2019;67. doi:https://doi.org/10.15585/mmwr.mm6751521e1
SAMHSA - Substance Abuse and Mental Health Services Administration. https://www.samhsa.gov/.
Factsheet. Yelp. https://www.yelp.com/factsheet. Accessed September 19, 2019.
Merchant RM, Volpp KG, Asch DA. Learning by listening-improving health care in the era of Yelp. JAMA. 2016;316(23):2483–2484. doi:https://doi.org/10.1001/jama.2016.16754
Blei DM, Ng AY, Jordan MI. Latent Dirichlet Allocation. J Mach Learn Res. 2003;3:993–1022.
Graves RL, Goldshear J, Perrone J, et al. Patient narratives in Yelp reviews offer insight into opioid experiences and the challenges of pain management. Pain Manag. 2018;8(2):95–104. doi:https://doi.org/10.2217/pmt-2017-0050
Lagu T, Hannon NS, Rothberg MB, Lindenauer PK. Patients’ evaluations of health care providers in the era of social networking: an analysis of physician-rating websites. J Gen Intern Med. 2010;25(9):942–946. doi:https://doi.org/10.1007/s11606-010-1383-0
Lagu T, Metayer K, Moran M, et al. Website characteristics and physician reviews on commercial physician-rating websites. JAMA. 2017;317(7):766–768. doi:https://doi.org/10.1001/jama.2016.18553
Merchant RM, Asch DA. Protecting the value of medical science in the age of social media and “fake news.” JAMA. November 2018. doi:https://doi.org/10.1001/jama.2018.18416
Agarwal AK, Pelullo AP, Merchant RM. “Told”: the word most correlated to negative online hospital reviews. J Gen Intern Med. February 2019. doi:https://doi.org/10.1007/s11606-019-04870-6
Stronger Than Addiction. https://www.shatterproof.org/. Accessed September 19, 2019.
Shatterproof to Build a Rating System for Addiction Treatment Programs. Shatterproof. https://www.shatterproof.org/press/shatterproof-build-rating-system-addiction-treatment-programs. Accessed September 19, 2019.
This study was financially supported by the Pennsylvania Department of Health – Commonwealth Universal Research Enhancement Program (CURE).
Conflict of Interest
The authors declare that they do not have a conflict of interest.
Agarwal, A.K., Wong, V., Pelullo, A.M. et al. Online Reviews of Specialized Drug Treatment Facilities—Identifying Potential Drivers of High and Low Patient Satisfaction. J GEN INTERN MED 35, 1647–1653 (2020). https://doi.org/10.1007/s11606-019-05548-9
- substance abuse
- treatment centers
- social media
- patient experience