This survey (available as Supplemental Material) was jointly designed by HealthiVibe, LLC (Arlington, VA, USA), Epstein Health, Trevena, Inc. (the study sponsor), and investigators with content expertise (PGW and TJG). The survey received institutional review board approval and is reported in a manner consistent with the CHERRIES guidelines [23, 24]. This article does not contain any studies with human participants or animals.
US-based physicians participated in an online survey using a Health Insurance Portability and Accountability Act–compliant web survey platform (Amazon Web Services). Participants were identified from an opt-in healthcare database built from physician communities, continuing medical education sites, and clinical content portals. Invitations to participate were sent via e-mail, which included a link to the survey website with a disclaimer that the results would be kept confidential. To achieve the desired response rate, the first invitation e-mail was followed by a second invitation, sent within 7 days of the first. Before starting the survey, all respondents provided consent by completing a screening item confirming their agreement to participate. Each respondent could submit only one completed survey. Upon completion, survey respondents received a confirmatory “thank you” note via e-mail. Respondents also received modest compensation for completing the survey in its entirety; this was considered necessary for participant recruitment and to encourage completion of the entire survey. Neither the survey nor the invitation included any reference to the sponsor.
There was a short screening process, expected to take no more than 5 min to complete, during which physicians responded to questions regarding their willingness to participate in the core survey content and to confirm that they were eligible according to the inclusion/exclusion criteria. To be eligible, physicians must have practiced within one of the required medical specialties: surgery, anesthesiology, critical care/emergency medicine, or hospitalist. Surgeons must have completed ≥ 1 of 27 listed surgical procedures in six surgical categories (general or bariatric, cardiothoracic, colorectal, orthopedic or neurosurgery, plastic or cosmetic, or vascular). At the time of this survey, all physicians must have been practicing in an academic medical center, community hospital, or ambulatory care setting, and been involved in one or more surgeries per month (e.g., surgeons must have been performing surgery, anesthesiologists must have been managing administration of IV pain medications). Surgical categories were chosen based on a clinical assessment of patient types and surgical procedures that were most likely to lead to moderate-to-severe pain and require IV analgesic medications following surgery.
The 47-question survey was designed to take an estimated 12–15 min to complete, incorporating screening, demographic, and core survey questions. The survey contained 40 screens: a landing page, 38 individual question screens, and a “thank you” page. After each answer, the respondent was directed to the next question. Respondents were unable to go back and change answers, to avoid introducing survey bias (later questions could otherwise influence respondents to change their previous answers). Respondents could not move forward in the survey without answering each question; in this way, unanswered questions were not permitted.
The survey contained questions elucidating the types of medication that physicians prescribe to manage acute postoperative pain, the factors that influence their postsurgical pain medication regimens, the challenges they face when treating acute postoperative pain, and their perceptions of unmet needs regarding current pain therapies. The survey also included demographic questions including physicians’ primary practice settings, years in clinical practice, and geographic locations.
The survey questions were presented in the following formats: questions that asked the respondent to choose one response from a defined list of possible statements; questions that asked the respondent to choose multiple responses (e.g., “Select all that apply”); statements where the respondent was asked to indicate agreement or disagreement using a Likert scale of 1–5 (1 = “Strongly disagree” to 5 = “Strongly agree”); dichotomous questions (yes/no); and statements where the respondent was asked to indicate the frequency with which they engage in an activity (1 = Very often, 2 = Often, 3 = Sometimes, 4 = Never, 5 = Not sure/not applicable). The presentation of response statements within a question set was rotated in an attempt to reduce bias.
Each survey question was designed to elicit a specific insight, contributing to an overall understanding of current practice patterns and treatment challenges experienced by physicians managing patients’ acute postoperative pain.
When designing the study, the key outcome of interest was the identification of factors influencing physicians’ choice of IV pain medications. This was specifically addressed in two survey items (see Supplemental Material)—“Please select the top three IV pain medications that you are most likely to prescribe for your patients who experience moderate-to-severe pain immediately following surgery” and “What clinical practice-related factors most determine which IV pain medications you prescribe for postsurgical patients you care for during their hospitalization?”
The supportive outcomes of the study were the factors that influenced practice patterns and the most pressing unmet needs for physicians managing acute postoperative pain; these were addressed with several questions.
Personally identifiable information was not collected as part of the survey, and all survey data were collected, de-identified, and stored by the third-party platform. The host platform passed an internal identifier (along with security tokens) to the survey. Once a survey was completed, this status and the internal identifier were communicated to the panel platform. Similarly, if a respondent was disqualified from the survey, this was also communicated to the panel platform. Access to the survey was therefore restricted at the platform level. The platform identifier was stored with the survey results while the survey was live, and used to verify that no duplicate completions existed in the data.
The data cleaning tool within the survey platform was used to identify “speedy” responses (≤ 3 s per question), which were quarantined and excluded from the sample. The host platform only recorded visitors who completed ≥ 1 question and did not provide data to calculate the view rate, participation rate, or completion rate.
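The “speedy” response rule described above can be illustrated with a minimal sketch. The field names (`total_time_s`, `questions_answered`) and the per-record structure are hypothetical assumptions for illustration; the actual survey platform’s data-cleaning tool and data model are not documented here.

```python
# Illustrative sketch of the "speedy" response filter: responses averaging
# <= 3 s per question are quarantined and excluded from the sample.
# Field names are hypothetical, not the platform's actual schema.

SPEED_THRESHOLD_S = 3


def partition_responses(responses):
    """Split completed responses into retained and quarantined lists."""
    retained, quarantined = [], []
    for r in responses:
        avg_s_per_question = r["total_time_s"] / r["questions_answered"]
        if avg_s_per_question <= SPEED_THRESHOLD_S:
            quarantined.append(r)  # flagged as a "speedy" response
        else:
            retained.append(r)
    return retained, quarantined


# Example usage with hypothetical records:
sample = [
    {"id": "A", "total_time_s": 720, "questions_answered": 40},  # 18 s/question
    {"id": "B", "total_time_s": 100, "questions_answered": 40},  # 2.5 s/question
]
kept, dropped = partition_responses(sample)
```

Here respondent “A” (18 s per question) would be retained and respondent “B” (2.5 s per question) quarantined, mirroring the exclusion rule described in the text.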
The inclusion of approximately 500 physicians was considered a feasible and adequate sample size that could accurately reflect the current views among treating physicians. During the recruitment process, invitations targeted surgeons, anesthesiologists, and critical care/emergency medicine physicians or hospitalists in proportions of 6:3:1, reflecting the breadth of physicians who commonly participate in the management of moderate-to-severe acute postoperative pain requiring IV analgesic medications. The eligibility of the respondents was checked at the time of initial screening, so that only those meeting the criteria could complete the survey.
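The 6:3:1 recruitment quota applied to the target of approximately 500 respondents works out as follows. The target and the ratios are from the text; the allocation helper itself is a hypothetical sketch, not the recruiters’ actual tooling.

```python
# Illustrative arithmetic for the 6:3:1 recruitment quota over ~500 physicians.
# The helper function is a hypothetical sketch for clarity.


def allocate_quota(total, ratios):
    """Allocate `total` invitations proportionally to integer `ratios`."""
    weight = sum(ratios)
    counts = [total * r // weight for r in ratios]
    counts[0] += total - sum(counts)  # assign any rounding remainder to the largest group
    return counts


# Order: surgeons, anesthesiologists, critical care/EM physicians or hospitalists
quota = allocate_quota(500, [6, 3, 1])  # → [300, 150, 50]
```

That is, a 6:3:1 split of 500 targets roughly 300 surgeons, 150 anesthesiologists, and 50 critical care/emergency medicine physicians or hospitalists.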
The aim of this study was to identify current practice patterns and treatment challenges; consequently, descriptive statistics were used to present and interpret the data.