
Journal of General Internal Medicine, Volume 33, Issue 4, pp 423–428

Keystrokes, Mouse Clicks, and Gazing at the Computer: How Physician Interaction with the EHR Affects Patient Participation

  • Richard L. Street Jr
  • Lin Liu
  • Neil J. Farber
  • Yunan Chen
  • Alan Calvitti
  • Nadir Weibel
  • Mark T. Gabuzda
  • Kristin Bell
  • Barbara Gray
  • Steven Rick
  • Shazia Ashfaq
  • Zia Agha
Original Research

Abstract

Background

Evidence is mixed regarding how physicians' use of the electronic health record (EHR) affects communication in medical encounters.

Objective

To investigate whether the different ways physicians interact with the computer (mouse clicks, key strokes, and gaze) vary in their effects on patient participation in the consultation, physicians’ efforts to facilitate patient involvement, and silence.

Design

Cross-sectional, observational study of video and event recordings of primary care and specialty consultations.

Participants

Thirty-two physicians and 217 patients.

Main Measures

Predictor variables included measures of physician interaction with the EHR (mouse clicks, key strokes, gaze). Outcome measures included active patient participation (asking questions, stating preferences, expressing concerns), physician facilitation of patient involvement (partnership-building and supportive talk), and silence.

Key Results

Patients were less active participants in consultations in which physicians engaged in more keyboard activity (b = −0.002, SE = 0.001, p = 0.02). More physician gaze at the computer was associated with more silence in the encounter (b = 0.21, SE = 0.09, p = 0.02). Physicians’ facilitative communication, which predicted more active patient participation (b = 0.65, SE = 0.14, p < 0.001), was not related to EHR activity measures.

Conclusions

Patients may be more reluctant to actively participate in medical encounters when physicians are more physically engaged with the computer (e.g., keyboard activity) than when their behavior is less demonstrative (e.g., gazing at the EHR). Using easy-to-deploy communication tactics (e.g., asking about a patient’s thoughts and concerns, social conversation) while working on the computer can help physicians engage patients as well as maintain conversational flow.

INTRODUCTION

The electronic health record (EHR) represents an additional participant in the medical consultation. While historically clinicians have reviewed and written notes in the paper chart during medical visits, interaction with the EHR is different in both degree and type. Through gaze, key strokes, and mouse clicks, clinicians enter data, tick boxes, switch tabs, and respond to information (e.g., notes, prompts, window events) that pops up on the screen. Previous research has focused on the effects of EHR use on how physicians communicate with patients and the potential consequences for quality health care. Early conventional wisdom assumed that working with the computer during consultations detracted from effective, patient-centered communication because the physician’s attention was directed to the EHR rather than to the patient. Indeed, some research indicates the more physicians use the EHR, the less likely they are to explore the patient’s agenda or ask about psychosocial issues.1,2 Gaze disproportionately directed to the computer rather than to the patient has been associated with less patient satisfaction,3 observer ratings of less effective physician communication,4 and poorer understanding of patient concerns.5

On the other hand, other research paints a more complicated picture. Some studies have found that the computer has little effect on patient satisfaction or communication about medical issues6 and that patients accept physician use of computers as just part of the doctor’s job.7 Patients of more experienced physicians have reported that the computer had less interpersonal impact in the consultation than did patients seeing more inexperienced physicians (e.g., residents).8 Another investigation found that more communicatively skilled physicians used the EHR as a resource to facilitate physician-patient communication, whereas less communicatively skilled doctors saw the computer as a detriment.9 Two recent reviews concluded that the EHR can facilitate biomedical information exchange in medical encounters, but may have a negative effect on collecting psychosocial information from the patient.10 , 11

While a number of studies have analyzed the effects of EHR activity on physicians’ communication, this investigation extends existing research in two respects. First, we examined how the physician’s use of the computer affects patient participation in the medical encounter, particularly with respect to the degree to which patients ask questions, express concerns, state preferences, and introduce topics to discuss. Patient involvement in clinical encounters is not only a cornerstone of patient-centered care10 and shared decision-making,12 , 13 it also can contribute to improved consultation outcomes such as more personalized treatment recommendations,14 better physician understanding of patient’s beliefs,15 and stronger patient commitment to treatment.16 , 17

On the one hand, physician interaction with the EHR may inhibit active patient participation because the patient may not want to disturb the doctor while he or she is reviewing and entering information in the computer. On the other hand, if the physician is not talking when using the EHR, the patient may use this as an opportunity to raise concerns or questions. Previous studies have produced mixed findings. For example, more physician computer activity has been associated with less patient participation,18 whereas another study found no effect of EHR use on the degree to which patients asked questions or expressed concerns.19 Two other investigations reported that some, but not all, patients use the physician’s silence when working on the computer as an opportunity to speak.9 , 20

Second, because physicians interact with the EHR in different ways (e.g., gaze, keyboard activity, mouse clicks and scrolls), this investigation also examined whether these behaviors have differential effects on communication in the consultation.21 For example, does typing on the keyboard influence patient participation differently than simply looking at the screen? Does the way physicians interact with the EHR affect their use of facilitative communication such as partnership-building (e.g., soliciting the patient’s agenda) and supportive talk (e.g., reassurance, encouragement)? These behaviors can effectively prompt more patient involvement in medical encounters.17,22–25 If EHR use is associated with less effort to solicit the patient’s agenda or questions, then computer activity could indirectly lessen patient involvement through its effect on physician communication.

METHODS

Research Setting and Participants

Physicians and patients were recruited from the VA San Diego Healthcare System and UCSD Healthcare. Physicians were recruited by email invitation and by publicizing the study at clinic meetings. We recruited 32 physicians across primary care, gastroenterology, pulmonology, cardiology, rheumatology, and nephrology at 5 clinics (3 VA, 2 UCSD). Patients of participating physicians were randomly recruited prior to their scheduled appointments. Physician and patient participants agreed to have their consultations video-recorded and EHR activity tracked. The research was approved by the VA San Diego Healthcare System and the UCSD Healthcare IRBs.

EHR Activity Measures

Physician interaction with the EHR was assessed via specialized usability software. EHR activity during the visit was captured using MORAE to track mouse and keyboard activity (with passwords masked) and to record the EHR display video. The display video makes it possible to read on-screen text and to manually code EHR activities. MORAE allows annotation of the mouse clickstream by highlighting in a spreadsheet and in the replay of the EHR display video. Video and audio streams were also captured via webcam and combined with EHR activity in MORAE.26

EHR mouse and key clicks were coded as discrete signals: point-event streams with a single timestamp per event (e.g., the mouse clickstream, keystrokes) and interval-event streams with two timestamps marking onset and offset (e.g., a speaking turn or an eye-gaze interval). Point events can be counted, while interval events, each spanning a definite duration, can be totaled over time. Coding schemas were developed to contextualize events to clinical workflow (e.g., orders, medication review, note taking). Physicians’ gaze (at computer, at patient, elsewhere) was coded based on video review; the start and end points of the different gaze states provided an interval stream. To assess quality, a second human coder independently coded a randomly selected 10% subsample of visits.
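The point-event versus interval-event distinction above can be sketched in a few lines. The timestamps below are invented for illustration; the actual MORAE data format is not reproduced here.

```python
# Point events (one timestamp each) are counted; interval events
# (onset, offset pairs) have their durations totaled.

point_events = [3.1, 4.7, 4.9, 12.0]          # e.g., mouse-click timestamps (s)
interval_events = [(0.0, 8.5), (10.0, 14.0)]  # e.g., gaze-at-EHR intervals (s)

click_count = len(point_events)                             # count of point events
gaze_total = sum(off - on for on, off in interval_events)   # total interval time (s)

print(click_count)  # 4
print(gaze_total)   # 12.5
```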

Communication Measures

The patient’s active participation was coded using the Active Patient Participation Coding System (APPC), which has been validated in previous studies.22,23,27–31 Active patient participation includes three types of utterances—asking questions, assertive comments (e.g., making a request, stating preferences), and expressions of concern (e.g., worry, fear). These are considered ‘active’ forms of communication because they explicitly interject the patient’s perspective into the conversation and influence physician behavior. The unit of analysis for coding was the utterance, the oral analog of a simple sentence.30,32 The measure is scored as the number of the patient’s active participation utterances during the interaction.

Two types of communication behaviors were coded as physicians’ facilitative communication, partnership-building (e.g., asking about a patient’s concerns, soliciting questions) and supportive talk (e.g., reassurance, empathy, encouragement). These behaviors were also coded using the APPC coding system and represent how often a physician produced utterances aimed at facilitating patient participation.

We also measured silence, defined as the amount of time in the consultation when neither patient nor physician was talking. We included this variable because of research indicating that more physician EHR activity is associated with more silence33 and thus less verbal communicative activity.

For the APPC coding, four undergraduate research assistants participated in four 2-h training sessions. Using InqScribe event recording technology, coders independently listened to audio-recordings of the consultations and marked (time stamped) where a behavior of interest occurred. The output of the coding was a list of coded behaviors, the frequency of which was summed to create a score for each interactant/consultation (e.g., number of active patient participation behaviors; the frequency of physician facilitative responses). To assess reliability, 15% of the visits (n = 33) were coded by more than one coder with agreement assessed by intraclass correlation coefficients (ICCs = 0.89 and 0.75 for active patient participation and physician facilitative communication, respectively).
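For readers unfamiliar with the reliability statistic reported above, a one-way random-effects intraclass correlation, ICC(1,1), can be computed from the between-visit and within-visit mean squares. The score matrix below is invented, and the study does not specify which ICC variant or software was used; this is an illustrative sketch only.

```python
def icc_1_1(scores):
    """One-way random-effects ICC(1,1) for a targets x raters score matrix."""
    n = len(scores)        # targets (double-coded visits)
    k = len(scores[0])     # raters (coders) per target
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    # Between-target and within-target mean squares
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2 for row, m in zip(scores, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical counts of active-participation behaviors from two coders
ratings = [[9, 8], [4, 5], [12, 11], [6, 6], [10, 9]]
print(round(icc_1_1(ratings), 2))  # 0.95
```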

Coders used ChronoViz software to code sound-silence sequences. The coding process involved listening to the audio-recording, focusing on one interactant (e.g., the patient), pressing a key on the keyboard when that interactant started talking, and releasing it when the interactant stopped. This continued over the course of the interaction, creating a sequence of ones (talking) and zeros (not talking) sampled every 500 ms. After completing coding for one interactant, the coder listened to the audio-recording again and repeated the procedure for the other interactant (e.g., the physician). The two data streams were overlaid on one another to create time intervals associated with four states: physician talking, patient talking, both talking simultaneously, and neither talking (silence).4
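The overlay step described above can be sketched as follows. The two binary streams below are invented, with one sample per 500 ms; the ChronoViz output format itself is not reproduced.

```python
SAMPLE_S = 0.5  # each sample covers 500 ms

# 1 = talking, 0 = not talking, one value per 500 ms sample (made-up data)
physician = [1, 1, 0, 0, 1, 0, 0, 1]
patient   = [0, 0, 0, 1, 1, 0, 0, 0]

# Combine the two streams into the four conversational states
states = []
for p, q in zip(physician, patient):
    if p and q:
        states.append("both")
    elif p:
        states.append("physician")
    elif q:
        states.append("patient")
    else:
        states.append("silence")

silence_s = states.count("silence") * SAMPLE_S  # total silence in seconds
print(silence_s)  # 1.5
```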

Data Analysis

Descriptive statistics were used to summarize patient and provider characteristics (mean, standard deviation, median, and range for continuous variables; frequency and percentage for categorical variables) at the visit level as well as at the provider level. We assessed the relationships of the three communication measures with visit length, the number of EHR mouse clicks and wheels, the number of EHR keyboard clicks, and the duration of physician gaze at the EHR during the visit.

To account for the clustering of patients within physicians, linear mixed effects models were used. Bivariate association analyses were performed to examine the association between each measure of physician interaction with the EHR and the communication outcomes. Because visit length was a potential confounder, we controlled for visit length in the analyses. We also examined the bivariate association between each patient and provider characteristic and the outcomes. Variables significant at p < 0.15 in the bivariate analyses were included in the multivariable analysis, and variables with p < 0.10 were kept in the final models. For active patient communication, we also included physician facilitative communication as a potential covariate in the multivariable model. The normality assumption for residuals in the linear mixed effects models was examined using normal probability plots. All analyses were performed using the statistical software R.

RESULTS

Patient and Provider Characteristics

The final sample consisted of 217 visits from 32 physicians. Patient and physician characteristics are listed in Table 1 at the visit level and the physician level. Among these visits, 53% were from UCSD and 47% from the VA San Diego Healthcare System (VASD). The average patient age was 60.6 years (SD = 16.7), 65.9% of patients were male, most were Caucasian (73.8%), and 10% were African American. Specialist care visits made up 43% of the sample. The 32 physicians had an average of 11.1 (SD = 7.74) years of experience and an average of 7.67 (SD = 3.63) years of experience with the EHR; slightly more were male (59.4%) and in primary care (53.1%).
Table 1
Patient and Provider Characteristics

Visit characteristics (N = 217)
 Site, n (%): UCSD 115 (53.0); VASD 102 (47.0)
 Patients
  Age (years): mean (SD) 60.6 (16.7); median (IQR) 63 (52–71)
  Gender, n (%): male 143 (65.9); female 74 (34.1)
  Race and ethnicity, n (%): Caucasian 149 (73.8); African American 21 (10.4); others 32 (15.8)
 Providers
  Years in institution: mean (SD) 11.6 (7.83); median (Q1, Q3) 11 (5–17)
  Years with EMR: mean (SD) 7.93 (3.67); median (Q1, Q3) 8 (5–11)
  Gender, n (%): male 125 (57.6); female 92 (42.4)
  Specialty, n (%): primary care 123 (56.7); specialist 94 (43.3)

Provider characteristics (n = 32)
 Years in institution: mean (SD) 11.1 (7.94); median (Q1, Q3) 8 (5–16.5)
 Years with EMR: mean (SD) 7.67 (3.63); median (Q1, Q3) 8 (5–9)
 Gender, n (%): male 19 (59.4); female 13 (40.6)
 Specialty, n (%): primary care 17 (53.1); specialist 15 (46.9)
 Site, n (%): UCSD 15 (46.9); VASD 17 (53.1)

(Q1, Q3): the first and third quartiles

Visit Length, Physician Interaction with EHR, and Communication Variables

The mean visit length was 20.3 min (SD = 10.5). On average, each visit had 216 (SD = 174) mouse clicks or wheels and 729 (SD = 768) keyboard clicks. Physicians spent an average of 8.9 (SD = 6.32) min gazing at EHR (Table 2). Mouse clicks, gaze at the EHR, and keystrokes were moderately correlated with one another (range, 0.46 to 0.58). Per consultation, patients averaged just over eight (SD = 5.62) active participation behaviors, and physicians used facilitative verbal communication an average of 4.28 (SD = 3.25) times. On average, each visit had 5.72 (SD = 4.09) min of silence (Table 2).
Table 2
Physician EHR Usage, Gaze Time at EHR, and Communication Outcomes

                                        Mean (SD)     Median (Q1, Q3)
Visit length (min)                      20.3 (10.5)   18.4 (13.1–24.9)
Physician EHR activity
 EHR mouse click/wheel count            216 (174)     172 (92–297)
 Keyboard key strokes                   729 (768)     449 (109–1136)
 Gaze time at EHR (min)                 8.9 (6.32)    7.44 (4.98–11.3)
Communication
 Active patient communication           8.27 (5.62)   8 (4–11)
 Physician facilitative communication   4.28 (3.25)   4 (2–6)
 Silence (min)                          5.72 (4.09)   4.7 (2.7–7.47)

Active Patient Participation

Controlling for visit length, more keyboard clicks (b = −0.002, p = 0.02) were associated with less active patient communication, whereas longer visits (b = 0.30, p < 0.001) were associated with more active patient participation. In multivariable analysis (see Table 3), we found that keyboard key strokes and visit length were still significant. We also found that provider facilitative communication (b = 0.65, p < 0.001) was predictive of active patient communication.
Table 3
Multivariable Analysis for Association with Communication Outcomes

Multivariable mixed effects model       Coefficient (b)   SE      P-value
Active patient communication
 Keyboard key strokes                   −0.002            0.001   0.02
 Visit length                           0.33              0.05    < 0.001
 Provider facilitative communication    0.65              0.14    < 0.001
Provider facilitative communication
 Visit length                           0.08              0.02    < 0.001
 Site (VASD vs. UCSD)                   2.75              0.69    < 0.001
 Patient age                            0.02              0.01    0.06
Silence
 Gaze time at EHR                       0.21              0.09    0.02
 Visit length                           0.18              0.05    < 0.001

Physicians’ Facilitative Communication

In bivariate analyses, physicians’ facilitative communication was more frequent at the VASD (b = 3.13 for VASD vs. UCSD, p = 0.001), with female patients (b = 1.07, p = 0.046), and with older patients (b = 0.03, p = 0.04). Controlling for visit length, we did not see any significant associations among mouse clicks, keystrokes, or gaze time and physicians’ facilitative verbal communication. In multivariable analysis, visit length (b = 0.08, p < 0.001), site (b = 2.75 for VASD vs. UCSD, p < 0.001), and patient age (b = 0.02, p = 0.06) were still significant predictors of physicians’ facilitative communication (Table 3).

Silence

In bivariate analyses, VASD visits were found to have more silence (b = 2.25 for VASD vs. UCSD, p = 0.03). Longer visit length (b = 0.28, p < 0.001) was significantly associated with more silence, and controlling for visit length, more gaze time at the EHR (b = 0.21, p = 0.02) was associated with more silence. In multivariable analysis, visit length (b = 0.18, p < 0.001) and gaze time (b = 0.21, p = 0.02) were still significant.

DISCUSSION

This investigation examined whether the different ways physicians interact with the EHR (gaze, key strokes, mouse clicks) affect patient participation in the encounter, physicians’ use of communication that facilitates patient involvement, and silence. Several findings were noteworthy and have important implications for future research and clinical practice.

First, although the EHR activity variables were moderately correlated with one another, each had differential effects on communication. More key strokes predicted lower levels of active patient participation behaviors (e.g., asking questions, expressing concerns). Gaze at the computer was associated with more silence during the encounter, which in turn predicted less patient involvement. While the evidence is mixed regarding whether physician EHR activity affects the physician-patient relationship,10 our findings suggest that how physicians interact with the computer can have differential effects on the flow of the conversation in ways that may limit patient participation.33

Of the study variables assessing physician computer activity, keystrokes are arguably the most physically ‘active’ form of interaction (hand, arm, and finger movements) compared to gaze and mouse clicks. When physicians are typing, patients may not want to interrupt the doctor’s work.34 In addition, physicians may stop talking when looking at the information on the screen. Some patients, in turn, may remain quiet so as not to disturb the physician, thus contributing to mutual silence.18 Our findings are somewhat inconsistent with those of Noordman et al.,19 who found no significant effects of different types of EHR use on physician and patient communicative activity. However, that study coded physician computer activity as ‘yes’ or ‘no’ with respect to specific computer tasks (e.g., using the computer to search for or read something while talking to the patient, or to prescribe something), not the degree of different kinds of EHR activity, which is what the present investigation examined.

Second, consistent with existing research,22 , 23 , 35 physicians’ use of facilitative communication (e.g., partnership-building, supportive talk) was associated with more active patient participation (even when controlling for visit length). However, the physicians’ EHR activity did not predict their degree of facilitative communication. The effects of physician interaction with EHR on communication in the consultation likely depend on the physician’s communication skills, experience, and multitasking abilities.9 , 10 Some doctors have internalized conversational routines that maintain the flow of the conversation even when interacting with the EHR (e.g., small talk, asking the patient a question).34 These routines also may include simple facilitative responses such as “any other concerns?” and “how have things been going otherwise?” while simultaneously working with the computer. Although physician gaze at the computer has predicted less patient satisfaction3 and, in this investigation, more silence, silence can also function as a facilitator of patient participation.20 , 36 Thus, when patients use silence as an opportunity to speak, physicians may need to stop computer activity, look at the patient, and focus on what he or she is saying.37

Finally, future research should explore what types of EHR screen content could be used to facilitate communication. For example, charts, graphs, and images in the EHR lend themselves to screen sharing that could be used for patient education and shared decision-making.20,37 Also, given that important issues clinicians and patients discuss in the consultation often are not documented in the EHR,38 future research should examine how data entry into the EHR differs when clinicians enter data during the consultation versus after the patient has left.

Practice Implications

Patient participation in medical care conversations is fundamental to the delivery of patient-centered care. While patients generally accept the normalcy of physician EHR use in medical visits, this investigation demonstrates that some of the ways physicians interact with the EHR could make patients reluctant to express concerns, ask questions, or talk as the doctor is typing or looking at the screen. Models exist for teaching clinicians communicative practices and consultation management techniques that both maintain engagement with the patient and enable task completion in the EHR.37 One strategy is sign-posting, in which the clinician tells the patient what he or she is doing (e.g., “let me look at these lab figures here,” “I am going to make a note of that in here”). A second strategy is to use simple but powerful partnership-building responses that prompt patients to talk (e.g., “anything else going on?” “Other concerns?”).38 Lastly, as noted earlier, screen sharing may be a means for physicians not only to work on EHR tasks but also to share information with patients in ways that enhance patient engagement.

Limitations and Conclusions

In terms of limitations, this was a cross-sectional, observational investigation that did not collect outcome data (e.g., patient satisfaction, adherence) or assess other communication processes that are essential to quality medical care (e.g., quality of information exchange, shared decision-making). Future research should continue to explore these issues as well as how EHR activity and elements of clinician-patient communication have meaning and value from the perspective of patients. Moreover, our EHR activity and communication measures were computed at the interaction level and not within specific time intervals. Thus, our findings represent correlational and not sequential patterns. Time stamped coding, coupled with time series regression39 or lag sequential analyses,36 would allow future research to identify whether a type of EHR activity (e.g., keyboard activity) at a specific time point was subsequently followed by certain communicative actions (e.g., patient silence, social conversation). Limitations notwithstanding, this study demonstrates that the multidimensionality of physician EHR activity can have differential effects on patients’ communication, particularly regarding behaviors that shed light on the patient’s concerns and informational needs. Future research should further explore the effects of different ways physicians nonverbally interact with the computer as well as attend to co-occurring verbal behaviors physicians use to engage the patient and maintain the flow of conversation.

Notes

ACKNOWLEDGEMENTS

This work was supported by the Agency for Healthcare Research and Quality R01HS021290. This material is the result of work supported with resources of the VA San Diego Healthcare System. The contents do not represent the views of the US Department of Veterans Affairs or the United States Government. Richard Street also is supported by the Houston VA Center for Innovations in Quality, Effectiveness and Safety (CIN 13-413).

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they do not have a conflict of interest.

REFERENCES

  1. Makoul G, Curry RH, Tang PC. The use of electronic medical records: communication patterns in outpatient encounters. J Am Med Inform Assoc. 2001;8(6):610-5.
  2. Margalit RS, Roter D, Dunevant MA, Larson S, Reis S. Electronic medical record use and physician-patient communication: an observational study of Israeli primary care encounters. Patient Educ Couns. 2006;61(1):134-41.
  3. Farber NJ, Liu L, Chen Y, et al. EHR use and patient satisfaction: what we learned. J Fam Pract. 2015;64(11):687-96.
  4. Street RL Jr, Liu L, Farber NJ, et al. Provider interaction with the electronic health record: the effects on patient-centered communication in medical encounters. Patient Educ Couns. 2014;96(3):315-9.
  5. Bensing JM, Kerssens JJ, van der Pasch M. Patient-directed gaze as a tool for discovering and handling psychosocial problems in general practice. J Nonverbal Behav. 1995;19:223-42.
  6. Hsu J, Huang J, Fung V, Robertson N, Jimison H, Frankel R. Health information technology and physician-patient interactions: impact of computers on communication during outpatient primary care visits. J Am Med Inform Assoc. 2005;12(4):474-80.
  7. Brownbridge G, Herzmark GA, Wall TD. Patient reactions to doctors' computer use in general practice consultations. Soc Sci Med. 1985;20(1):47-52.
  8. Rouf E, Whittle J, Lu N, Schwartz MD. Computers in the exam room: differences in physician-patient interaction may be due to physician experience. J Gen Intern Med. 2007;22(1):43-8.
  9. Frankel R, Altschuler A, George S, et al. Effects of exam-room computing on clinician-patient communication: a longitudinal qualitative study. J Gen Intern Med. 2005;20(8):677-82.
  10. Shachak A, Reis S. The impact of electronic medical records on patient-doctor communication during consultation: a narrative literature review. J Eval Clin Pract. 2009;15(4):641-9.
  11. Rathert C, Mittler JN, Banerjee S, McDaniel J. Patient-centered communication in the era of electronic health records: what does the evidence say? Patient Educ Couns. 2017;100(1):50-64.
  12. Charles C, Gafni A, Whelan T. Shared decision-making in the medical encounter: what does it mean? (or it takes at least two to tango). Soc Sci Med. 1997;44(5):681-92.
  13. Epstein RM, Street RL Jr. Patient-Centered Communication in Cancer Care: Promoting Healing and Reducing Suffering. Bethesda, MD: National Cancer Institute; 2007. NIH Publication No. 07-6225.
  14. Golin CE, DiMatteo MR, Gelberg L. The role of patient participation in the doctor visit: implications for adherence to diabetes care. Diabetes Care. 1996;19(10):1153-64.
  15. Street RL Jr, Haidet P. How well do doctors know their patients? Factors affecting physician understanding of patients' health beliefs. J Gen Intern Med. 2011;26(1):21-7.
  16. Loh A, Leonhart R, Wills CE, Simon D, Harter M. The impact of patient participation on adherence and clinical outcome in primary care of depression. Patient Educ Couns. 2007;65(1):69-78.
  17. Parchman ML, Zeber JE, Palmer RF. Participatory decision making, patient activation, medication adherence, and intermediate clinical outcomes in type 2 diabetes: a STARNet study. Ann Fam Med. 2010;8(5):410-7.
  18. Gibbings-Isaac D, Iqbal M, Tahir MA, Kumarapeli P, de Lusignan S. The pattern of silent time in the clinical consultation: an observational multichannel video study. Fam Pract. 2012;29(5):616-21.
  19. Noordman J, Verhaak P, van Beljouw I, van Dulmen S. Consulting room computers and their effect on general practitioner-patient communication. Fam Pract. 2010;27(6):644-51.
  20. Ratanawongsa N, Matta GY, Lyles CR, et al. Multitasking and silent electronic health record use in ambulatory visits. JAMA Intern Med. 2017;177(9):1382-5.
  21. Calvitti A, Hochheiser H, Ashfaq S, et al. Physician activity during outpatient visits and subjective workload. J Biomed Inform. 2017;69:135-49.
  22. Street RL Jr, Gordon HS, Ward MM, Krupat E, Kravitz RL. Patient participation in medical consultations: why some patients are more involved than others. Med Care. 2005;43(10):960-9.
  23. Zandbelt LC, Smets EM, Oort FJ, Godfried MH, de Haes HC. Patient participation in the medical specialist encounter: does physicians' patient-centred communication matter? Patient Educ Couns. 2007;65(3):396-406.
  24. Arora NK. Interacting with cancer patients: the significance of physicians' communication behavior. Soc Sci Med. 2003;57(5):791-806.
  25. Martin LR, Jahng KH, Golin CE, DiMatteo MR. Physician facilitation of patient involvement in care: correspondence between patient and observer reports. Behav Med. 2003;28(4):159-64.
  26. Weibel N, Rick S, Emmeneger C, Ashfaq S, Agha Z. LAB-IN-A-BOX: semi-automatic tracking of activity in the medical office. Pers Ubiquitous Comput. 2015;19(2):317-34.
  27. Ward MM, Sundaramurthy S, Lotstein D, Bush TM, Neuwelt CM, Street RL Jr. Participatory patient-physician communication and morbidity in patients with systemic lupus erythematosus. Arthritis Rheum. 2003;49(6):810-8.
  28. Eggly S, Penner LA, Greene M, Harper FW, Ruckdeschel JC, Albrecht TL. Information seeking during "bad news" oncology interactions: question asking by patients and their companions. Soc Sci Med. 2006;63(11):2974-85.
  29. Gordon HS, Street RL Jr, Sharf BF, Souchek J. Racial differences in doctors' information-giving and patients' participation. Cancer. 2006;107(6):1313-20.
  30. Street RL Jr, Millay B. Analyzing patient participation in medical encounters. Health Commun. 2001;13(1):61-73.
  31. Street RL Jr, Gordon HS. Companion participation in cancer consultations. Psychooncology. 2008;17(3):244-51.
  32. Roter D, Larson S. The Roter interaction analysis system (RIAS): utility and flexibility for analysis of medical interactions. Patient Educ Couns. 2002;46(4):243-51.
  33. Dowell A, Stubbe M, Scott-Dowell K, Macdonald L, Dew K. Talking with the alien: interaction with computers in the GP consultation. Aust J Prim Health. 2013;19(4):275-82.
  34. Booth N, Robinson P, Kohannejad J. Identification of high-quality consultation practice in primary care: the effects of computer use on doctor-patient rapport. Inform Prim Care. 2004;12(2):75-83.
  35. Schmid MM, Hall JA, Roter DL. Caring and dominance affect participants' perceptions and behaviors during a virtual medical visit. J Gen Intern Med. 2008;23(5):523-7.
  36. Eide H, Quera V, Graugaard P, Finset A. Physician-patient dialogue surrounding patients' expression of concern: applying sequence analysis to RIAS. Soc Sci Med. 2004;59(1):145-55.
  37. Duke P, Frankel RM, Reis S. How to integrate the electronic health record and patient-centered communication into the medical visit: a skills-based approach. Teach Learn Med. 2013;25(4):358-65.
  38. Rosenthal DI. Instant replay. Healthc (Amst). 2013;1:52-4.
  39. Street RL Jr, Buller DB. Patients' characteristics affecting physician-patient nonverbal communication. Hum Commun Res. 1988;15(1):60-90.

Copyright information

© Society of General Internal Medicine 2017

Authors and Affiliations

  • Richard L. Street Jr (1, 2, 3)
  • Lin Liu (4, 5)
  • Neil J. Farber (4)
  • Yunan Chen (6)
  • Alan Calvitti (5)
  • Nadir Weibel (4)
  • Mark T. Gabuzda (5)
  • Kristin Bell (5)
  • Barbara Gray (5)
  • Steven Rick (5)
  • Shazia Ashfaq (5)
  • Zia Agha (4, 7)

  1. Department of Communication, Texas A&M University, College Station, USA
  2. Department of Medicine, Baylor College of Medicine, Houston, USA
  3. Houston VA Center for Innovations in Quality, Effectiveness and Safety, Houston, USA
  4. University of California San Diego, San Diego, USA
  5. VA San Diego Healthcare System, San Diego, USA
  6. University of California Irvine, Irvine, USA
  7. West Health, San Diego, USA