Understanding interactions in face-to-face and remote undergraduate science laboratories: a literature review

  • Jianye Wei
  • David F. Treagust
  • Mauro Mocerino
  • Anthony D. Lucey
  • Marjan G. Zadnik
  • Euan D. Lindsay
Open Access
Review

Abstract

This paper reviews the ways in which interactions have been studied, and the findings of such studies, in science education in both face-to-face and remote laboratories. Guided by a systematic selection process, 27 directly relevant articles were analysed based on three categories: the instruments used for measuring interactions, the research findings on student interactions, and the theoretical frameworks used in the studies of student interactions. In face-to-face laboratories, instruments for measuring interactions and the characterisation of the nature of interactions were prominent. For remote laboratories, the analysis of direct interactions was found to be lacking. Instead, studies of remote laboratories were mainly concerned with their practical scope. In addition, it is found that only a limited number of theoretical frameworks have been developed and applied in the research design. Existent theories are summarised and possible theoretical frameworks that may be implemented in studies of interactions in undergraduate laboratories are proposed. Finally, future directions for research on the inter-relationship between student interactions and laboratory learning are suggested.

Keywords

Undergraduate learning · Undergraduate laboratories · Face-to-face laboratories · Remote laboratories

Abbreviations

CLEI: Chemistry Laboratory Environment Inventory

CoSIL: Computer Supported Inquiry Learning

ICP: Independent Chemistry Project

I-I: Indirect Interaction

LIPI: Laboratory Instructional Practices Inventory

LOPUS: Laboratory Observation Protocol for the Undergraduate STEM

MER: Model of Educational Reconstruction

MR-STBI: Modified-Revised Science Teacher Behaviour Inventory

QTI: Questionnaire on Teacher Interaction

RIOT: Real-time Instructor Observation Tool

S-E: Student-Equipment Interaction

S-I: Student-Instructor Interaction

S-S: Student-Student Interaction

TA: Teaching Assistant

TA-IOP: Teaching Assistant Inquiry Observation Protocol

TCL-GTA: Teaching as a Laboratory Graduate Teaching Assistant Program

Introduction

The laboratory is considered by many researchers to play an important role in science education (Hofstein & Lunetta, 2004; Johnstone & Al-Shuaili, 2001). With the increasing use of automation in higher education, two new forms of technology-augmented practical activity, simulated and remote laboratories, have become common alternatives or supplements to traditional face-to-face laboratories (De Jong, Linn, & Zacharia, 2013; Lindsay & Good, 2005). For each laboratory type, researchers have investigated multiple aspects, namely the new techniques or teaching strategies implemented in laboratories (Botero, Selmer, Watson, Bansal, & Kraft, 2016; Saxena & Satsangee, 2014), the description of learning objectives/outcomes for individual laboratories (Bright, Lindsay, Lowe, Murray, & Liu, 2008), and the comparison of traditional and technology-augmented laboratories (Brinson, 2015; Faulconer & Gruss, 2018; Ogot, Elliott, & Glumac, 2003). However, most comparisons of the three laboratory modes have focused on learning outcomes rather than on students’ learning processes and interactions. Furthermore, even though some simulated and remote laboratories were initially developed for distance learning, and interactions have been widely and systematically discussed in distance education, there is currently no overview of the literature on interactions in discipline-specific journals. This review addresses the characterisation of studies of interactions in hands-on/face-to-face and remote science laboratories. Simulated science laboratories are not covered because they do not entail the manipulation of physical equipment as a key activity in conducting the laboratory experiment.

Interactions have long been seen as important entities in science education. According to the theory of distributed cognition, learners’ performance and learning effects are significantly influenced by the interactions between the learner and the learning environment (Cole & Engeström, 1993; Nakhleh, Polles, & Malina, 2002). In science laboratories, the learning environment comprises elements such as the learners themselves, other learners, instructors, laboratory manuals, internet, equipment and/or computers. Four types of interactions, namely Student-Student Interaction (S-S), Student-Instructor Interaction (S-I), Student-Equipment Interaction (S-E) and Indirect Interaction (I-I) are common in science laboratories (Moore, 1989; Sutton, 2001). In other words, learners’ interactions occur with other students, the instructor, the equipment or vicariously (i.e. students listen to interactions in which they are not direct participants).

Traditionally, science laboratory equipment was manipulated directly by the student, meaning that the learner was in the same location as the other elements of the learning environment. However, remote laboratories have become more common, and computers and other technologies now play important roles in the learning process. In remote laboratories, the learners manipulate real equipment and the data result from that manipulation; nevertheless, the learner is physically separated from the equipment, and sometimes from the instructor and other learners as well. Researchers have convincingly argued that information technology has dramatically changed laboratory education (Scanlon, Morris, Di Paolo, & Cooper, 2002), although there may be disparities between the effectiveness of remote and traditional laboratories.

These disparities may be due to changes in learners’ interactions with the learning environment. To address this issue, findings are first summarised from publications in two fields: interactions in face-to-face science laboratories and interactions in remote science laboratories. Existing theories that have been used in the analysis of interactions are then reviewed, and other theoretical frameworks that may be applied to the analysis of interactions in the two types of science laboratories are presented. Finally, conclusions are drawn and implications for future work are discussed.

It is acknowledged that there have been various studies of interactions in secondary-school education; however, these are not included in this article because the focus of this review is on studies in undergraduate science laboratories in university education wherein students are at a markedly higher level of academic maturity.

Scope of the review and methods for the literature search

Focus questions for the review

To summarise the current state of research on the interactions occurring in undergraduate science laboratories, the main foci of this review are the methods employed in the studies, the main findings, and the theoretical foundations. These three categories can guide future research in analysing interactions in science laboratories. Regarding the instruments implemented, the analysis aims to provide a structured summary of how interactions in science laboratories have been measured. In addition, the presentation of key findings summarises current achievements and the questions still to be addressed. Finally, the theoretical frameworks can bridge educational psychology and studies of science laboratories. Specifically, this review is structured around these three categories, each guided by a focus question:
  1) Measurements: What tools have been used in the characterisation of interactions in undergraduate science laboratories?

  2) Results: What has been found in the research on interactions in undergraduate science laboratories?

  3) Theoretical framework: Which theories have been used in the research on interactions in undergraduate science laboratories?

Selection process of articles

To address the focus questions, systematic search methods were used. Two steps were involved in the collection of papers for this review. Initially, adapted from the research method used by Potvin and Hasni (2014), the ERIC database was searched by pre-defined criteria. However, because only a limited number of articles were found, a subsequent search in Google Scholar and within some journals was conducted.

Specifically, the search in ERIC was carried out on July 17, 2018 as follows: ‘laboratory interaction’ was set as the keyword to appear in abstracts or titles, and the subject areas were science and specific science subjects (biology, chemistry, physics, geology, astronomy and ecology). The codes for searching were:

(title: ‘laboratory interaction*’ OR abstract: ‘laboratory interaction*’) AND (title: (science* OR biolog* OR chemistr* OR physics OR geolog* OR astronom* OR ecolog*))
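For reproducibility, the query above can be assembled programmatically. The sketch below only reconstructs the search string as quoted; the field syntax (title:/abstract:) and wildcards follow the codes reported in this review:

```python
# Assemble the ERIC Boolean query quoted above from its components.
# The field labels and truncation wildcards are taken from the search codes
# reported in the text; this snippet does not query ERIC itself.
subjects = ["science*", "biolog*", "chemistr*", "physics",
            "geolog*", "astronom*", "ecolog*"]

phrase = "'laboratory interaction*'"
subject_clause = " OR ".join(subjects)
query = (f"(title: {phrase} OR abstract: {phrase}) "
         f"AND (title: ({subject_clause}))")
print(query)
```

Writing the query this way makes it easy to swap in other subject lists or phrases when replicating or extending the search.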

The publication years and the type of source (peer-reviewed journals and theses) were not constrained. The education levels were restricted to science education in postsecondary education, higher education, and two-year colleges, as these are the target groups of this review. This first step yielded 62 articles. After scanning the abstracts and titles, only 16 articles were retained. The excluded articles were about intermolecular forces at the atomic level (20), studies not in science (1), more general concepts of interaction (19), or simulation laboratories (5). One journal article closely related to the first author’s thesis was retained while the thesis itself was removed.

Since only one article about interactions in remote laboratories was found in ERIC, a further search was conducted in Google Scholar (2018), using the keywords: “laboratory interaction” undergraduate remote science. The time period was set as 1996–2018, because 1996 is considered the year when remote-access learning began to be implemented in engineering education (Aktan, Bohus, Crowl, & Shor, 1996). Of the 72 results returned, after scanning the titles and abstracts, only one focused on interactions in remote science undergraduate laboratories. The other publications were either about interactions in science in educational contexts (58), about virtual laboratories (4), or did not match the predefined age group (8). These results showed that, compared to face-to-face laboratories, there were few studies directly related to interactions in remote laboratories, or at least this was not clearly shown in the titles and/or abstracts.

Because only a small number of relevant works had been found using the foregoing searches, the database was changed from Google Scholar to journal websites, and the focus was broadened to the remote laboratory itself rather than interactions alone. The Boolean keyword phrases used in the search included “remote laboratory/experiment” and “real laboratory/experiment”. As remote laboratories have only recently been integrated into the science undergraduate curriculum, the year of publication was constrained to the last 10 years, from 2009 to 2018. The journals included Computers & Education, Computers in Human Behavior, and Distance Education. These journals were chosen because they are dedicated to research on the use of advanced technologies from educational and/or psychological perspectives; in addition, because remote laboratories have been used in distance education, Distance Education was searched. Among the 726 articles found, the following criteria were applied to filter the literature by scanning the titles and reading the abstracts: (1) only articles focusing on remote laboratories were retained; for example, some articles were excluded because their foci were on simulated laboratories (Scalise et al., 2011; Stang, Barker, Perez, Ives, & Roll, 2016); (2) articles on engineering-related education rather than science were excluded (Corter, Esche, Chassapis, Ma, & Nickerson, 2011; Kamruzzaman, Wang, Jiang, He, & Liu, 2015); (3) articles that did not address education-related problems were excluded (Botero et al., 2016; de la Torre, Guinaldo, Heradio, & Dormido, 2015; Orduña et al., 2015); (4) only research papers were retained, while review articles and book chapters were not included; and (5) articles that were not related to laboratory learning but to the broader field of distance education were excluded. After this selection, only four articles were retained.

Since only 22 articles (16 + 1 + 1 + 4) about interactions in face-to-face and remote laboratories had been retained, a final search was made in both areas: the reference lists of the chosen articles, and the publications citing them, were searched. As a result, a total of 21 papers on interactions in face-to-face laboratories and 6 articles on interactions in remote laboratories were selected. This low total may indicate a current shortage of research on student interactions in university laboratory learning.
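As a quick check, the tallies from the successive searches can be reproduced in a few lines. The stage labels below are ours; every count comes from the text of this section:

```python
# Sanity check of the article-selection tallies reported in this section.
eric_hits = 62
eric_excluded = 20 + 1 + 19 + 5       # atomic-level, non-science, general, simulation
eric_face_to_face = eric_hits - eric_excluded - 1   # minus the duplicated thesis
print(eric_face_to_face)              # 16 articles retained from ERIC

# 16 + 1 + 1 + 4, as reported: ERIC face-to-face, ERIC remote,
# Google Scholar remote, and the journal-website search.
before_final_search = eric_face_to_face + 1 + 1 + 4
print(before_final_search)            # 22 articles before citation searching

final_total = 21 + 6                  # face-to-face + remote after the final search
print(final_total)                    # 27 articles analysed in the review
```

The final total of 27 matches the number of directly relevant articles stated in the abstract.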

Interactions in face-to-face science laboratories

Research on interactions in face-to-face science laboratories appeared earlier than its counterparts in simulated and remote laboratories and continues to broaden into studies with many new perspectives. Observers have created, and continue to update, a wide variety of observation instruments to capture both instructors’ and learners’ interactions. Techniques used to identify and record these interactions have evolved from paper-and-pencil and audiotape recordings to more advanced computer-assisted means. Such tools have been used to investigate the connection between learners’ behaviours and their learning outcomes. Researchers have thus analysed the content, nature, and functions of interactions, to identify not just their frequency but also the influence of that frequency on learning. Consequently, quantitative, qualitative and mixed methods of data collection and analysis have all been used. In brief, researchers have described what the interactions were and how they worked. Below, the tools developed to this end and the functions of the interactions that were measured are reviewed.

Variations of measurement tools in face-to-face science laboratories

As illustrated by the following authors, the design of observation tools was motivated by the desire to assess learning or instructional effects in reformed curricula, such as inquiry-based courses (Hilosky, Sutman, & Schmuckler, 1998; Sadler, Puig, & Trutschel, 2011; Stang & Roll, 2014; West, Paul, Webb, & Potter, 2013). These authors chose observation as the main method of data collection because it could provide direct and reliable data with which to record and describe students’ and instructors’ behaviours (Lund et al., 2015). Most of these observations provided information about the frequency of interactions and whether or not the interactions were verbal and/or non-verbal. Consequently, the results were quantitative rather than qualitative. A summary of these studies is presented in Table 1.
Table 1

The name and category of observation tools, types of interaction and main findings in face-to-face science undergraduate laboratories

| Observational tools included | AAAS category | Interactions | Main findings | References |
| --- | --- | --- | --- | --- |
| Science Laboratory Interaction Categories (SLIC) - Student | Segmented | Verbal and non-verbal | Most time was spent on transferring information | (Kyle et al., 1979) |
| N/A | N/A | Verbal | Most of the laboratory interactions were about laboratory procedures | (Lehman, 1990) |
| A Modified-Revised version of the Science Teacher Behaviour Inventory (MR-STBI) | Segmented | Verbal and non-verbal | Instructor behaviours differ between U.S. and German institutions of higher education | (Hilosky et al., 1998) |
| Modified from Science Laboratory Interaction Categories (SLIC) | Segmented | Verbal and non-verbal | Instructor behaviours varied across the six science disciplines | (Ajaja, 2013) |
| Computerised Real-time Instructor Observation Tool (RIOT) | Continuous | Verbal and non-verbal | S-I interaction varied in both small-group and whole-class observations | (West et al., 2013) |
| Teaching Assistant Inquiry Observation Protocol (TA-IOP) | Holistic | Verbal and non-verbal | Peer reflection can help TAs’ teaching in inquiry laboratories | (Miller et al., 2014) |
| TA Observation Form (TA behaviours); On-off task form (student engagement) | Segmented | Verbal | S-I interactions could possibly predict student engagement | (Stang & Roll, 2014) |
| Laboratory Observation Protocol for the Undergraduate STEM (LOPUS) | Segmented | Verbal and non-verbal | Students’ behaviours were independent of the instructor’s style; the nature of interactions is related to laboratory activities | (Velasco et al., 2016) |

All the observation instruments in the aforementioned papers/studies can be allocated to three categories described by the American Association for the Advancement of Science (2013): holistic, segmented or continuous. With holistic instruments, the observer documents the instructors’ and learners’ behaviours altogether and rates them according to pre-defined criteria. With the segmented instrument, the observer records the instructors’ and students’ performance at short intervals, normally every few minutes. The only continuous protocol, developed by West et al. (2013), requires the whole observation to be conducted without any break.

Some earlier studies focused on the frequency of student interactions in science laboratories. For instance, Kyle, Penick, and Shymansky (1979) compared the frequency of a wide range of interactions in both introductory and advanced laboratories in five college science disciplines. The occurrence of each student’s interactions was identified using the Science Laboratory Interaction Categories (SLIC), which listed ten main student activities. The authors further compared the frequency of interactions according to the level of the laboratory (introductory or advanced) and the five science subjects. Their study focused on undergraduate behaviours in science laboratories and the results mostly comprised a description of the frequency of interactions. The research showed that students spent a large amount of time reading and writing, more than they spent experimenting. The students also listened a great deal, both to instructors and to their peers, and the authors concluded that these listening activities were more about information transmission than about question-answering or problem-solving. No significant differences within or among laboratory levels and science disciplines were found for the ten detailed interactions, except in the amount of time spent on reading and writing.

In addition to this type of whole-class observation, which documented both verbal and non-verbal interactions, some researchers were more interested in the frequency of verbal interactions. For example, Lehman (1990) divided verbal interactions into 13 types: five Student-Student interactions, four student-initiated Student-Instructor interactions, and four instructor-initiated Student-Instructor interactions, and documented the length of students’ verbal interactions according to these categories. The findings showed that students interacted extensively with their classmates, referred to laboratory equipment, or read laboratory manuals. The style of the laboratory had an impact on S-S interactions: more time was spent on procedures in unfamiliar laboratories, whereas more time was spent on data collection in familiar ones. By contrast, only 11% of the total interactions were Student-Instructor interactions. It should be noted that although Lehman did not classify his observation instrument according to the American Association for the Advancement of Science categories, it can be inferred to belong to the structured-observation category.

The two studies discussed above (Kyle et al., 1979; Lehman, 1990) were both concerned with the frequency of interactions from the students’ perspective. In other words, the observers concentrated on how student behaviours influenced their learning; the instructor’s behaviours were captured only when he/she was interacting with the students. By contrast, in many other studies, the researchers focused on how to improve instructional capacity by designing the observations to document the teachers’ interactions with their students (Cohen & Ball, 1999). One example is the study by Hilosky et al. (1998), where the observations were based on the Modified-Revised Science Teacher Behaviour Inventory (MR-STBI) in both general and introductory chemistry laboratories, in one German and 16 American higher education institutions. A follow-up interview with each instructor clarified why some interactions were more or less important. Combining the MR-STBI results and the interviews showed that the design of the laboratories did not emphasise higher-order thinking. In addition, using a modified form of SLIC, Ajaja (2013) observed 48 instructors’ behaviours in six areas of science laboratories. The findings implied that SLIC could be a reliable instrument for collecting information about instructors’ behaviours in science undergraduate laboratories. This information could be used as further evidence for instructors to self-assess their behaviours or for institutions to prepare instructor training.

The relationship between interactions and student engagement has been the focus of many researchers’ interests. In two studies (Sadler et al., 2011; Stang & Roll, 2014), student engagement was considered an important factor in describing or correlating with interactions. Sadler et al. (2011) developed a tool, the Laboratory Instructional Practices Inventory (LIPI), to assess laboratory instruction in transformative courses. It covered two main ideas: levels of student engagement and the content of student discourse. This observation tool was designed for instructors or laboratory coordinators to improve the quality of laboratory learning according to students’ perceived learning processes. Another group (Stang & Roll, 2014) included student engagement level as a research element in an attempt to find the relationship between teaching style, student engagement, and student learning. The Teaching Assistant (TA) observation form was designed to record three main instructor interactions, and a modified on-off task form from Ocumpaugh (2015) was used to record whether the students were engaged. In addition, pre- and post-lab multiple-choice questions were used to compare learning achievements. Results showed that only the frequency of teacher-initiated S-I interactions positively influenced engagement, while the rate of student-initiated S-I interactions and the length of interactions had no effect. On the other hand, students’ engagement and their learning were mutually influential. In summary, the authors suggested that instructors should interact with their students more actively and frequently to increase student engagement and thereby improve learning outcomes.

Driven by technological breakthroughs and their applications in education, the means of collecting and analysing data on laboratory interactions have evolved from microphones and audio tape to more computerised or online tools. West et al. (2013) introduced a computerised Real-time Instructor Observation Tool (RIOT) to analyse S-I verbal interactions; in this research, data analysis, especially quantitative analysis, was more efficient with the computer-based approach. The Internet also improved the efficiency of communication and information exchange between institutions. Based on the theory developed by Cohen and Ball (1999) that interactions between students, instructor, and materials each influence instructional capacity, Velasco et al. (2016) developed an observation instrument, the Laboratory Observation Protocol for the Undergraduate STEM (LOPUS), to investigate TAs’ instructional practices. It was found that students’ behaviours were independent of TAs’ instructional characteristics, and that TAs initiated fewer verbal conversations than did the students. It was also reported that TAs behaved similarly in all laboratories, whereas the nature of S-I verbal interactions varied according to the type of laboratory.

In contrast to the studies described above, Miller, Brickman, and Oliver (2014) developed a peer assessment tool for one TA to assess the other TAs’ teaching using the Teaching Assistant Inquiry Observation Protocol (TA-IOP). Both novice and experienced TAs observed other instructors’ laboratories in different periods of the whole semester. The new TAs learned a great deal from their peers and made changes in their own teaching. The researchers also recorded the frequency of student and instructor interactions using this holistic observation instrument.

Although none of the above observation tools could capture all of the instructors’ or learners’ interactions, they can give researchers and instructors a general view of students’ and instructors’ interactions in the laboratory classroom, helping them understand and anticipate what does and what may happen in a science laboratory. Some studies presented correlations between student engagement and the frequency of, and time spent on, different kinds of tasks (Sadler et al., 2011; Stang & Roll, 2014), while others compared inquiry learning and instructor behaviours (Hilosky et al., 1998). However, all of this information was too general to describe the link between interactions among participants and their learning outcomes. A detailed analysis of the various interactions, with further differentiation and their educational influences, is needed to answer this question. This gives rise to the second aspect of interaction studies, the nature and function of interactions, which is linked to focus question 2.

Findings of interactions in face-to-face science laboratories

Studies of interactions are not just about their frequencies but also focus on the analysis of the nature and functions of the interactions. Compared with the previous type of study, which was concerned with describing the interactions that happened in the classroom, the second type emphasised the relationship between interactions and learning outcomes. Accordingly, some researchers have tried to correlate survey results with students’ learning achievements, while others examined the way learning environments influenced learning, mainly by investigating the content of conversations. Although observation was still one of the main methods used to examine the functions of interactions, the researchers in these studies did more than record the frequency of learners’ and instructors’ interactions. A summary of these studies is presented in Table 2.
Table 2

Summary of tools/methods, types of interactions and the major findings of interactions in face-to-face science undergraduate laboratories

| Tools/methods | Interactions | Main findings | References |
| --- | --- | --- | --- |
| Practical Tests Assessment Inventory (PTAI), adopted from Tamir, Nussinovitz, and Friedler (1982) | Competitive, cooperative, and individualised S-S interactions | Competitive interactions proved more potent than cooperative and individualised approaches | (Okebukola, 1984) |
| Questionnaire on Teacher Interaction (QTI) | S-I interactions | Teacher behaviours are more strongly related to student learning outcomes than curriculum content | (Wubbels, 1993) |
| Bloome’s multiple levels of interactions | Verbal and non-verbal | This naturalistic study shows that instructor-student interactions exhibit different features at different stages of the laboratory | (Roychoudhury & Roth, 1996) |
| Self-developed method through constant comparison | Verbal | Students interact less in inquiry laboratories than in non-inquiry approaches | (Krystyniak & Heikkinen, 2007) |
| Ethnographic and mixed-method comparison of verbal discourse | Verbal and non-verbal | A comparison between the three groups indicated that a friendly and relatively critical group atmosphere was required to effectively develop conceptual understanding | (Oliveira & Sadler, 2008) |
| Interviews, observations, and video documentation | Verbal and non-verbal | Instructor-student interaction was helpful in guiding students’ activities | (Högström et al., 2010) |
| Sociocultural discourse analysis | Verbal and non-verbal | Students favoured proposing ideas over asking questions in higher-level inquiry laboratories | (Xu & Talanquer, 2013) |
| Tuckman’s stage model | Verbal and non-verbal | Instructors can use various methods to recognise students’ behaviour and foster their peer interactions | (Gresser, 2006) |
| The constant comparison method | S-I verbal | S-I verbal interactions were influenced by the laboratory content | (Flaherty et al., 2017) |
| Eighteen-category items of teacher/student interactions, selected and modified after Ogunniyi and Ramorogo (1994) | S-I verbal and non-verbal | In human-machine interaction in computer-based instruction learning environments, learners need to reallocate cognitive gains and effort and examine possible sources of error | (Kiboss, 1997) |
| A modified version of the Fennema-Sherman Mathematics Attitude Scale (FSMAS) and a questionnaire | Student attitude | TAs influence the students’ attitudes to the biology laboratories to a great extent | (Rybczynski & Schussler, 2013) |
| Student and teaching assistant questionnaires | S-I interactions | Keeping the same TA over different sessions gave higher learning outcomes for students than the expert-TA model | (Good et al., 2015) |
| A pre- and post-lab survey | S-S, S-I, S-E and I-I interactions | Students with different academic values had different opinions of interactions | (Wei et al., 2018) |

A widely-used multiple-choice questionnaire - the Questionnaire on Teacher Interaction (QTI) (Wubbels, 1993) - was designed to identify the interpersonal relationships of the learners and teachers. The teacher behaviour was divided into four dimensions: dominance, cooperation, submission, and opposition, which were further categorised into eight equal sessions to resemble teacher behaviours. The results showed a close relationship between teacher interpersonal behaviour and students’ learning outcomes: the positive relationships were leadership, friendly and understanding; while the negative relationships were uncertain, dissatisfied and admonishing. Contradictory results for student achievement and attitudes in sectors Dominance-Opposition and Submission-Cooperation meant that there seemed to be a conflict for the teachers in being strict and at the same time giving students responsibility. Similarly, the connection between the curriculum and students’ learning outcomes were not strong. Another research direction provided by Rybczynski and Schussler (2013) used a modified version of the Fennema-Sherman Mathematics Attitude Scale (FSMAS) (Fennema & Sherman, 1976) and an online questionnaire including mainly open-ended questions to assess the students’ attitudes to undergraduate biology laboratory classes. Their findings showed that interactions with the TAs played important roles in the students’ attitudes to the laboratories both positively and negatively. It was also concluded that in the biology laboratories, the TAs had more important influences than the laboratory content on the students’ attitudes. Another example was made by (Wei et al., 2018), in which a self-designed pre-lab survey was conducted to collect information about the importance of interactions before the commencement of the laboratory, and a post-lab survey was used to measure the frequency of interactions after the completion of the laboratory activity. 
The survey results were validated against on-site observations. The research showed that students with different laboratory marks had different expectations of the importance of interactions before the laboratory. In addition, students with lower marks placed a high reliance on passive interactions, such as those with the laboratory manual and the internet.

Other researchers have used qualitative content analysis, instead of multiple-choice instruments, to describe and inspect the nature and functions of interactions. In one example of this genre of research, using a naturalistic method in an inquiry-based laboratory, Roychoudhury and Roth (1996) investigated how collaborative work influenced students’ learning experiences in the science laboratory, and its consequent effects on learning, in three representative groups. The authors videotaped each group’s activities, transcribed the recordings into analysable data, and then interpreted the interactions among group members and the conversations between groups and teachers according to Bloome’s cultural framework of multiple levels of interactions (Bloome, 1989). Bloome’s theory about status differences within groups was affirmed. Three types of interactions within groups were identified: (1) symmetric interactions, in which the roles of group members were equal, no one dominated the discourse for a prolonged time, and members shifted their duties naturally; (2) asymmetric interactions, in which only one member discussed with the teacher and other group members asked questions only occasionally, which happened more often during conceptualisation; and (3) shifting asymmetric interactions, which had components of both symmetric and asymmetric interactions, with the dominant role shifting from one student to another over a notable time period. However, although interacting more frequently with others indicated higher levels of participation for some students, the research did not definitively show that these students had higher levels of academic achievement. In the data collection stage of the laboratory, all group members were equally involved.
For the S-I interactions, there were two types at different stages of the laboratories: discussion about conceptualisation in the planning and data interpretation stages, and an advisory role of the teacher in the data collection stage. Overall, from the observations of overt participation in the discourse, no relationship between interactional patterns and the academic achievement of group members was identified. However, the researchers proposed that the teacher’s intervention had an impact on students’ learning and that teachers should promote passive members’ involvement.

In another study, Krystyniak and Heikkinen (2007) differentiated categories of interactions based on the constant comparison method (Glaser, 1965) and coded the transcribed verbal interactions of each group, in both Independent Chemistry Project (ICP) and non-ICP chemistry laboratories. The comparison showed that the ICP could help students focus more on concept development and less on procedural steps. Högström, Ottander, and Benckert (2010) connected S-I and S-S interactions with learning experiences using an explanatory method that analysed the verbal and non-verbal behaviours of the students and the teacher. Three components - safety and risks, procedures and equipment, and chemical concepts - were the main forms of S-I interactions, and the first two topics were prominent in S-S interactions. The authors also pointed out that S-I interactions were useful in promoting learners’ thinking and acting. Flaherty, O’Dwyer, Mannix-McNamara, and Leahy (2017) also employed the constant comparison method, to evaluate the impact of the Teaching as a Chemistry Laboratory Graduate Teaching Assistant (TCL-GTA) program on S-I psychomotor and cognitive verbal interactions. After coding and comparing the S-I discourses in each of the three stages of the chemistry laboratories, the authors found that implementation of the TCL-GTA program had increased the frequency of both psychomotor and cognitive verbal interactions; that is, the level of concept-related conversation also developed over the course of the program.

Besides qualitative methods, mixed-method studies were also common in categorising the functions of interactions. As an example, Xu and Talanquer (2013) collected data using the method of running records (Poulsen et al., 1995), whereby the researchers closely recorded the behaviours of one group in each laboratory while trying not to interfere in its activities. It was found that in laboratories with higher levels of inquiry, episodes of proposing questions, exploratory approaches, and domination of the process by a few students became more common. The authors focused on analysing the characteristics of group verbal and non-verbal interactions, using a sociocultural discourse analysis method (Kumpulainen & Mutanen, 1999) to describe the effect of the level of inquiry on interactions across three categories: language function, cognitive processing, and social processing. Results showed that the level of inquiry in the science laboratory could be one factor, but not a determining factor, in learners’ higher levels of cognitive processing.

Oliveira and Sadler (2008) examined collaborative learning in three representative groups using a combination of ethnographic and mixed-method discourse analysis. According to their description of elaboration within groups and of dealing with conflicts, as well as the conversation analysis, the three groups showed different characteristics even though the instructional direction was almost the same. In Group 1, most peer interactions took the form of non-elaborated conceptions, while unsolicited self-elaboration occurred only sporadically, so the whole process tended towards vagueness. In Group 2, members interacted with gradual elaboration, which led to well-articulated conceptual constructs. In Group 3, there was a confrontational atmosphere and minimal convergence. The study also offered suggestions on how to combine social culture and social cognition to promote science learning. The results from Oliveira and Sadler (2008) illustrated that although collaborative actions were efficient in developing concepts, factors such as students’ preferences, group culture and the timeliness of instruction from teachers might hinder these results. By contrast, Gresser (2006) was concerned more with the relationship between a group’s shared epistemological framework and constructive work, as well as the social patterns within group activity. Across several observed case studies, groups with a good shared understanding of the laboratory activity achieved more efficient results, whereas groups with a less common understanding of the goal were directed by a dominant member, who acted more like an individual doing the work alone, without cooperation.

In his study, Blickenstaff (2010) proposed a framework for analysing physics education in which the programme, the teachers, the students and the collective working units (groups) were the elements of the classroom. Using this framework as guidance, the author observed physics laboratory classes and interviewed focus students. There were no pre-designed observation protocols; instead, overall observations of the whole class were combined with observations of individual groups. The semi-structured interviews were designed to check the results from the observations. The findings emphasised the importance of students’ conversations and interactions within their groups for learning. The research methods also provided rich information about the learning process in physics laboratory classrooms from the perspective of interactions.

Beyond analysis of the content of interactions, other findings have linked interactions and learning from different viewpoints. One example is Good, Colthorpe, Zimbardi, and Kafer (2015), who compared different TA-to-student models. In one model, the students worked with the same TA for the whole term (the consistent model); in the other, the students did not stay with the same TA (the expert TA model). Students assigned consistent teaching assistants performed better than those in the expert TA model. The TAs also explained that the consistent format helped them to create a strong relationship with the students, which positively influenced the students’ learning experiences.

Overall, in face-to-face laboratories, the main forms of interactions of interest to researchers have been verbal rather than non-verbal interactions. However, compared with the critical review of interactions in the science classroom by Power (1977), current studies have changed little. The two approaches identified by Power - using observation as the main data collection method, and studying the influences of the interactions - remain prominent today.

Interactions in remote science laboratories

The appearance of remote laboratories was closely related to technological innovation, as they would not be possible without the support of advanced technology. However, although technology underpins the development of remote laboratories, it is only the interface, not the ultimate goal. It is the interaction and engagement between the learner and the technology, not the technology itself, that determines the learning outcomes (DiSessa, 2001). The notion that remote laboratories exist to improve learning, or to supplement traditional laboratories, is central to their development and use. The interactions that occur in remote laboratories have, on the one hand, been transformed by technological improvement; on the other hand, this transformation aims to serve the goal of improving learning, especially students’ conceptual understanding.

In remote laboratories there is a physical and psychological separation between learners and equipment; this separation, together with learners’ lack of experience, can decrease satisfaction levels, so technology-supported learning interfaces are needed to reduce it (Lindsay, Naidu, & Good, 2007). For example, in one study, although most students believed that they had conducted a real laboratory and that the data were authentic, they still preferred a hands-on laboratory, and they reported not feeling personally engaged in the laboratory because of the separation from the apparatus (Lowe, Newcombe, & Stumpers, 2012).

A ‘real’ remote laboratory requires the learners to manipulate the equipment individually, with no instructors or on-site peers present. Learners can only receive instant help from embedded help features or online search engines; information from sources such as online discussion groups, or feedback from instructors, is sometimes available but not simultaneous. These differences mean that support from technology is indispensable, and some studies therefore focus on improving technology to enrich students’ learning experiences and outcomes. Other researchers are interested in whether to use a ‘false’ remote mode, that is, to retain some elements such as online help from instructors, or group work during the laboratory process instead of individual work. Although some researchers believe that fewer interactions are required in remote laboratories (Scheucher, Bayley, Gütl, & Harward, 2009), the present review examines whether the change in interactions influences students’ learning.

Studies of remote laboratories fall into three main types: (i) introductions to hardware or software developed for remote laboratories, (ii) comparisons of learning outcomes in the two laboratory modes (face-to-face and remote), and (iii) the creation of effective collaboration and a communicative learning environment. Although none of the three categories directly describes the influences of interactions on learning, or even uses the term ‘interaction’, analyses were made of the three elements of a learning environment - student, instructor, and material - and their connections with learning. Accordingly, the interactions implicitly covered in the literature were divided into three categories, Student-Student, Student-Instructor, and Student-Equipment, to explore their influences on students’ learning. Thus, Table 3 summarises current knowledge of how the interactions between a student and the other elements - instructors, other learners, laboratory materials such as the laboratory manual or equipment, and technology - have changed in remote laboratories and how these changes have influenced learning. The interactions are then described in the following subsections.
Table 3

Interactions and major findings in remote science undergraduate laboratories

| Subject | Type of Interactions | Major Findings | References |
|---|---|---|---|
| Physics | S-S, S-I interaction via video and text chat | A combination of the simulated and remote laboratory provided richer collaboration between learners, as well as between learners and instructors. | (Scheucher et al., 2009) |
| Science and engineering | S-S, S-I, S-E interactions | The frequency of S-S and S-I interactions decreased. | (Cooper & Ferreira, 2009) |
| N/A | S-S and S-I interactions | The Collaborative Support system increased student engagement and the number of completed assignments. | (Luis de la Torre et al., 2013) |
| Science | S-I interactions | Students were more engaged in remote laboratories when watching the real live video. | (Sauter et al., 2013) |
| Chemical Engineering | S-S, S-I and S-E interactions | The Cooperative Weblab increased student engagement levels. | (Le Roux et al., 2009) |
| Chemical Engineering | S-S interactions | Students preferred three members in a group. | (Botero et al., 2016) |

Student-student interactions in remote laboratories

Student-Student interactions in remote laboratories occur between one student and other students, whether working in groups or individually, with or without the presence of an instructor (Moore, 1989). These interactions may be synchronous or asynchronous, in the form of e-mails, blackboard communication, or web ‘chat’.

Böhne, Faltin, and Wagner (2007) compared remote laboratories by allocating student pairs to two settings: in one, the two students were in different rooms and could only contact each other online; in the other, the students were in the same room and could see and talk to each other. A tele-tutor was present throughout the laboratory process for each kind of group. Results showed that when all the influencing factors were combined, only initial knowledge was strongly correlated with task success, while group setting was not. However, if initial knowledge was removed from the factors, group setting correlated significantly with task effectiveness. Another group’s findings, from a student questionnaire, revealed that students favoured working in groups of three in remote laboratories (Botero et al., 2016).

Other research has focused on creating online platforms to increase cooperation between learners. For example, in developing the Cooperative Weblab in chemical engineering, Le Roux et al. (2009) showed that the platform increased teamwork skills and promoted higher-level interactions among learners, through working with unfamiliar colleagues and through the embedded open-ended questions. Another example (de la Torre et al., 2013) introduced a collaborative approach in which learners or instructors can invite other people into virtual and remote laboratory sessions to interact with each other and share the learning process simultaneously. With this collaborative approach, students showed an increased level of engagement, and the number of completed laboratory assignments also improved.

Interestingly, one study (Corter et al., 2011) found that in face-to-face laboratories the group data-collection mode was more strongly associated with learning outcomes, while in the remote laboratory mode the individual data-collection condition had more advantages than the group form. This finding may imply that S-S interactions do not play as important a role in the learning process in remote laboratories as they do in face-to-face ones. Although this study was conducted in the discipline of engineering, its findings can provide some direction for studies of remote science laboratories.

Student-instructor interactions in remote laboratories

It is widely thought that instructors play important roles in face-to-face chemistry laboratories (Herrington & Nakhleh, 2003) and that their behaviours have positive impacts on students’ learning outcomes (Stang & Roll, 2014). In contrast to the S-I interactions in physical laboratories, there are sometimes no instructors when students conduct remote laboratories, because it is hard to provide 24/7 support. Even when instructors were involved, they were not physically together with the students. Studies of remote-access laboratories in which there were supervisors or teaching assistants in the room, or tele-tutorial support over the internet (Böhne, Rütters, & Wagner, 2004), may provide hints as to whether the absence of instructors influences students’ learning. Böhne et al. (2004) used synchronous tele-tutorial methods such as desktop sharing and video or audio conversation between instructor and learners in a remote setting. They also designed two forms of directed learning: instructor-directed and self-directed. In instructor-directed learning, the students received a large amount of help from the instructor and had to report their progress to the instructor at the end of each task. In self-directed learning, the students received only hints from the instructor about where to find the information. In summary, they found that a human tutor, available online rather than physically present, was still necessary to tackle specific problems in remote laboratories. These instructors did not participate a great deal in the learning process but provided help only when needed. The form of online help could be audio chat and desktop sharing, whereas social cues such as gestures or facial expressions could not be used in this process.

Cooper and Ferreira (2009) introduced a framework to evaluate the pedagogical effectiveness of remote laboratories. The results showed that S-S and S-I interactions decreased after the implementation of remote laboratories. One possible explanation was that learners might feel disappointed that they did not have unlimited access to the instructor, even though they had unlimited access to the remote equipment.

Student-equipment interactions in remote laboratories

Even though some researchers proposed that an interface or equipment “merely acts as a confounding intermediary” between learner, instructor, and content (Hillman, Willis, & Gunawardena, 1994), others argued that the machine or monitor was an important communicating medium for learners, which could be helpful in “the mentoring, instruction, tutoring and assessment of students” (Henry, 2000). Remote laboratories have some unique functional features that face-to-face laboratories cannot provide (Cooper & Ferreira, 2009); for example, students can work from a library of instrument panels, receive an abundance of information instantly from the help function, or receive instant feedback from embedded formative assessment. The influences of the computerised learning environment should therefore be analysed.

Researchers have studied the impact of S-E interactions in remote laboratories on learning. Sauter, Uttal, Rapp, Downing, and Jona (2013) made a cross-comparison of four laboratories based on two factors: the lab type (remote or simulation) and the representation (photo or video). Remote users appreciated that computer control reduced human error. Users who watched live video of the data-collection process felt more engaged and were better at explaining some content knowledge than those who viewed only a picture. This illustrates that the design of the interface connecting the learner and the equipment heavily influences the learners’ experience and learning outcomes.

Overall evaluation of remote-laboratory interactions

Overall, in remote laboratories there may be fewer interpersonal interactions than in face-to-face science laboratories, depending on the situation. For example, in some remote laboratories the learners work in groups, with or without an instructor, while in others learners conduct the experiment independently without any pre-designed interpersonal information exchange. It is thought that the decrease in interpersonal interactions may be compensated for by S-E interactions provided by advanced technologies. However, current research is not rich enough to validate this assertion.

Researchers working in the area of Computer Supported Inquiry Learning (CoSIL) have made valuable contributions to the development of tools for guiding the online inquiry laboratory process (De Jong, 2006; Quintana et al., 2004). In their review paper, Zacharia et al. (2015) provided evidence of the relationship between suitable guidance in the inquiry process and student learning in a CoSIL environment, and recommended personalised guidance. Although studies of CoSIL are not directly within the scope of this review, there are overlaps, and their findings can potentially direct the design and study of interactions in remote laboratories. Detailed information can be found in the aforementioned papers.

Theoretical frameworks for interactions in face-to-face and remote laboratories

Investigations of interactions in science-laboratory classrooms over the past decades have provided valuable information about laboratory learning. However, their usefulness is sometimes limited by the lack of an operationalised learning theory in the specific context of a science laboratory. Theories that researchers have proposed for the analysis of interactions in science laboratories are discussed in this section.

Roychoudhury and Roth (1996) applied the cultural framework of Bloome (1989) and Bowers and Flinders (1990), ascertaining that classroom culture sheds light on knowledge construction in science laboratories. The cultural framework proposes that each classroom has a unique culture that develops over time and influences the interactions among those involved. Based on the work of Roychoudhury and Roth (1996), the cultural framework appears appropriate for analysing the nature of interactions within individual groups and between groups and instructors.

Kumpulainen and Mutanen (1999) presented a sociocultural analytical framework with three dimensions: functional analysis, cognitive processing, and social processing. This analytical discourse theory also emphasised that culture plays important roles in knowledge construction, and it attempted to uncover patterns of peer-group interaction. The framework has been widely used in collaborative learning research (Mercer, Littleton, & Wegerif, 2004; Xu & Talanquer, 2013) in different learning contexts.

Similar to the previous frameworks, which emphasise the importance of the environment for learning, Komorek and Kattmann (2008) developed the Model of Educational Reconstruction (MER), referring to the interconnection of science content, students’ perspectives and the learning environment. Wei et al. (2018) applied this model to a study of interactions in science laboratories. The key finding was that students’ perceptions, course content, and learning environments interrelate and have mutual influences.

The three models above were implemented in face-to-face science laboratories. For remote laboratories, only one example was found that attempted to involve theory, and this was for engineering laboratories. Tirado-Morueta, Sánchez-Herrera, Márquez-Sánchez, Mejías-Borrero, and Andujar-Márquez (2018) adopted Kolb’s (1984) theory of experiential learning to assess learning and the design of remote laboratories. They illustrated that learning happens through interactions between students and other people or the equipment, and that during these interactions learning occurs in the four phases of Kolb’s theory. The group then implemented Kolb’s theory in the design of practical remote laboratories and assessed students’ perspectives. However, because this study comes from engineering laboratories, even though it could very likely be implemented in remote science laboratories, more evidence is needed to demonstrate its generality.

Theories developed for conventional laboratories could be applied or adapted to guide studies in remote laboratories. Distributed cognition (Nakhleh et al., 2002) is recommended as one possibility, for the following reasons. Technologies play important roles in the exchange of information in remote laboratories; students’ interactions with mediators such as other learners, instructors, equipment and computers therefore have significant influences on their learning process and learning outcomes. Distributed cognition theory arises from the view that knowledge is developed through interactions with environments rather than being rooted in individuals (Cole & Engeström, 1993). Cole and Engeström added the new element of time to show the dynamics of interactions between learners and technology, building on the mediational triangle presented by Vygotsky (1978). In this view, technology is not a passive mediator that merely transmits information, but influences the learner and the foundation of knowledge. The application of this idea was reinforced by Garcia (2002), who implemented several simultaneous interactions using interface technology in remote laboratories. By contrast, the model of Cole and Engeström did not permit simultaneous interactions, because it applied only to one student dealing with one tool and one curriculum, and therefore did not exploit the different opportunities offered by the remote-laboratory mode.

To sum up, there is limited literature on the implementation of theories in studies of interactions, especially in remote laboratories. However, it is necessary to apply relevant theories to such studies. Theoretical frameworks connect the researcher to existing knowledge and guide researchers not just to describe observations but to explain them and to identify the key variables influencing a phenomenon. Tirado-Morueta et al. (2018) have effectively shown that existing theories can be incorporated into studies of remote laboratories in engineering. It is therefore recommended that, in remote science laboratories, theories be incorporated into studies of interactions.

Discussion and future directions

The goal of this review was to understand previous findings on interactions in both face-to-face and remote laboratories. Although studies of interactions have different characteristics and emphases, each can contribute to an understanding of student learning in the laboratory. Three aspects - measurement methods, main findings, and theoretical frameworks - were the focus of the article selection and grouping, based on the two types of laboratories. Even though the findings are divided into two main categories by laboratory type, the emphasis of this review is not on comparing studies of interactions in face-to-face and remote laboratories, but on combining the findings and using the current state of the art to direct future research on student interactions in laboratory learning. The boundary between face-to-face and remote laboratories has become less significant in recent years, with more face-to-face science laboratories being mediated by computers (Ma & Nickerson, 2006). It is therefore less important to focus on the differences between the two types of laboratories than on what each can learn from the other’s studies. The following discussion provides suggestions for future work by summarising existing studies.

In studies of face-to-face laboratories, observation tools were used to collect data about students’ and/or instructors’ behaviours. Initially, data obtained from the observation instruments were analysed only to capture behaviours (Kyle et al., 1979; Lehman, 1990); in later years, results from the instruments were connected with other data to understand the impacts of interactions (Hilosky et al., 1998) or with investigations of student engagement (Sadler et al., 2011; Stang & Roll, 2014). The studies were also improved by the integration of more theories as guidance (Velasco et al., 2016).

As to the nature and function of interactions, although no studies reported that the frequency of interactions affected students’ learning outcomes, it was shown that interactions, especially those with instructors, influenced students’ attitudes to science and their engagement levels in laboratory activities. Even though some of the S-I interactions had both positive and negative effects, instructors were encouraged to promote passive members’ involvement in the learning process. In addition, interactions can reflect student behaviours in specific learning environments and can serve as indicators of the laboratory content, since interactions change with the laboratory context. This is demonstrated by studies which assessed the effectiveness of curriculum reform from the viewpoint of interactions.

In face-to-face laboratories, Student-Equipment Interactions and Indirect Interactions have largely been neglected compared with the two types of interpersonal interactions (Student-Student and Student-Instructor). Except for one article (Wei et al., 2018), no studies in science laboratories were found that included these types of interactions. Curriculum designers and instructors should be aware of the importance of these two types of interactions (S-E and I-I).

There are fewer studies of interactions in remote laboratories than in their face-to-face counterparts, and there is a significant lack of systematic observation instruments and associated findings on interactions in remote laboratories. However, since the learning process is reflected in interactions, it is necessary to investigate them. The methods already used to study interactions in face-to-face laboratories may be applied to remote laboratories. For example, in some remote laboratories, systematic observation instruments could be implemented to analyse learners’ interactions where possible. Furthermore, detailed analysis of groups operating remote laboratories can be conducted by referring to the methods and theories previously used for face-to-face laboratories. In one study, an attempt was made to analyse the factors of group conflict and cultural diversity within virtual teams (Paul & Ray, 2013); although that study did not directly involve students performing remote laboratories, it could guide studies of remote laboratories. Another consideration is that face-to-face and remote laboratories can serve different learning objectives: in face-to-face laboratories, the emphasis has been on developing design skills, while remote laboratories have been effective in teaching concepts (Ma & Nickerson, 2006). Therefore, in analysing the characteristics and functions of interactions in science laboratories, one should also consider the learning objectives and laboratory types, not just how interactions change with the technology. The classic paper by Nickerson et al. (2007) provided a systematic method for evaluating the effectiveness of remote laboratories.
It is also suggested that researchers of remote laboratories refer to studies in distance education, because the two fields share two characteristics: a physical separation between learners and other people, and the use of technology to reduce this separation. Because of the physical distance between learner and equipment in remote laboratories, some students felt that they were not intimately involved in the laboratory process (Lowe et al., 2012). This problem may be overcome by adopting ideas from simulated laboratories and increasing learners' feelings of presence with the help of advanced technologies (Orduña et al., 2015). However, this is not easy to achieve, especially in science laboratories, as it requires remote science-laboratory designers to also have high-level computer skills. Thus, institutions or people from different areas with complementary expertise should work together to solve this problem; indeed, there have been multiple successful examples of cross-institutional collaboration in the area of remote laboratories (Orduña, Almeida, López-De-Ipiña, & Garcia-Zubia, 2014).

In remote laboratories, Student-Equipment is the most prominent type of interaction. The interaction between student and interface can be likened to the distinction between technicians and professionals: the technician does what he or she was trained to do, while the professional constantly considers what is being done, why, and why in this way rather than another (Biggs, 2011). Because remote laboratories provide students with more freedom and time, students can sometimes be independent thinkers and learners; they can also control their pace of learning and think at a more complex level. In this self-monitored process, human-computer interactions can be defining factors in learning. Therefore, studies of human-computer interaction may inform the understanding of interactions in remote laboratories (Volpentesta, 2015). One example is the finding that learning aids were used more frequently when the on-screen access or prompt for such aids was clearly visible but also non-intrusive (Ruf & Ploetzner, 2014).

Educationalists have their own preferences for data collection and analysis relating to science laboratories. Qualitative methods are believed to be beneficial for gaining deep insights into learning and teaching in the classroom and hence provide opportunities to understand the learning-related behaviours and/or motivations arising from interactions (Cole, Becker, & Stanford, 2014; Gee & Green, 1998). However, qualitative studies have been constrained to limited numbers of participants, and their results are consequently hard to generalise. By contrast, quantitative methods are easier to conduct and yield a broader view. The commonly used process-product paradigm has enriched knowledge of student and instructor performance and of how each relates to individual learning achievement (Fraser, Giddings, & McRobbie, 1992). The best approach may be to combine qualitative and quantitative methods, for example, by collecting data quantitatively and explaining the results qualitatively (Hofstein, Levy Nahum, & Shore, 2001; Tobin & Fraser, 1998). However, the research goals and focus questions are the most important considerations. Whether to use quantitative or qualitative methods is not the central question; what matters is the identification of the problems to be addressed, because the methods are merely the tools used to develop their solution.

Concluding remarks

After the critical review of Power (1977) on interactions in science classrooms, there were few studies on this topic until around 1996 (as shown in the Interactions in face-to-face science laboratories section). Since 1996 there has been increased interest in laboratory interactions, though the general methodological principles used in face-to-face science laboratories have continued in a similar vein. On the one hand, a range of new instruments and surveys has been developed, and direct observations have been conducted, to capture and describe classroom interactions in greater detail than previously and with the aid of advanced technologies. On the other hand, to gain a better understanding of the nature of the verbal and non-verbal actions that take place among and between students, instructors, and equipment, educationalists have analysed how interactions signify students' perceptions of the learning process. As for remote laboratories, approaches to studying interactions in laboratory activities are not well developed, and to date there are no systematic studies of interactions in undergraduate remote science laboratories. The teaching laboratory is a complex environment in which it is hard to record every event and evaluate its importance. However, many studies have shown that even a glimpse of the interactions can yield valuable insights into the learning process and a deep understanding of the obstacles that students encounter. This is clearly an opportunity for future research on interactions in both face-to-face and, more pressingly, remote science laboratories.

Acknowledgements

The authors would like to thank our colleague Senior Lecturer Mihye Won, Curtin University, for invaluable discussions when the first author was a student in her thesis-preparation class.

Authors’ contributions

JW conducted the research project and drafted the manuscript. DFT, ADL and MM made contributions to the conception and design of the study. MGZ and EDL read and critically commented on each draft of the paper as the research progressed, providing important intellectual content. All authors read and approved the final manuscript.

Funding

This research is supported by the Australian Research Council Discovery Grant (DP140104189) entitled: The online future of Science and Engineering Education: The essential elements of laboratory-based learning for remote-access implementation.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

References

  1. Ajaja, P. O. (2013). Coding and analysing behaviour strategies of instructors in university science laboratories to improve science teachers training. International Education Studies, 6(1), 63–73. https://doi.org/10.5539/ies.v6n1p63.
  2. Aktan, B., Bohus, C. A., Crowl, L. A., & Shor, M. H. (1996). Distance learning applied to control engineering laboratories. IEEE Transactions on Education, 39(3), 320–326. https://doi.org/10.1109/13.538754.
  3. American Association for the Advancement of Science (2013). Describing and measuring undergraduate STEM teaching practices. Washington, DC. https://cgsnet.org/describing-and-measuring-undergraduate-stem-teaching-practices. Retrieved 25 Sept 2019.
  4. Biggs, J. B. (2011). Teaching for quality learning at university: What the student does. Maidenhead: McGraw-Hill Education (UK).
  5. Blickenstaff, J. C. (2010). A framework for understanding physics instruction in secondary and college courses. Research Papers in Education, 25(2), 177–200.
  6. Bloome, D. (1989). Locating the learning of reading and writing in classrooms: Beyond deficit, difference, and effectiveness models. In Locating learning: Ethnographic perspectives on classroom research (pp. 87–114).
  7. Böhne, A., Faltin, N., & Wagner, B. (2007). Distributed group work in a remote programming laboratory - a comparative study. International Journal of Engineering Education, 23(1), 162–170.
  8. Böhne, A., Rütters, K., & Wagner, B. (2004). Evaluation of tele-tutorial support in a remote programming laboratory. Paper presented at the 2004 American Society for Engineering Education Annual Conference and Exposition, Salt Lake City, Utah. https://peer.asee.org/13115. Retrieved 25 Sept 2019.
  9. Botero, M. L., Selmer, A., Watson, R., Bansal, M., & Kraft, M. (2016). Cambridge weblabs: A process control system using industrial standard SIMATIC PCS 7. Education for Chemical Engineers, 16, 1–8. https://doi.org/10.1016/j.ece.2016.04.001.
  10. Bowers, C. A., & Flinders, D. (1990). Responsive teaching: An ecological approach to classroom patterns of language, culture, and thought. New York: Teachers College Press.
  11. Bright, C., Lindsay, E., Lowe, D., Murray, S., & Liu, D. (2008). Factors that impact learning outcomes in remote laboratories. Paper presented at Ed-Media 2008: World Conference on Educational Multimedia, Hypermedia and Telecommunications, Vienna, Austria.
  12. Brinson, J. R. (2015). Learning outcome achievement in non-traditional (virtual and remote) versus traditional (hands-on) laboratories: A review of the empirical research. Computers & Education, 87, 218–237. https://doi.org/10.1016/j.compedu.2015.07.003.
  13. Cohen, D. K., & Ball, D. L. (1999). Instruction, capacity, and improvement (CPRE-RR-43). Philadelphia. https://www.cpre.org/sites/default/files/researchreport/783_rr43.pdf. Retrieved 26 Sept 2019.
  14. Cole, M., & Engeström, Y. (1993). A cultural-historical approach to distributed cognition. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 1–46). New York: Cambridge University Press.
  15. Cole, R. S., Becker, N., & Stanford, C. (2014). Discourse analysis as a tool to examine teaching and learning in the classroom. In Tools of chemistry education research (vol. 1166, pp. 61–81). New York: Oxford University Press.
  16. Cooper, M., & Ferreira, J. M. M. (2009). Remote laboratories extending access to science and engineering curricular. IEEE Transactions on Learning Technologies, 2(4), 342–353. https://doi.org/10.1109/tlt.2009.43.
  17. Corter, J. E., Esche, S. K., Chassapis, C., Ma, J., & Nickerson, J. V. (2011). Process and learning outcomes from remotely-operated, simulated, and hands-on student laboratories. Computers & Education, 57(3), 2054–2067. https://doi.org/10.1016/j.compedu.2011.04.009.
  18. De Jong, T. (2006). Technological advances in inquiry learning. Science, 312(5773), 532–533. https://doi.org/10.1126/science.1127750.
  19. De Jong, T., Linn, M. C., & Zacharia, Z. C. (2013). Physical and virtual laboratories in science and engineering education. Science, 340(6130), 305–308. https://doi.org/10.1126/science.1230579.
  20. de la Torre, L., Guinaldo, M., Heradio, R., & Dormido, S. (2015). The ball and beam system: A case study of virtual and remote lab enhancement with Moodle. IEEE Transactions on Industrial Informatics, 11(4), 934–945. https://doi.org/10.1109/tii.2015.2443721.
  21. de la Torre, L., Heradio, R., Jara, C. A., Sanchez, J., Dormido, S., Torres, F., & Candelas, F. A. (2013). Providing collaborative support to virtual and remote laboratories. IEEE Transactions on Learning Technologies, 6(4), 312–323. https://doi.org/10.1109/TLT.2013.20.
  22. DiSessa, A. A. (2001). Changing minds: Computers, learning, and literacy. Cambridge: MIT Press.
  23. Faulconer, E. K., & Gruss, A. B. (2018). A review to weigh the pros and cons of online, remote, and distance science laboratory experiences. The International Review of Research in Open and Distributed Learning, 19(2). https://doi.org/10.19173/irrodl.v19i2.3386.
  24. Fennema, E., & Sherman, J. A. (1976). Fennema-Sherman mathematics attitudes scales: Instruments designed to measure attitudes toward the learning of mathematics by females and males. Journal for Research in Mathematics Education, 7(5), 324–326. https://doi.org/10.2307/748467.
  25. Flaherty, A., O’Dwyer, A., Mannix-McNamara, P., & Leahy, J. J. (2017). Evaluating the impact of the “Teaching as a chemistry laboratory graduate teaching assistant” program on cognitive and psychomotor verbal interactions in the laboratory. Journal of Chemical Education, 94(12), 1831–1843. https://doi.org/10.1021/acs.jchemed.7b00370.
  26. Fraser, B. J., Giddings, G. J., & McRobbie, C. J. (1992). Assessment of the psychosocial environment of university science laboratory classrooms: A crossnational study. Higher Education, 24(4), 431–451. https://doi.org/10.1007/bf00137241.
  27. Garcia, P. A. (2002). Interaction, distributed cognition and web-based learning. Paper presented at E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2002, Montreal, Canada. https://www.learntechlib.org/p/9371. Retrieved 26 Sept 2019.
  28. Gee, J. P., & Green, J. L. (1998). Discourse analysis, learning, and social practice: A methodological study. Review of Research in Education, 23, 119–169. https://doi.org/10.2307/1167289.
  29. Glaser, B. G. (1965). The constant comparative method of qualitative analysis. Social Problems, 12(4), 436–445. https://doi.org/10.2307/798843.
  30. Good, J., Colthorpe, K., Zimbardi, K., & Kafer, G. (2015). Research and teaching: The roles of mentoring and motivation in student teaching assistant interactions and in improving experience in first-year biology laboratory classes. Journal of College Science Teaching, 44(4), 1–11. https://doi.org/10.2505/4/jcst15_044_04_88.
  31. Gresser, P. W. (2006). A study of social interaction and teamwork in reformed physics laboratories. (Doctoral dissertation), University of Maryland. https://drum.lib.umd.edu/handle/1903/3362. Retrieved 26 Sept 2019. Available from ProQuest Dissertations & Theses Global.
  32. Henry, J. (2000). 24×7: Lab experiments access on the web all the time. Paper presented at the ASEE Annual Conference, St. Louis, Missouri. https://peer.asee.org/8147. Retrieved 26 Sept 2019.
  33. Herrington, D. G., & Nakhleh, M. B. (2003). What defines effective chemistry laboratory instruction? Teaching assistant and student perspectives. Journal of Chemical Education, 80(10), 1197. https://doi.org/10.1021/ed080p1197.
  34. Hillman, D. C. A., Willis, D. J., & Gunawardena, C. N. (1994). Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. American Journal of Distance Education, 8(2), 30–42. https://doi.org/10.1080/08923649409526853.
  35. Hilosky, A., Sutman, F., & Schmuckler, J. (1998). Is laboratory based instruction in beginning college-level chemistry worth the effort and expense? Journal of Chemical Education, 75(1), 100. https://doi.org/10.1021/ed075p100.
  36. Hofstein, A., Levy Nahum, T., & Shore, R. (2001). Assessment of the learning environment of inquiry-type laboratories in high school chemistry. Learning Environments Research, 4(2), 193–207. https://doi.org/10.1023/a:1012467417645.
  37. Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundations for the twenty-first century. Science Education, 88(1), 28–54. https://doi.org/10.1002/sce.10106.
  38. Högström, P., Ottander, C., & Benckert, S. (2010). Lab work and learning in secondary school chemistry: The importance of teacher and student interaction. Research in Science Education, 40(4), 505–523. https://doi.org/10.1007/s11165-009-9131-3.
  39. Johnstone, A., & Al-Shuaili, A. (2001). Learning in the laboratory: Some thoughts from the literature. University Chemistry Education, 5(2), 42–51.
  40. Kamruzzaman, M. M., Wang, M., Jiang, H., He, W., & Liu, X. (2015). A web-based remote laboratory for the college of optoelectronic engineering of online universities. Paper presented at the 2015 Optoelectronics Global Conference (OGC). https://ieeexplore.ieee.org/document/7336830. Retrieved 26 Sept 2019.
  41. Kiboss, J. K. (1997). An evaluation of teacher/student verbal and non-verbal behaviours in computer augmented physics laboratory classrooms in Kenya. African Journal of Research in Mathematics, Science and Technology Education, 1(1), 65–76. https://doi.org/10.1080/10288457.1997.10756089.
  42. Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs: Prentice-Hall.
  43. Komorek, M., & Kattmann, U. (2008). The model of educational reconstruction. In S. Mikelskis-Seifert, U. Ringelband, & M. Brückmann (Eds.), Four decades of research in science education–from curriculum development to quality improvement (pp. 171–188). Münster: Waxmann.
  44. Krystyniak, R. A., & Heikkinen, H. W. (2007). Analysis of verbal interactions during an extended, open-inquiry general chemistry laboratory investigation. Journal of Research in Science Teaching, 44(8), 1160–1186. https://doi.org/10.1002/tea.20218.
  45. Kumpulainen, K., & Mutanen, M. (1999). The situated dynamics of peer group interaction: An introduction to an analytic framework. Learning and Instruction, 9(5), 449–473. https://doi.org/10.1016/S0959-4752(98)00038-3.
  46. Kyle, W. C., Penick, J. E., & Shymansky, J. A. (1979). Assessing and analyzing the performance of students in college science laboratories. Journal of Research in Science Teaching, 16(6), 545–551. https://doi.org/10.1002/tea.3660160608.
  47. Le Roux, G. A., Reis, G. B., de Jesus, C. D., Giordano, R. C., Cruz, A. J., Moreira, P. F., … Loureiro, L. V. (2009). Cooperative Weblab: A tool for cooperative learning in chemical engineering in a global environment. Computer Aided Chemical Engineering, 27, 2139–2144. https://doi.org/10.1016/S1570-7946(09)70747-3.
  48. Lehman, J. R. (1990). Students’ verbal interactions during chemistry laboratories. School Science and Mathematics, 90(2), 142–150. https://doi.org/10.1111/j.1949-8594.1990.tb12006.x.
  49. Lindsay, E. D., & Good, M. C. (2005). Effects of laboratory access modes upon learning outcomes. IEEE Transactions on Education, 48(4), 619–631. https://doi.org/10.1109/TE.2005.852591.
  50. Lindsay, E. D., Naidu, S., & Good, M. C. (2007). A different kind of difference: Theoretical implications of using technology to overcome separation in remote laboratories. International Journal of Engineering Education, 23(4), 772–779.
  51. Lowe, D., Newcombe, P., & Stumpers, B. (2012). Evaluation of the use of remote laboratories for secondary school science education. Research in Science Education, 43(3), 1197–1219. https://doi.org/10.1007/s11165-012-9304-3.
  52. Lund, T. J., Pilarz, M., Velasco, J. B., Chakraverty, D., Rosploch, K., Undersander, M., & Stains, M. (2015). The best of both worlds: Building on the COPUS and RTOP observation protocols to easily and reliably measure various levels of reformed instructional practice. CBE-Life Sciences Education, 14(2), ar18. https://doi.org/10.1187/cbe.14-10-0168.
  53. Ma, J., & Nickerson, J. V. (2006). Hands-on, simulated, and remote laboratories: A comparative literature review. ACM Computing Surveys (CSUR), 38(3), 7.
  54. Mercer, N., Littleton, K., & Wegerif, R. (2004). Methods for studying the processes of interaction and collaborative activity in computer-based educational activities. Technology, Pedagogy and Education, 13(2), 195–212. https://doi.org/10.1080/14759390400200180.
  55. Miller, K., Brickman, P., & Oliver, J. S. (2014). Enhancing teaching assistants’ (TAs’) inquiry teaching by means of teaching observations and reflective discourse. School Science and Mathematics, 114(4), 178–190. https://doi.org/10.1111/ssm.12065.
  56. Moore, M. G. (1989). Editorial: Three types of interaction. American Journal of Distance Education, 3(2), 1–7. https://doi.org/10.1080/08923648909526659.
  57. Nakhleh, M. B., Polles, J., & Malina, E. (2002). Learning chemistry in a laboratory environment. In J. K. Gilbert, O. De Jong, R. Justi, D. F. Treagust, & J. H. Van Driel (Eds.), Chemical education: Towards research-based practice (pp. 69–94). New York: Springer.
  58. Nickerson, J. V., Corter, J. E., Esche, S. K., & Chassapis, C. (2007). A model for evaluating the effectiveness of remote engineering laboratories and simulations in education. Computers & Education, 49(3), 708–725. https://doi.org/10.1016/j.compedu.2005.11.019.
  59. Ocumpaugh, J. (2015). Baker Rodrigo Ocumpaugh monitoring protocol (BROMP) 2.0 technical and training manual. http://penoy.admu.edu.ph/~alls/wp-content/uploads/2015/02/BROMP_2.0_Final-libre.pdf. Retrieved 27 Sept 2019.
  60. Ogot, M., Elliott, G., & Glumac, N. (2003). An assessment of in-person and remotely operated laboratories. Journal of Engineering Education, 92(1), 57. https://doi.org/10.1002/j.2168-9830.2003.tb00738.x.
  61. Ogunniyi, M., & Ramorogo, G. (1994). Relative effects of a micro-teaching programme on pre-service science teachers’ classroom behaviours. Southern African Journal of Mathematics and Science Education, 1(2), 25–36.
  62. Okebukola, P. A. (1984). In search of a more effective interaction pattern in biology laboratories. Journal of Biological Education, 18(4), 305–308. https://doi.org/10.1080/00219266.1984.9654661.
  63. Oliveira, A. W., & Sadler, T. D. (2008). Interactive patterns and conceptual convergence during student collaborations in science. Journal of Research in Science Teaching, 45(5), 634–658. https://doi.org/10.1002/tea.20211.
  64. Orduña, P., Almeida, A., López-De-Ipiña, D., & Garcia-Zubia, J. (2014). Learning analytics on federated remote laboratories: Tips and techniques. Paper presented at the 2014 IEEE Global Engineering Education Conference (EDUCON). https://ieeexplore.ieee.org/document/6826107. Retrieved 27 Sept 2019.
  65. Orduña, P., Garbi Zutin, D., Govaerts, S., Lequerica Zorrozua, I., Bailey, P. H., Sancristobal, E., … Garcia-Zubia, J. (2015). An extensible architecture for the integration of remote and virtual laboratories in public learning tools. IEEE Journal of Latin-American Learning Technologies (IEEE-RITA), 10(4), 223–233. https://doi.org/10.1109/RITA.2015.2486338.
  66. Paul, S., & Ray, S. (2013). Cultural diversity, group interaction, communication convergence, and intra-group conflict in global virtual teams: Findings from a laboratory experiment. Paper presented at the 2013 46th Hawaii International Conference on System Sciences, Wailea, Maui, HI, USA. https://ieeexplore.ieee.org/document/6479876. Retrieved 27 Sept 2019.
  67. Potvin, P., & Hasni, A. (2014). Interest, motivation and attitude towards science and technology at K-12 levels: A systematic review of 12 years of educational research. Studies in Science Education, 50(1), 85–129. https://doi.org/10.1080/03057267.2014.881626.
  68. Poulsen, C., Kouros, C., d'Apollonia, S., Abrami, P. C., Chambers, B., & Howe, N. (1995). A comparison of two approaches for observing cooperative group work. Educational Research and Evaluation, 1(2), 159–182. https://doi.org/10.1080/1380361950010203.
  69. Power, C. (1977). A critical review of science classroom interaction studies. Studies in Science Education, 4(1), 1–30. https://doi.org/10.1080/03057267708559844.
  70. Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., … Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. Journal of the Learning Sciences, 13(3), 337–386. https://doi.org/10.1207/s15327809jls1303_4.
  71. Roychoudhury, A., & Roth, W.-M. (1996). Interactions in an open-inquiry physics laboratory. International Journal of Science Education, 18(4), 423–445. https://doi.org/10.1080/0950069960180403.
  72. Ruf, T., & Ploetzner, R. (2014). One click away is too far! How the presentation of cognitive learning aids influences their use in multimedia learning environments. Computers in Human Behavior, 38, 229–239. https://doi.org/10.1016/j.chb.2014.06.002.
  73. Rybczynski, S. M., & Schussler, E. E. (2013). Effects of instructional model on student attitude in an introductory biology laboratory. International Journal for the Scholarship of Teaching and Learning, 7(2), n2. https://doi.org/10.20429/ijsotl.2013.070222.
  74. Sadler, T. D., Puig, A., & Trutschel, B. K. (2011). Laboratory instructional practices inventory: A tool for assessing the transformation of undergraduate laboratory instruction. Journal of College Science Teaching, 41(1), 25–31.
  75. Sauter, M., Uttal, D. H., Rapp, D. N., Downing, M., & Jona, K. (2013). Getting real: The authenticity of remote labs and simulations for science learning. Distance Education, 34(1), 37–47. https://doi.org/10.1080/01587919.2013.770431.
  76. Saxena, S., & Satsangee, S. P. (2014). Offering remotely triggered, real-time experiments in electrochemistry for distance learners. Journal of Chemical Education, 91(3), 368–373. https://doi.org/10.1021/ed300349t.
  77. Scalise, K., Timms, M., Moorjani, A., Clark, L., Holtermann, K., & Irvin, P. S. (2011). Student learning in science simulations: Design features that promote learning gains. Journal of Research in Science Teaching, 48(9), 1050–1078.
  78. Scanlon, E., Morris, E., Di Paolo, T., & Cooper, M. (2002). Contemporary approaches to learning science: Technologically-mediated practical work. Studies in Science Education, 38(1), 73–114. https://doi.org/10.1080/03057260208560188.
  79. Scheucher, B., Bayley, P., Gütl, C., & Harward, J. (2009). Collaborative virtual 3D environment for internet-accessible physics experiments. International Journal of Online Engineering, 5(REV 2009), 65–71. https://doi.org/10.3991/ijoe.v5s1.1014.
  80. Stang, J. B., Barker, M., Perez, S., Ives, J., & Roll, I. (2016). Active learning in pre-class assignments: Exploring the use of interactive simulations to enhance reading assignments. Paper presented at the Physics Education Research Conference, Sacramento, CA. https://www.per-central.org/items/detail.cfm?ID=14263. Retrieved 27 Sept 2019.
  81. Stang, J. B., & Roll, I. (2014). Interactions between teaching assistants and students boost engagement in physics labs. Physical Review Special Topics - Physics Education Research, 10(2), 020117. https://doi.org/10.1103/PhysRevSTPER.10.020117.
  82. Sutton, L. A. (2001). The principle of vicarious interaction in computer-mediated communications. International Journal of Educational Telecommunications, 7(3), 223–242. https://www.learntechlib.org/primary/p/9534/. Retrieved 19 Sept 2019.
  83. Tamir, P., Nussinovitz, R., & Friedler, Y. (1982). The design and use of a practical tests assessment inventory. Journal of Biological Education, 16(1), 42–50. https://doi.org/10.1080/00219266.1982.9654417.
  84. Tirado-Morueta, R., Sánchez-Herrera, R., Márquez-Sánchez, M. A., Mejías-Borrero, A., & Andujar-Márquez, J. M. (2018). Exploratory study of the acceptance of two individual practical classes with remote labs. European Journal of Engineering Education, 43(2), 278–295. https://doi.org/10.1080/03043797.2017.1363719.
  85. Tobin, K., & Fraser, B. J. (1998). Qualitative and quantitative landscapes of classroom learning environments. In B. J. Fraser & K. G. Tobin (Eds.), International handbook of science education (vol. 1, pp. 623–640). Dordrecht: Kluwer.
  86. Velasco, J. B., Knedeisen, A., Xue, D., Vickrey, T. L., Abebe, M., & Stains, M. (2016). Characterizing instructional practices in the laboratory: The laboratory observation protocol for undergraduate STEM. Journal of Chemical Education, 93(7), 1191–1203. https://doi.org/10.1021/acs.jchemed.6b00062.
  87. Volpentesta, A. P. (2015). A framework for human interaction with mobiquitous services in a smart environment. Computers in Human Behavior, 50, 177–185. https://doi.org/10.1016/j.chb.2015.04.003.
  88. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. In M. Cole, V. John-Steiner, & E. Souberman (Eds.). Cambridge: Harvard University Press.
  89. Wei, J., Mocerino, M., Treagust, D. F., Lucey, A. D., Zadnik, M. G., Lindsay, E. D., & Carter, D. J. (2018). Developing an understanding of undergraduate student interactions in chemistry laboratories. Chemistry Education Research and Practice, 19(4), 1186–1198. https://doi.org/10.1039/C8RP00104A.
  90. West, E. A., Paul, C. A., Webb, D., & Potter, W. H. (2013). Variation of instructor-student interactions in an introductory interactive physics course. Physical Review Special Topics - Physics Education Research, 9(1), 010109. https://doi.org/10.1103/PhysRevSTPER.9.010109.
  91. Wubbels, T. (1993). Teacher-student relationships in science and mathematics classes. In B. J. Fraser (Ed.), Research implications for science and mathematics teachers (vol. 1, pp. 65–73). Perth: Curtin University of Technology.
  92. Xu, H., & Talanquer, V. (2013). Effect of the level of inquiry on student interactions in chemistry laboratories. Journal of Chemical Education, 90(1), 29–36. https://doi.org/10.1021/ed3002946.
  93. Zacharia, Z. C., Manoli, C., Xenofontos, N., de Jong, T., Pedaste, M., van Riesen, S. A., … Tsourlidaki, E. (2015). Identifying potential types of guidance for supporting student inquiry when using virtual and remote labs in science: A literature review. Educational Technology Research and Development, 63(2), 257–302. https://doi.org/10.1007/s11423-015-9370-0.

Copyright information

© The Author(s) 2019

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. School of Molecular and Life Sciences, Curtin University, Perth, Australia
  2. School of Education, Curtin University, Perth, Australia
  3. School of Civil and Mechanical Engineering, Curtin University, Perth, Australia
  4. School of Electrical Engineering, Computing and Mathematical Sciences, Curtin University, Perth, Australia
  5. CSU Engineering, Charles Sturt University, Bathurst, Australia
