
Analytical and Bioanalytical Chemistry, Volume 410, Issue 6, pp 1609–1613

Smartphones as audience response systems for lectures and seminars

  • Reiner Salzer
ABCs of Education and Professional Development in Analytical Science

Might smartphones really help to increase attention during a lecture if they are used as an immediate response system? Benefits and challenges of audience response systems (ARSs) in lectures and seminars have been discussed for years. The previous requirement of acquiring dedicated hardware has been overcome by the widespread availability of smartphones and the ubiquity of Internet access.

Mobile phone technology is familiar to students. Internet connection via a wireless local area network (WLAN) is free of charge. The software is usually located on a remote server (Software as a Service, SaaS); no installation is required on devices in the lecture hall, and the use is platform independent. Some experience of using ARSs in science education is summarised here. If properly applied, ARSs increase attention and collaboration of students, improve interaction between students and lecturers, and permit monitoring of the progress of learning in various environments.

Multitasking abilities

Knowledge cannot be transferred to a passive student. Each individual learner must, in her/his mind, associate new information with previous knowledge [1]. It is well known that most people cannot concentrate for extended periods beyond about 20 min [2]; afterwards they become passive. Quizzes using ARSs help to regain the attention of learners. This kind of active learning is considered a good way to learn because it not only helps students to review the course material but also helps them to enjoy learning about a topic [3].

Current discussions about educational policy and practice are often embedded in a mindset that considers students who were born in an age of omnipresent digital media to be fundamentally different from previous generations of students [4]. These students are labelled digital natives, and those born before the 1980s are termed digital immigrants. In fact, the digital native is a myth. It was found that digital natives were not always more digitally oriented than the so-called digital immigrants (http://www.ascilite.org/conferences/auckland09/procs/mcnaught.pdf).

A second myth concerns the claimed ability of young people (the homo zappiëns) to multitask, the assertion that they are even experts at it, and that education should adapt to it [4]. It was, however, found that students who, while studying, surf the Web, update their Facebook pages, and follow other Facebook users for information both related and unrelated to the class have lower final average grades [5]. In addition, the multitasking behaviour of one student during a lecture may negatively affect the learning of other students [6].

Is the lack of a human multitasking ability sufficient reason to ban smartphones from classes altogether? Looking beyond classrooms, this missing ability is the reason why some functions of smartphones must not be used, for example, while driving a car. Other functions may be very helpful while driving, and hence smartphones are not generally banned from use in a car. The same applies to lecture rooms: smartphones or computers may be used adequately or inadequately, and hence their use should be regulated appropriately rather than banned. In the following, we discuss how smartphones can be put to good use during classes.

Didactic considerations

ARSs are used for a variety of reasons, such as collecting data and engaging the audience in a presentation by asking a question (Fig. 1). As students submit their answers, responses are displayed as a graph (Fig. 2) and are continually updated, showing the total number of students or groups choosing each answer. After a short time, the teacher reveals the right answer and, depending on the group's responses (Fig. 1), moves on with the lecture if correct answers dominate or there are only minor ambiguities. Otherwise, the teacher reviews the material just covered or initiates a peer discussion among students. Interactivity and general learning outcomes are influenced by the instructor’s pedagogy and strategic use of the ARS [8].
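
The decision loop described above (tally the votes, then either move on or initiate a peer discussion) can be sketched as a short script. This is a minimal illustration under stated assumptions, not part of any particular ARS product; the function names and the 70% threshold are invented here for the example.

```python
from collections import Counter

def tally(responses):
    """Aggregate submitted single-choice answers into per-option counts."""
    return Counter(responses)

def next_step(responses, correct, threshold=0.7):
    """Decide how to proceed once voting has closed.

    threshold is an illustrative cut-off: if at least this fraction of
    the class chose the correct option, the lecture moves on; otherwise
    the teacher reviews the material or starts a peer discussion.
    """
    counts = tally(responses)
    total = sum(counts.values())
    if total == 0:
        return "no responses"
    share_correct = counts[correct] / total
    return "move on" if share_correct >= threshold else "peer discussion"

# Example: 8 of 10 students picked the correct option "B"
votes = ["B"] * 8 + ["A", "C"]
print(next_step(votes, correct="B"))  # → move on
```

In practice the ARS updates the displayed graph continually while votes arrive; the sketch only shows the decision taken after voting has closed.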
Fig. 1

Typical operational scenario of an audience response system (adapted from [7])

Fig. 2

Example of the reception of an audience response system (ARS) by master’s degree students of analytical chemistry. The single-choice scheme was selected for both questions. The images are screenshots of the PowerPoint display during the lecture immediately after the voting had been completed

From an instructor’s viewpoint, ARSs can be beneficial for a number of tasks, such as peer-learning activities, gathering real-time feedback on students’ understanding of lecture material, identifying students’ misconceptions about content, and enabling the instructor to adapt lectures to address those misconceptions. To achieve these goals, we have to remind ourselves that ARSs are only a tool, and that they must be used appropriately to achieve the desired results.

Two didactic concepts have gained particular importance in recent years: Mazur’s Peer Instruction method [9] and Dufresne’s Class-Wide Discussion [10]. These methods are based on different discussion sequences. Which sequence is considered best for particular teaching circumstances may be deduced from a study involving engineering students [11].

Good feedback practice plays an important role in helping to develop the students’ capacity to self-regulate their own performance, which is a key part of the concept of student-centred learning [12]. The seven principles of good feedback practice [13] include the need (1) to clarify what is expected of the student, (2) to provide the opportunity for the student to close the gap between current and desired performance, and (3) to give the teacher information about the status of the student’s learning.

Choice of ARSs

ARSs are referred to by an assortment of names, including personal response stations, interactive voting systems, class response systems, electronic voting systems, student response systems, interactive student response systems, group response systems, group process support systems, and the more colloquial term clickers.

The important difference between ARSs and learning management systems (LMSs) is that ARSs are available to students only during classes, whereas common LMSs are available outside classes as well. A summary of common LMSs can be found at http://www.capterra.com/learning-management-system-software/.

The first ARSs were developed by the US Air Force for continuing education. Three generations of ARSs have appeared since then. Initially, such systems were hardwired, very expensive, and did not work reliably. Around the turn of the millennium, dedicated mobile input devices became available at much lower cost and were much more reliable. These first two generations are termed hardware-based systems [8]. The increasing ubiquity of mobile Web-enabled devices (smartphones, tablets, and laptops) and the expansion of WLAN into lecture halls and seminar rooms initiated the most recent step of development towards the current generation of ARSs, which are called software-based systems.

Apart from students’ mobile devices, software-based ARSs require only live Internet access, a standard computer, and a projector on the lecturer’s side. Such systems require neither technical preparation nor extra maintenance. All participants use their own hardware (smartphone, tablet, laptop), with which they are familiar. With the global spread of Eduroam (https://www.eduroam.org/), foreign participants in a class may even use their home login. This is a great advantage compared with systems that need dedicated mobile units (clickers).

Determining the reasons for acquiring an ARS is important, since those reasons may be the guiding factor in vendor selection. All software-based ARSs provide single-choice schemes for responses, and most also provide multiple-choice schemes. Some systems allow individual assessment, and some permit live feedback. Most systems run without prior installation on a student’s smartphone, and many even run without installation on the lecturer’s computer. To protect students’ data and intellectual property rights, teachers usually have to register on the remote server before formulating questions. Some systems permit convenient input of the server address by scanning a QR code (Fig. 2). Some systems can be used free of charge. Assessments (both low stakes and high stakes), attendance checks, and the learning progress of individual students can be evaluated only if non-anonymous voting is used. Focused overviews of selected products can be found at https://cft.vanderbilt.edu/cft/docs/classroom-response-system-clickers-bibliography/, https://library.educause.edu/topics/teaching-and-learning/clickers, and http://ep.elan-ev.de/wiki/Audience_Response.
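
The difference between the single-choice and multiple-choice response schemes mentioned above can be illustrated with a short validation sketch. The logic is hypothetical, written only to make the distinction concrete; it is not taken from any specific ARS product.

```python
def validate_submission(selected, options, multiple_choice=False):
    """Check a submitted vote against the poll's response scheme.

    selected: the option labels the student ticked.
    options: the labels offered in the question, e.g. {"A", "B", "C"}.
    multiple_choice: if False (single-choice), exactly one selection
    is permitted; if True, any non-empty subset is accepted.
    """
    selected = set(selected)
    if not selected.issubset(set(options)):
        return False  # unknown option label
    if not selected:
        return False  # empty submission
    if not multiple_choice and len(selected) != 1:
        return False  # single-choice permits exactly one answer
    return True

print(validate_submission({"B"}, {"A", "B", "C"}))                              # valid single-choice vote
print(validate_submission({"A", "C"}, {"A", "B", "C"}))                         # rejected: two answers in single-choice mode
print(validate_submission({"A", "C"}, {"A", "B", "C"}, multiple_choice=True))   # valid multiple-choice vote
```

A real system would additionally tie each submission to a session and, for non-anonymous polls, to a student identity.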

Benefits and challenges

Software-based systems are usually easy to use, and they work reliably. They are affordable for both small and large groups of students. Even with very large numbers of attendees, these systems provide immediate responses.

ARSs may be very helpful in the development of cooperative learning environments beyond a simple transmission of content. Some systems permit poll results to be displayed immediately within a PowerPoint presentation (Fig. 2). This kind of feedback practice plays an important role in helping to develop the students’ capacity to self-regulate their own performance, which is a key part of the concept of student-centred learning.

Multiple studies have found that attendance improves when an ARS is used, provided it is linked to a portion of a student’s final mark. An extensive discussion of benefits and challenges of using ARSs is given in [14].

A persisting challenge is the sensible integration of ARSs into the course, as their use takes up time that was previously used for delivery of the lesson’s content [15]. On the other hand, the focus in education has in any case shifted from delivery of content to the design of an interactive teaching and learning process. Establishing such a cooperative learning environment is facilitated by integrating an ARS into teaching events. The application of new technology is not a necessary condition for better teaching, but it makes the desired effects much easier to achieve.

Another challenge of using ARSs is the formulation of “good” questions. There is broad agreement about the characteristics of “good” questions [16]. A prerequisite for formulating them is the prior definition of the exact learning outcome. Questions are aimed at showing progress in applying the concepts, models, and theories. Such questions may cover simple mental calculations, estimation of magnitudes, handling of equations, and interpretation of the results. All these topics are known to pose problems to some students [17]. Initial questions during a lesson should be relatively easy to answer so as to motivate students.

The final issue to be mentioned here relates to student feedback. Collecting feedback from students during a class helps to find out whether they have understood a concept. The great challenge for teachers then consists in instantly adjusting their teaching style and offering a better explanation of misunderstood topics.

Reception by students

Students appear to have positive attitudes regarding the use of an ARS in classes. Reinforcement of content, provision of feedback, anonymity in participation, increased interest in the course, and the ability to compare one’s level of knowledge with that of the rest of the class have all been reported as positive characteristics of ARS use in lectures [1]. When students were asked if they would like voting to continue in lectures, only around 5% of the students answered “no” [12]. Using ARSs simply for the sake of technology and not for a pedagogical benefit is troubling to students [18].

ARSs provide an effective way to motivate students to cooperate. Participation of students in quizzes is usually very high. They appreciate self-testing and group dynamics. Self-assessment lectures using polls are reported to have higher attendance than the same lectures had previously [12]. Anonymous voting makes participation easier for reserved students.

To vote, students need a mobile phone signal or a connection via a WLAN. The latter is preferable, as it keeps costs for students low and set-up times at the start of each lecture short. A survey of students’ opinions of phone polls showed more than 50% agreement with each of the following statements (multiple-choice question) [12]:
  • They are fun.

  • They provide a welcome break.

  • They helped me learn/understand.

  • They provide feedback on what I learnt.

  • They make me think.

  • They help with continuous assessment.

Less than 5% of the responses were received for each of the following statements:
  • They are pointless.

  • They are annoying.

  • They hinder my learning.

International students of the Erasmus Mundus Joint Master Degree “Excellence in Analytical Chemistry” (https://each.ut.ee/EACH/) were asked about their experience with software-based ARSs. More than 50% of them reported prior experience with this technology (Fig. 2, left). After the course, these students were asked for their opinion about the potential impact of ARSs. All of the responses were distinctly positive (Fig. 2, right).

The opinion of established European teachers about ARSs seems to differ partly from that of the students. When attendees of the Symposium Education at Euroanalysis 2017 were asked for their opinion about ARSs, only a small percentage participated in the poll. For this reason, only a qualitative assessment of the responses can be made. As to their familiarity with ARSs, most responses were received for “I have no idea”. Distinctly fewer responders chose “I am interested” or “I already tried it”. The opinion of the teachers who voted on the potential of ARSs was more or less identical to the students’ results shown in Fig. 2 (right).

The screenshots shown in Fig. 2 were taken from the ARS eduVote (http://www.eduvote.de/en/). This system was chosen because it is easy to use, provides a convenient login for participants by QR code, and integrates easily into PowerPoint. It is commercial software and requires either an individual or a university licence. The eduVote extension module for PowerPoint can be downloaded freely.

The ARS eduVote is a quantitative front-channel system [19]. Responses are immediately visible for everybody in the room (back-channel systems provide the teacher with hidden, sometimes non-anonymous feedback). Front-channel systems are particularly suited for teacher-focused educational formats, which usually dominate in science disciplines. The particular impact of front-channel systems is the continuous activation of students throughout the class.

Acknowledgements

I thank the students of the Erasmus Mundus Joint Master Degree “Excellence in Analytical Chemistry” for their frank cooperation. It was a unique opportunity to work with groups of students from around the globe. The prompt, high-quality support by SimpleSoft, the provider of eduVote, is gratefully acknowledged.

References

  1. Bunce DM, VandenPlas JR, Havanki KL. Comparing the effectiveness on student achievement of a student response system versus online WebCT quizzes. J Chem Educ. 2006;83:488–93.
  2. Bonwell CC, Eison JA. Active learning: creating excitement in the classroom. ASHE-ERIC higher education report no. 1. Washington: George Washington University, School of Education and Human Development; 1991.
  3. McKinney K, Heyl B, editors. Sociology through active learning: student exercises. Los Angeles: Pine Forge Press/Sage; 2008.
  4. Kirschner PA, De Bruyckere P. The myths of the digital native and the multitasker. Teach Teach Educ. 2017;67:135–42.
  5. Junco R, Cotten SR. No A 4 U: the relationship between multitasking and academic performance. Comput Educ. 2012;59:505–14.
  6. Sana F, Weston T, Cepeda NJ. Laptop multitasking hinders classroom learning for both users and nearby peers. Comput Educ. 2013;62:24–31.
  7. Kundisch D, Magenheim J, Beutner M, Herrmann P, Reinhardt W, Zokye A. Classroom response systems. Informatik-Spektrum. 2013;36:389–93.
  8. Cain J, Robinson E. A primer on audience response systems: current applications and future considerations. Am J Pharm Educ. 2008;72(4):77.
  9. Mazur E. Peer instruction: a user's manual. Harlow: Pearson; 1996.
  10. Dufresne RJ, Gerace WJ, Leonard WJ, Mestre JP, Wenk L. Classtalk: a classroom communication system for active learning. J Comput High Educ. 1996;7:3–47.
  11. Nicol DJ, Boyle JT. Peer instruction versus class-wide discussion in large classes: a comparison of two interaction methods in the wired classroom. Stud High Educ. 2003;28:458–73.
  12. Voelkel S, Bennett D. New uses for a familiar technology: introducing mobile phone polling in large classes. Innov Educ Teach Int. 2014;51:46–58.
  13. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud High Educ. 2006;31:199–218.
  14. Kay RH, LeSage A. Examining the benefits and challenges of using audience response systems: a review of the literature. Comput Educ. 2009;53:819–27.
  15. Knight JK, Wood WB. Teaching more by lecturing less. Cell Biol Educ. 2005;4:298–310.
  16. Koenig K. Building acceptance for pedagogical reform through wide-scale implementation of clickers. J Coll Sci Teach. 2010;39(3):46–50.
  17. Schlücker S. Das Smartphone – ein Antwortgerät [The smartphone – an answering device]. Nachr Chem. 2017;65:164–6.
  18. Draper SW, Brown MI. Increasing interactivity in lectures using an electronic voting system. J Comput Assist Learn. 2004;20:81–94.
  19. Ebner M, Haintz C, Pichler K, Schön S. Technologiegestützte Echtzeitinteraktion in Massenvorlesungen im Hörsaal. Entwicklung und Erprobung eines digitalen Backchannels während der Vorlesung [Technology-supported real-time interaction in large lectures: development and testing of a digital backchannel during the lecture]. In: Rummler K, editor. Lernräume gestalten – Bildungskontexte vielfältig denken. Münster: Waxmann; 2014. p. 567–78.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Chemistry and Food Chemistry, Technische Universität Dresden, Dresden, Germany