The operant conditioning chamber—most commonly known as the “Skinner box” after its inventor, the great American behaviorist B. F. Skinner (1904–1990)—remains to this day the standard apparatus for the experimental study of animal behavior in psychology and behavioral neuroscience (see Skinner, 1938). The apparatus consists of a chamber with light bulbs and speakers mounted on the walls to provide visual and auditory stimuli; a food dispenser connected to a magazine or hopper, which can deliver food pelletsFootnote 1; and a metal grid floor, which can deliver mild electric foot shocks. In a standard operant conditioning chamber for rodents, a lever also protrudes from the wall and can be depressed (operant conditioning chambers for pigeons instead have keys that can be pecked). In this preparation, lever-pressing is the standard operant response that the animal can produce in order to interact with, or operate upon, the environment. Depending on the contingencies or relationships programmed by the experimenter, lever-press responding can result in either the delivery of food (i.e., positive reinforcement) or the removal or prevention of shock (i.e., negative reinforcement), as well as in the delivery of shock (i.e., positive punishment) or the removal or prevention of food (i.e., negative punishment). The audiovisual stimuli can serve as discriminative stimuli, signaling to the animal the opportunity to respond in order to obtain the desired outcome. For example, a light can indicate that lever-pressing will result in a food pellet, whereas a tone can indicate that the same response will cause the delivery of a foot shock.

Unfortunately, the Skinner box is an expensive apparatus. Prices from the main manufacturers and providers in the U.S. range between $3,500 and $4,000 for a single standard operant conditioning chamber. Setting up a laboratory with eight Skinner boxes (a number normally required to conduct experiments with large numbers of animals) might end up costing between $45,000 and $50,000, after including the interface cabinet (necessary to connect a computer to the boxes) and the controlling software. Even in the simplest case, in which a single Skinner box is connected to a laptop via a standalone USB interface, the price (excluding the laptop) would top $6,000. The high price of this equipment makes it extremely difficult for young researchers to start their research projects and, consequently, to seek external funding: Because providing pilot data is often a precondition for obtaining funding, a vicious circle is created for the researcher. This high price also precludes many smaller colleges, community colleges, and high schools, as well as most educational institutions in developing countries, from setting up laboratories for teaching courses on the principles of animal behavior.

Building a traditional Skinner box on one’s own has always been a possibility. However, doing so remains a laborious and expensive endeavor that also requires expertise in electronics and programming. Two relatively recent technological developments offer an alternative, much simpler and cheaper solution: Apple’s iPod Touch (Apple, Cupertino, CA) and the Arduino microcontroller (Smart Projects, Ivrea, Italy). The iPod Touch (see www.apple.com/ipod-touch) was first released in 2007, shortly after the release of the iPhone. With the exception of the iPhone-exclusive features (e.g., phone calls or GPS), the iPod Touch allows the user to enjoy any application developed for the iPhone. Arduino (see www.arduino.cc) began as a project in 2005 with the aim of allowing students to build affordable electronic systems. Simply put, the Arduino environment allows the user to build complex electronic devices by connecting components (e.g., LEDs, LCDs, or motors) to an Arduino board, which can also be connected directly to a computer via USB. The Arduino software (an open platform) makes it easy to upload the controlling program, or “sketch,” to the Arduino board. Because of its great flexibility and affordable price, the Arduino microcontroller is an excellent candidate for the development of devices involving physical computing, including the construction of experimental devices for psychological research (see D’Ausilio, 2012). The two platforms (iOS and Arduino) can be combined to achieve a number of feats, such as remote connections to sensors and video cameras. Here we propose that these two platforms can also be combined to build a simple, yet functional, Skinner box (the ArduiPod Box) for less than $300.
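
For readers unfamiliar with the Arduino platform, the canonical “Blink” example below gives a sense of how little code an Arduino sketch requires. It is reproduced here purely for illustration (it is not part of the ArduiPod Box software) and simply flashes the LED built into pin 13 of the Arduino Uno:

// Canonical Arduino "Blink" sketch: flashes the LED attached to pin 13
// (the Arduino Uno's built-in LED), turning it on and off every 500 ms.
const int LED_PIN = 13;

void setup() {
  pinMode(LED_PIN, OUTPUT);      // configure the pin as a digital output
}

void loop() {
  digitalWrite(LED_PIN, HIGH);   // turn the LED on
  delay(500);                    // wait 500 ms
  digitalWrite(LED_PIN, LOW);    // turn the LED off
  delay(500);                    // wait 500 ms before repeating
}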

Overview of the device

The operation of the ArduiPod Box is fairly simple, as can be appreciated from Fig. 1. The central component of the system is the iPod Touch, which runs an app specifically designed to present the animal with stimuli and to collect the animal’s responses. The iPod Touch is mounted on the wall of the home cage and presents visual stimuli through its screen. If necessary, auditory stimuli can also be produced through the iPod’s built-in speaker, through a speaker connected to the headphone jack, or even via the Arduino microcontroller. The responses collected are screen touches, which are registered and saved for analysis. Because animals will not be intrinsically motivated to interact with the iPod Touch, a system of external rewards must be put in place. This is where the Arduino comes into play: It relays the commands received from the iPod Touch to a servomotor, and the motor’s movement results in the delivery of a reward (i.e., food pellets or water).

Fig. 1 Basic operation of the ArduiPod Box: An iPod Touch presents the animal with stimuli (i.e., colored lights on the screen) and detects and registers the animal’s responses (i.e., nose pokes on the screen). Contingent upon the animal’s response, the iPod Touch can also send a signal to the Arduino Uno microcontroller, thereby triggering the action of a servomotor, which results in reinforcement (i.e., the delivery of food or water).

Hardware

The picture in the top panel of Fig. 2 shows the electronic components used in the ArduiPod Box. These are, from left to right, the iPod Touch; the Arduino Uno microcontroller (or, alternatively, a “Bareduino,” a home-made clone that uses the same microcontroller as the standard Arduino Uno, Atmel’s ATMEGA328P-PU)Footnote 2; the Redpark C2-DB9 serial cable (necessary to connect the iPod Touch to the Arduino)Footnote 3; and a servomotor.

Fig. 2 Top panel: Electronic components used in the ArduiPod Box. Bottom panel: A prototype of the ArduiPod Box, which uses the delivery of food as the reinforcer. The components shown in the top panel are already connected and installed in the box.

The picture in the bottom panel of Fig. 2 shows these components,Footnote 4 already connected and installed on a cage,Footnote 5 in a prototype that uses food as the reinforcer (see the Experiment section for a description of a prototype using flavored water as the reinforcer). This cage is a standard clear plastic cage for rodents with two modifications. First, a servomotor is attached to the top metal grid, with a small plastic bottle mounted on its shaft. The bottle has a round hole (approximately 12 mm in diameter) on one side. The action of the servo quickly rotates the bottle toward the side with the hole and then quickly rotates it back to its original position. As a consequence, the movement of the servo results in the delivery of the food contained in the bottle (e.g., seeds or pellets). Second, the front wall of the box has a 38 × 38 mm window (see the top panel of Fig. 3), an opening that exactly matches the response key, a button on the screen of the iPod Touch that the animal can poke during the experimental sessions (described in the next section). This wall also has four screws that allow the iPod to be mounted on the outside of the wall with two elastic bands (see the middle panel of Fig. 3). Thus, from inside the box, only the area of the screen that corresponds to the previously mentioned button is visible and accessible (see the bottom panel of Fig. 3).

Fig. 3 Top panel: Close-up view of the opening on the front wall of the plastic box, which fits the area of the screen of the iPod Touch containing the button that will present stimuli and detect/register responses from the animal during training (i.e., the response key). Middle panel: Same view, now with the iPod Touch already mounted on the wall. Bottom panel: View from inside the box. As can be appreciated, only a small portion of the screen of the iPod Touch will be visible and accessible to the animal.

It is important to point out that the hardware employed for the ArduiPod Box will vary slightly depending on the specific type of reinforcer to be used (i.e., food or water). In addition, the same reinforcer could be delivered in a variety of ways (e.g., the action of the servomotor could result in food being dropped into a cup, or in the opening of a gate granting temporary access to a magazine filled with food). Therefore, the two prototypes presented in this article should be taken as mere examples, and by no means should they determine, much less constrain, the future development of the ArduiPod Box.

Software

The ArduiPod Box relies on the joint operation of two pieces of software: an iOS app (run by the iPod Touch) and a sketch (run by the Arduino microcontroller).Footnote 6 Although the ArduiPod Box requires both the iOS app and the sketch to operate, the bulk of the processing is performed by the iOS app. In fact, the Arduino sketch is in charge of carrying out a single action: activating the servomotor (to deliver food or water) each time it receives a byte with a value equal to 1 from the iPod Touch. In contrast to the simplicity of the Arduino sketch, the iOS app is in charge of executing all other actions in the ArduiPod Box, such as presenting stimuli to the animal and detecting and registering the animal’s responses (i.e., screen touches). Also, unlike the Arduino sketch, which does not require any input from the user, the iOS app used by the ArduiPod Box needs to be configured by the user prior to the experimental session. Thus, the user will need to become somewhat familiar with this app in order to use the ArduiPod Box.
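
Before turning to the app, it is worth noting that the Arduino side of this exchange can be conveyed in just a few lines of code. The sketch below is an illustrative reconstruction rather than the actual sketch distributed with the ArduiPod Box: The servo pin, the angles, and the timing are assumed values; only the triggering byte (a value of 1) comes from the description above.

#include <Servo.h>

Servo feeder;                      // servomotor that operates the dispenser
const int SERVO_PIN = 9;           // assumed servo signal pin
const int REST_ANGLE = 0;          // assumed resting position of the servo
const int DISPENSE_ANGLE = 90;     // assumed position that delivers the reward

void setup() {
  Serial.begin(9600);              // serial link to the iPod Touch (assumed baud rate)
  feeder.attach(SERVO_PIN);
  feeder.write(REST_ANGLE);
}

void loop() {
  if (Serial.available() > 0) {
    int command = Serial.read();     // read one byte sent by the iOS app
    if (command == 1) {              // a value of 1 signals reward delivery
      feeder.write(DISPENSE_ANGLE);  // quickly rotate the dispenser
      delay(500);                    // brief pause (assumed duration)
      feeder.write(REST_ANGLE);      // rotate back to the resting position
    }
  }
}

In the food prototype described in the Hardware section, a quick rotation of this sort, followed by the return to the resting position, is what tips the seeds or pellets out of the bottle.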

Fortunately, using the iOS app is extremely easy. In fact, because this app was developed using Apple’s iOS SDK and makes use of the standard elements of the iPhone interface (e.g., buttons, steppers, segmented controls, and alert views), any average user of iOS devices (i.e., iPhone, iPod Touch, and/or iPad) should be able to navigate this app right away. The app is named Shaping because it was developed exclusively to train the experimental subject to perform the target operant response (i.e., the nose poke) using the shaping procedure (viz., reinforcement of successive approximations; Skinner, 1951, 1953), although it can easily be adapted to implement more complex experimental treatments in future revisions. The app was developed as a standard “utility application,” which is composed of two views (or “screens”): the main view and the flip-side view. Figure 4 depicts screen captures of these two views.

Fig. 4 Screen captures of the Shaping app during the experimental treatment (left panel) and during the settings configuration (right panel).

The screen capture in the left panel of Fig. 4 depicts the presentation of a stimulus (i.e., a blue “light”) in the main view. This stimulus is actually a standard round button, 250 × 250 points in size (i.e., 500 × 500 pixels on a retina display), that can change color during the experimental treatment. During the intertrial interval, it has a clear color (i.e., it is invisible), whereas during the stimulus presentation it adopts a visible color, such as green or blue (i.e., it “turns on”). This button (i.e., the response key) is the only element on the screen that the experimental subject can interact with, because the opening in the clear plastic wall of the home cage matches the dimensions of the button exactly (see the pictures in Fig. 3). When this button is pressed on the designated trials, the iPod Touch sends a signal to the Arduino microcontroller, which activates the reward delivery system.Footnote 7

Pressing the button at the top, labeled either “Start training session” or “Stop training session,” will either start the experiment (provided that the experimental treatment has been configured; see below) or stop an ongoing treatment. The app automatically saves a text file with the results of each experimental session (along with a summary of the settings for each session), and the “Mail” button (bottom-left corner) provides a convenient way to export these data by automatically attaching the data file to an e-mail.Footnote 8 Because the text file can quickly grow in size after a few experimental sessions, the “Trash Can” button provides a quick way to erase the data file. Finally, the button with the “Info Sign” (bottom-right corner) brings up the flip-side view, in which the user can configure the settings for the experimental session.

The screen capture in the right panel of Fig. 4 depicts the settings screen. On this screen, the user can configure the following settings for the experimental session. First, the type of training for the session can be chosen by selecting among the three options in the segmented control at the top of the screen: (1) no discriminative stimulus (by selecting “0”), in which case the screen will remain black and all responses will result in reinforcement (with the exception of responses produced during the delivery of the reinforcerFootnote 9); (2) only a discriminative stimulus for reinforcement (by selecting “S+”), in which case only responses produced during the stimulus presentation will be reinforced; or (3) discriminative stimuli for both reinforcement and nonreinforcement (by selecting “S+/S–”), in which case responses produced during the presentation of S+, but not during the presentation of S–, will be reinforced.Footnote 10 These three options allow the experimenter to program a progression in the shaping sequence: from a response (i.e., nose poking on the touch screen) that is continuously reinforced (option “0”), to a response that is controlled by an antecedent stimulus (option “S+”), and, finally, to a response subjected to a successive stimulus discrimination treatment (option “S+/S–”).Footnote 11

Second, two segmented controls allow the user to choose the specific color (for option “S+”) or colors (for option “S+/S–”) to be used in the experimental session. The colors blue and green are used because they are in the visible spectrum of rodents (see Jacobs, Fenwick, & Williams, 2001). Although option “S+/S–” was mainly included in the app to permit the study of stimulus discrimination, choosing the same color for both S+ and S– is also possible. In this case, the experimental session will follow a partial reinforcement schedule using a single discriminative stimulus, with 50 % of the stimulus presentations resulting in the opportunity for reinforcement.

Third, the user can optionally present a constant background sound, either a 100-Hz square-wave tone or white noise. These sounds can serve as a contextual cue for the experimental treatment, which could be useful for studying phenomena involving contextual manipulations, such as generalization decrement (e.g., presenting the blue color with the white noise at test, following training of the blue color as S+ in the presence of the tone; see, e.g., Pearce, 1987) or feature (positive or negative) discriminations (e.g., training the blue and green colors as S+ and S–, respectively, in the presence of the tone, and as S– and S+, respectively, in the presence of the white noise; see, e.g., Holland, 1992).

Fourth and finally, the user can select the duration of the stimulus presentation and the intertrial interval (ITI), as well as the number of trials for the experimental session. (The ITI is only operative in options “S+” and “S+/S–,” setting the gap between two consecutive stimulus presentations. In option “0,” no ITI is introduced, since responses are detected continuously and in the absence of an explicit stimulus signaling reinforcement.) The default value for the stimulus duration is 10 s, but it can be changed (by tapping the – or + signs of the associated stepper) to any value between 1 and 60 s. Likewise, the default value for the ITI is 30 s, but it can be changed to any value from 1 to 600 s. Finally, the default number of trials is 100, but it can be changed to values from 10 to 1,000 (the stepper increments or decrements this value in steps of 10).
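
Taken together, these four settings define the structure of a session involving discriminative stimuli. The short C++ sketch below is purely illustrative (it is not the Shaping app’s code, and the names and printed output are assumptions made for exposition); it simply shows how the training type, stimulus duration, ITI, and number of trials jointly determine the sequence of trials under the “S+” and “S+/S–” options:

#include <cstdlib>
#include <iostream>

enum class TrainingType { SPlusOnly, SPlusSMinus };   // options "S+" and "S+/S-"

struct Settings {
  TrainingType type = TrainingType::SPlusOnly;
  int stimulusDurationS = 10;   // default 10 s (1-60 s)
  int itiS = 30;                // default 30 s (1-600 s)
  int numTrials = 100;          // default 100 trials (10-1,000)
};

int main() {
  Settings s;
  s.type = TrainingType::SPlusSMinus;   // example: successive discrimination

  for (int trial = 1; trial <= s.numTrials; ++trial) {
    // Under "S+/S-", each trial randomly presents S+ or S-, so roughly half
    // of the presentations offer the opportunity for reinforcement.
    bool isSPlus = (s.type == TrainingType::SPlusOnly) || (std::rand() % 2 == 0);
    std::cout << "Trial " << trial << ": present "
              << (isSPlus ? "S+" : "S-") << " for " << s.stimulusDurationS
              << " s (touches " << (isSPlus ? "reinforced" : "not reinforced")
              << "), then a " << s.itiS << "-s ITI\n";
  }
  return 0;
}

Option “0” is omitted from the sketch because, as noted above, it involves no discriminative stimulus and no ITI: Responses are detected (and reinforced) continuously throughout the session.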

Once the settings have been selected, pressing the “Done” button (top-left corner) returns the screen to the main view. Because the app now has all of the necessary parameters, it is ready to run the experimental session: The “Start training session” button will now be enabled, and upon pressing it, the session will start (after a 20-s delay, established to give the experimenter time to mount the iPod on the wall with the elastic bands).

Peer-to-peer connectivity and data plotting

Although it is not strictly necessary for conducting a study with the ArduiPod Box, a second application was developed to aid the experimenter or instructor in monitoring the progress of the instrumental conditioning session. This app, named ArduiPodChart,Footnote 12 connects wirelessly to the Shaping app and displays a graph with the trial-by-trial numbers of responses given by the animal during the training session (see Fig. 5 for two screen captures depicting this app in action). In addition, the app displays a summary of the treatment parameters: the training type (i.e., no stimulus, S+ only, or S+/S–), the number of trials, the colors assigned to S+ and S–, the duration of the stimulus presentation, and the ITI. Finally, the app displays a few pieces of real-time information: the stimulus being presented on the current trial, the current trial number, and the number of responses that the animal has made so far on the current trial.

Fig. 5 Screen captures of the ArduiPodChart app, indicating the stimulus currently being presented and the current total number of responses (left panel), and after the trial, with the corresponding data point inserted (right panel).

In order to use this app, a second iOS device is required. Connecting the Shaping and ArduiPodChart apps is a simple, two-step process: First, to make the iPod Touch used in the ArduiPod Box searchable, the switch at the bottom of the main view in the Shaping app must be set to “Online.” Second, the iPod Touch must be located and selected from the device running the ArduiPodChart app. To do this, simply touch the magnifying-glass icon at the top-right corner of the app and select the corresponding device from the list.Footnote 13

Experiment

A pilot experiment was conducted in order to test the ArduiPod Box. This experiment merely aimed to determine whether the ArduiPod Box can effectively serve as an instrument for the study of instrumental behavior with rats and, possibly, other rodent species. Specifically, the purpose of the study was to ascertain whether the target instrumental response (i.e., nose-poking on the display of the iPod Touch) could be established, as well as whether the response could be brought under stimulus control. The experimental procedure was reviewed and approved by the Hofstra University Institutional Animal Care and Use Committee (IACUC).

Given the extremely simple and unambitious nature of this experiment,Footnote 14 a single rat was used as the subject: a female “fancy rat” (Rattus norvegicus) purchased from Petsmart (Store #1446, located in Levittown, NY). The rat weighed 180 g at the start of the study and was housed in a large Plexiglas cage (48.26 × 26.67 × 20.32 cm). The animal was maintained on a water deprivation schedule during the experiment, with daily access to tap water for about 1 h after the termination of each experimental session.

The ArduiPod Box employed in this experiment used flavored water as the reinforcer (sugar was mixed with water in order to enhance the hedonic value of the reinforcer). Specifically, the reinforcer consisted of limited access to a solution containing 10 % sucrose (obtained from Sigma-Aldrich Chemie GmbH, Steinheim, Germany). The sucrose solution was delivered from an 8-oz (approximately 236.5-ml) glass bottle fitted with a 2.5-in. (6.35-cm) stainless steel spout containing ball bearings. By default, the bottle was retracted, so that the spout did not protrude into the cage. However, when the rat touched the display of the iPod Touch on the designated trials, a servomotorFootnote 15 controlled by the Arduino microcontroller would slowly slide the bottle toward the cage, thereby introducing the spout into the cage. Then, 5 s later, the bottle would be slowly retracted back to its original position.
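
For this prototype, the Arduino sketch shown earlier would need to move the servo gradually rather than in a single jump. The variant below is, once again, an illustrative reconstruction under assumed values (pin, angles, and step timing); only the 5-s access period and the triggering byte value of 1 come from the description above.

#include <Servo.h>

Servo bottle;                     // servomotor that slides the bottle/spout
const int SERVO_PIN = 9;          // assumed servo signal pin
const int RETRACTED_ANGLE = 0;    // assumed resting (retracted) position
const int EXTENDED_ANGLE = 60;    // assumed position with the spout inside the cage
const int STEP_DELAY_MS = 30;     // assumed pause between 1-degree steps (slower = gentler)

// Move the servo one degree at a time so that the bottle slides slowly.
void moveSlowly(int from, int to) {
  int step = (to > from) ? 1 : -1;
  for (int angle = from; angle != to; angle += step) {
    bottle.write(angle);
    delay(STEP_DELAY_MS);
  }
  bottle.write(to);
}

void setup() {
  Serial.begin(9600);             // serial link to the iPod Touch (assumed baud rate)
  bottle.attach(SERVO_PIN);
  bottle.write(RETRACTED_ANGLE);
}

void loop() {
  if (Serial.available() > 0 && Serial.read() == 1) {   // reward command from the iOS app
    moveSlowly(RETRACTED_ANGLE, EXTENDED_ANGLE);         // slowly introduce the spout
    delay(5000);                                         // 5 s of access to the sucrose solution
    moveSlowly(EXTENDED_ANGLE, RETRACTED_ANGLE);         // slowly retract the bottle
  }
}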

The experiment was conducted over nine daily experimental sessions, arranged in four stages. All experimental sessions were 50 min in duration. The first stage consisted of two daily sessions, during which the rat was hand-shaped to make contact (by successive approximations) with the display of the iPod Touch. By the end of the second session, the rat was reliably exploring the display and, thus, was highly likely to make the target response. The second stage consisted of two daily sessions of unsignaled trials—that is, trials during which no discriminative stimulus was presented.Footnote 16 Each session comprised 100 trials, and the trial duration was set to 30 s. The treatment in the third and fourth stages consisted of signaled trials. Specifically, the third stage consisted of three daily sessions, also comprising 100 trials each. Each trial consisted of a 10-s presentation of the green color, which played the role of a discriminative stimulus for reinforcement (S+), followed by a 20-s ITI during which the screen remained black. Finally, the fourth stage consisted of two daily sessions, again comprising 100 trials each. In each session, 40-s presentations of the green color (still serving as S+) were randomly interspersed with 40-s presentations of the blue color, which served as S–. Because S+ and S– presentations were randomly chosen by the device, the number of presentations of each stimulus in a single session could not be set beforehand, but was always kept close to 50. As in the previous stage, color presentations were followed by a 20-s ITI, during which the screen remained black.

The data collected by the ArduiPod Box in the experiment are depicted in Figs. 6 and 7. (Note that, for the sake of clarity, the graphs depict cumulative numbers of responses.) As can be appreciated from the top panel of Fig. 6, the target response was acquired during the stage involving no discriminative stimulus, starting on the 33rd trial of Session 1, and it became even more robust during Session 2. (A video showing this rat’s performance in the ArduiPod Box is available at http://tinyurl.com/ArduiPodBox.) Moreover, high rates of responding were also observed during the training sessions involving the presentation of the discriminative stimulus (S+), as shown in the bottom panel of Fig. 6. Unfortunately, the results from the training sessions involving the successive S+/S– discrimination treatment were far from perfect: As can be appreciated in the top panel of Fig. 7, higher rates of responding were observed in the presence of S– (i.e., the blue color) than in the presence of S+ (i.e., the green color). These results could be interpreted as reflecting a failure to discriminate between the two colors: The rat simply responded indiscriminately to both colors, but lower response rates were observed to S+ than to S– because responding during S+ (but not during S–) yielded access to the water reinforcer for 5 s, and drinking was incompatible with performing the target response. Alternatively, it is possible that the rat had correctly learned the stimulus discrimination but persisted in responding during S– because of the frustration induced by the omission of the expected reinforcement (for a review of the role of frustration in stimulus discrimination learning, see Amsel, 1992).

Fig. 6 Results of the pilot experiment, conducted to test the ArduiPod Box. Lines represent the cumulative responses (i.e., nose pokes or touches on the display of the iPod Touch) in each session. The top and bottom panels depict the results from the stages involving no discriminative stimulus and a discriminative stimulus for reinforcement (S+ only), respectively.

Fig. 7 Results of the pilot experiment, conducted to test the ArduiPod Box. Lines represent the cumulative responses (i.e., nose pokes or touches on the display of the iPod Touch) in each session. The top panel depicts the results from the stage involving successive discrimination training (S+/S–), whereas the bottom panel depicts the results of an extinction test in which neither S+ nor S– signaled reinforcement.

To tease apart these alternative explanations, an additional ten-trial session was conducted, during which the rat received five presentations of each stimulus, S+ and S–, interspersed. As in the fourth stage of training, the presentations of the discriminative stimuli at test were 40 s in duration, with 20-s ITIs separating the stimulus presentations (thus, the test lasted 10 min). Importantly, during this test neither stimulus was reinforced, and hence the test allowed us to assess responding to S+ and S– under conditions not contaminated by the disruption of the target response (i.e., nose poking) caused by the delivery of the reinforcer (i.e., water drinking). As can be appreciated in the bottom panel of Fig. 7, which shows the cumulative numbers of responses during this test, responding to S+ was stronger than responding to S–, a result indicating that the animal had in fact learned the stimulus discrimination.Footnote 17

Conclusion

The present article has introduced a low-cost, open-source version of the operant conditioning chamber, or Skinner box, built from two main components: an iPod Touch and an Arduino microcontroller. This device, which we named the ArduiPod Box, aims to provide those with an interest in animal learning and behavior, for research and instruction alike, with a very inexpensive alternative to the costly standard apparatus. Although this device is unlikely to find a niche in laboratories already equipped with standard apparatus for the study of animal behavior, it might be of great use to researchers struggling to set up a laboratory with limited startup funds, whether in smaller colleges and community colleges or in educational institutions in developing countries. Moreover, this device will make it easier for colleges and high schools that normally could not afford the expense of the experimental equipment to set up laboratories for teaching hands-on laboratory courses on the principles of animal behavior.

Low cost aside, there is another important reason to consider the use of this device in an animal learning laboratory: its virtually unlimited potential to be adapted or expanded for other uses (a potential that, incidentally, also comes with a very low price tag). For instance, whereas the standard Skinner box requires the installation of additional modules for the presentation of new stimuli, the ArduiPod Box can easily be reprogrammed in order to present new stimuli, including complex audiovisual stimuli such as pictures, video, and sounds (including music).Footnote 18 Likewise, registering new responses in the Skinner box requires additional manipulanda, which, once again, means having to install additional modules. By contrast, the ArduiPod Box could be reprogrammed to collect other, more complex responses, such as double taps or swipes on the iPod Touch’s display. In addition, the Arduino microcontroller, which in the current prototypes is merely in charge of controlling an actuator (i.e., the servomotor), could easily be connected to a variety of sensors to collect a wide range of information about the animal’s behavior, such as movement (e.g., using a pyroelectric infrared sensor or an accelerometer) or head entries (e.g., using an infrared beam sensor or an ultrasonic distance sensor), in addition to the most obvious option—namely, a traditional lever or button (e.g., using momentary push-button switches). Finally, the ArduiPod Box brings new possibilities, such as synchronization of data with the cloud (i.e., online storage systems such as Apple’s iCloud, Dropbox, or Google Docs), or even automatically sending alerts with relevant information via e-mail, instant message, or Twitter (something that could be very useful in settings involving continuous monitoring over extended periods). Implementing these or similar features in a traditional operant chamber system would be, if not impossible, a real challenge.
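
As a concrete illustration of this last point, the short sketch below shows one way the Arduino could report presses of a momentary push-button (i.e., a simple lever substitute) to the controlling device over the serial connection. It is offered purely as an example of the kind of extension described above; the pin assignment, the event code, and the debounce interval are all assumptions.

const int BUTTON_PIN = 2;          // assumed digital pin wired to a momentary push-button
int previousState = HIGH;          // with the internal pull-up, the unpressed state reads HIGH

void setup() {
  Serial.begin(9600);              // serial link to the controlling device
  pinMode(BUTTON_PIN, INPUT_PULLUP);  // enable the internal pull-up resistor
}

void loop() {
  int state = digitalRead(BUTTON_PIN);
  if (state == LOW && previousState == HIGH) {  // HIGH-to-LOW transition = a new press
    Serial.write(2);               // assumed event code for a "lever press"
    delay(20);                     // crude debounce interval
  }
  previousState = state;
}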

Certainly, the ArduiPod Box is not without problems, at least in its current version (as shown by the results of the experiment reported here). However, as an open-source device, the ArduiPod Box could be transformed dramatically in a short time, as it is improved, adapted, and modified to fit new uses by a thriving community of developers and makers, some of whom also hold a passion for the science of animal learning and behavior. Moreover, this device could encourage young researchers to adopt a DIY philosophy, investing time and effort to create their own experimental apparatus. With time, we might once again experience technological innovation in our research field, a field that enjoyed its most fertile moments during the 20th century thanks in large part to the tradition initiated by B. F. Skinner, a great scientist and an ingenious DIY maker.