Abstract
This paper discusses aspects of recruiting subjects for economic laboratory experiments, and shows how the Online Recruitment System for Economic Experiments can help. The software package provides experimenters with a free, convenient, and very powerful tool to organize their experiments and sessions.
1 Introduction
The subject pool is the most precious resource of the experimental economist. It is the source of our data and insights, and a necessary condition for our academic work. Dealing with it mindfully and cautiously is not only a requirement of human subject research ethics, but also a precondition of valid data and valid conclusions from an experimental study.
There are several reasons why we should care about recruitment procedures (and this list is not exhaustive):
1. To minimize unobserved or unwanted selection effects, and to fully control for selection criteria imposed for the purpose of the research project (for example, based on subject demographics or participation in previous studies);
2. To prevent multiple participations of the same person in an experiment, since this may invalidate the data and conclusions;
3. To minimize the direct costs of both recruitment and maintaining a subject pool; and
4. To ensure that we obtain exactly the right number of participants, not too many (which is costly in terms of turn-away fees) and not too few (which is even more costly due to a potentially insufficient number of observations, or required minimum group sizes).
In the early days of experimental economics, before the rise of the Internet and database-backed web applications, researchers used a wide range of procedures to recruit participants for economic experiments: sending troops of research assistants to approach people individually, or distributing hundreds of leaflets and signup lists across campus. With the advent of e-mail, long (and often inaccurate) e-mail lists were maintained, and participation was tracked in Excel lists to prevent multiple participations, with less than perfect results.
The Online Recruitment System for Economic Experiments (ORSEE) was introduced in 2003, one of the first of its kind.Footnote 1 ORSEE is a software tool that allows researchers to schedule experiment sessions and recruit participants. It tracks experiment participation and provides information about the subject pool and recruitment procedures of a study. The software aims to simplify the organization of economic laboratory experiments and to reduce its cost, to standardize the procedures of experiment organization, to depersonalize the experimenter–subject interaction, and to provide information and statistics about the subject pool and recruitment procedures.
In terms of the critical aspects of recruitment, using ORSEE immediately reduces the costs of subject recruitment (point 3 above) by saving valuable research assistant time, and makes recruitment more efficient. The database-backed system is also able to effectively prevent multiple participations in the same experiment (point 2), provided that each subject holds only one account in the system. (This can be enforced, for example, by requiring subjects to use their student ID or university e-mail address when registering for the subject pool.) A set of strict laboratory rules combined with a reputation system that tracks no-shows can effectively reduce the variability in the number of participants who show up for sessions (point 4).
The trickiest issue is selection (point 1). In the context of ORSEE, selection can take place at two different stages: (1) during registration for the subject pool database, and (2) during enrollment for sessions of a particular experiment. Recruitment software has only limited means to address the first stage of selection into the recruitment database. Such selection will largely depend on the laboratory’s efforts to attract students and other populations to create profiles in the system.Footnote 2 However, at the second level of selection into experiments, ORSEE can help to reduce selection biases in various ways.
Both stages of selection are likely affected by similar factors. While there is a large and growing literature on whether student samples are representative of broader populations, there is surprisingly little research on non-random selection of students into participation in laboratory experiments. Cleave et al. (2013) studied the total selection effect over both stages by first running trust games and lottery choices with all students in an introductory microeconomics course, and then examining whether the 12% of the initial population who later enrolled in the database and signed up for an experiment differed from the original population. They found no difference in risk or social preferences between those who signed up and the population. Similarly, Falk et al. (2013) examined whether students who donated to a charity were more likely to participate in laboratory experiments than students who did not, but also found no effect. Harrison et al. (2009) and Slonim et al. (2013) directly recruited for experiment sessions. Harrison et al. (2009) varied the spread of earnings advertised in the recruitment e-mail sent to potential subjects, and found that participants who signed up in response to a recruitment e-mail advertising a lower variance were less willing to take risks in the laboratory experiment. Slonim et al. (2013) obtained demographics and choices in a variety of games from a full sample of an undergraduate microeconomics class. The students were then invited to take part in economic experiments. The authors found strong demographic effects of selection into experiments: those with less income, more leisure time, more interest in economics, and who were more pro-social were more likely to participate in laboratory experiments.Footnote 3 Based on the literature and their own results, Slonim et al. (2013) made recommendations to reduce selection biases in experiments: high rewards or even compulsory course credits, short laboratory sessions, convenient laboratory locations, and providing as little information as possible about experiments when recruiting. They also discussed possible econometric ways to compensate or correct for selection effects.
ORSEE addresses selection to experiments in the following ways:
- Extensive participant statistics, which can be aggregated at various levels of the subject pool, allow the researcher to get an in-depth picture of the subject pool she is going to recruit from.
- The use of impersonal recruitment software, generic e-mail templates, an institutional laboratory e-mail address, a public generic name for experiments, etc. all aim to reduce experimenter–subject interaction during recruitment and to prevent biases due to too much information in the recruitment process.
- A sophisticated query tool allows researchers to counterbalance existing non-representativeness of the subject pool relative to the university population, as well as demographics-based enrollment biases.
- The ORSEE recruitment report facilitates ex-post assessment of the extent of selection bias through detailed comparative subject pool statistics at three levels: the complete subject pool, the pool of eligible subjects, and the subset of subjects who eventually enrolled and participated in the study. These data may be of use to control for biases econometrically (Slonim et al. 2013).
Using ORSEE provides another advantage in the evaluation process of experimental research: since it standardizes recruitment procedures to a certain extent, there is additional value (other than complying with ORSEE’s license) in mentioning the use of this software in the recruitment of experiment subjects, as researchers and reviewers familiar with ORSEE will know immediately how subject recruitment was organized. The wealth of information collected in the system and provided by the ORSEE recruitment report also facilitates the replication of experimental studies down to the details of the recruitment procedures. Researchers wishing to replicate a previous study can use the same selection conditions, the same messages, and follow the exact same timing as in the original study, provided both laboratories use ORSEE for recruitment.
That said, one guiding principle in the design of ORSEE is to make it as flexible as possible, and accommodate as many research procedures as possible under the general architecture. In the end, the researcher and the laboratory as an institution will be responsible for proper recruitment procedures and efforts to reduce selection biases. ORSEE provides many tools and features, but the maintenance of an active subject pool, the communication with subjects, and the establishment of rules and procedures are all in the hands of the researchers.
The remainder of this paper is devoted to a description of the general features of the software, a list of configuration options demonstrating the software’s flexibility, as well as some technical details and license information. An asterisk (*) indicates features that are introduced or enhanced in Version 3.0 of the software.
2 How does ORSEE work?
2.1 Installation
ORSEE is installed on a webserver, and is accessed by participants and researchers via the Internet. The software is written in PHP and uses a MySQL database back end. It is developed on Linux, but it runs in many other environments, including Windows (with Apache or IIS as a webserver) and MacOSX (with Apache). The software is easily installed by extracting the downloaded file into the webserver’s directory, importing the default MySQL database, and editing a short configuration file. All data and most configurations/customizations are kept in the local database, which is stored on the local ORSEE installation server.
ORSEE’s homepage is located at www.orsee.org. A full documentation of the software can be found in the ORSEE Wiki (www.orsee.org/wiki), and the software package can be downloaded from sourceforge.net/projects/orsee. Further, test systems of current software versions can be found on the ORSEE test website (www.orsee.org/test).
2.2 Registration for the subject pool
ParticipantsFootnote 4 register with the recruitment system by self-selecting into a sub-subject pool, accepting the laboratory’s general rules, and completing a (customizable*) profile form.Footnote 5 Once the registration is confirmed (which ensures validity of the participant’s e-mail address), the participant can sign up for studies she is eligible for.
The public part of the recruitment system webpage can also include a page defining general rules for the laboratory and experiment participation (e.g. a no-show reputation scheme which stipulates exclusion from further invitations after a certain number of no-shows), a page listing frequently asked questions (FAQs), and pages for privacy policy, legal notice, or contact details of the laboratory. A public calendar displays upcoming experiments (with only ‘public’ experiment names and few session details exposed). Any of these public pages can be hidden* or shown by a switch in the system configuration.
2.3 Setting up an experiment
Researchers can log into the system using their username and password.Footnote 6 Current and past experiments can be searched* by experimenter name or experiment classes (one or multiple tags given to an experiment, e.g. ‘public good game’ or ‘cognitive reflection test used’). In addition, the experiment list contains information on, and quick access links* to, upcoming sessions.
A researcher creates an experiment by stating an internal (meaningful) and a public (potentially generic, e.g. Study2015-05) name for the study, the experiment type, and further details (e.g. the experimenters, one or more* ‘experiment classes’, ethics approval details* for the experiment,Footnote 7 a budget* to deduct participant payments from, or a ‘public comment’* which is displayed next to the experiment name on the session enrollment page, etc.). Experiment types can be laboratory experiments, online surveys, or external Internet experiments (the latter two types are currently under development). The ‘experiment main page’ collates all functions that can be executed for a particular study. For laboratory experiments, it is divided into three parts. The first part lists basic properties of the experiment, the second part is devoted to the organization of experiment sessions, and the last part deals with participants and recruitment communication.
In the session part, the experimenter can view the status of the experiment sessions, access participant lists, and edit existing sessions or add new sessions. When adding a session, the experimenter determines the date, time, and duration of the session, the target number of participants and how many participants to over-recruit (to compensate for potential no-shows), the registration time window, and when the session reminder e-mail will be sent. The experimenter can also add a session-specific public comment* which is displayed next to the session on the experiment enrollment page. Upon creation of the session, the system checks for overlaps with other bookings in the same laboratory and displays a warning if that is the case. Sessions can be ‘planned’ (no enrollment possible yet), ‘live’ (enrollment possible), ‘completed’ (participation data filled in), or ‘balanced’ (payment information filled in).* After a session has been created, a ‘copy’ button* can be used to create a copy of that session, to avoid having to fill in all the details again.
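At its core, the overlap check is a simple interval comparison: two bookings in the same laboratory collide if each starts before the other ends. The following is a minimal sketch of such a check, assuming a hypothetical sessions table with laboratory_id, starttime, and endtime columns (ORSEE’s actual schema and column names may differ).

```php
<?php
// Minimal sketch of a booking overlap check (hypothetical schema; ORSEE's
// actual table and column names may differ).
function sessionOverlaps(PDO $db, $labId, $newStart, $newEnd)
{
    // Two intervals overlap iff each one starts before the other ends.
    $stmt = $db->prepare(
        'SELECT COUNT(*) FROM sessions
          WHERE laboratory_id = :lab
            AND starttime < :new_end
            AND endtime   > :new_start'
    );
    $stmt->execute(array(
        ':lab' => $labId, ':new_start' => $newStart, ':new_end' => $newEnd,
    ));
    return $stmt->fetchColumn() > 0;
}

// Usage (placeholder connection credentials):
// $db = new PDO('mysql:host=localhost;dbname=orsee_example', 'user', 'pass');
// if (sessionOverlaps($db, 1, '2015-05-12 10:00:00', '2015-05-12 11:30:00')) {
//     echo "Warning: this session overlaps with another booking in this laboratory.\n";
// }
```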
2.4 Selecting eligible participants
After sessions have been created, the experimenter can assign participants to an experiment.Footnote 8 The sophisticated subject pool query tool* allows the experimenter to construct any kind of query, from simple to complex, by including additional conditions. Queries allow conditions to be placed on participant profile data (demographics etc.), participation history, and other participant-related data collected by the system.Footnote 9 Sub-queries can be combined by AND or OR statements and bracketed.Footnote 10 The results list that is displayed after submitting the query can be restricted to a randomly selected subset of all participant profiles that match the query. From the results, the researcher can then assign all or a subset of the returned participants to the experiment, thereby making them eligible to sign up for ‘live’ sessions of the experiment. Once participants are assigned, similar queries can be run for the set of assigned participants in order to deassign some or all of them.
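To give a flavor of the condition logic the query tool supports (compare the example in footnote 10), the sketch below shows how such an eligibility query might translate into SQL against a hypothetical, simplified schema with participants and participations tables; ORSEE’s actual query builder and table layout may differ.

```php
<?php
// Sketch only: a footnote-10 style eligibility query expressed as SQL against
// a hypothetical schema (participants and participations tables with
// field_of_study, gender and noshow_count columns). Not ORSEE's actual schema.
$sql = "
    SELECT p.participant_id
      FROM participants p
      LEFT JOIN participations x ON x.participant_id = p.participant_id
     WHERE ( (p.field_of_study = 'business' AND p.gender = 'm')
          OR (p.field_of_study = 'arts'     AND p.gender = 'f') )
       AND p.noshow_count = 0
     GROUP BY p.participant_id
    HAVING COUNT(x.experiment_id) > 1   -- participated in more than 1 experiment
       AND COUNT(x.experiment_id) < 5   -- ... and in fewer than 5 experiments
     ORDER BY RAND()                    -- draw a random subset of matching profiles
     LIMIT 100";
// $db = new PDO('mysql:host=localhost;dbname=orsee_example', 'user', 'pass');
// $eligible = $db->query($sql)->fetchAll(PDO::FETCH_COLUMN);
```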
Any queries that result in the assignment or deassignment of participants to the experiment are saved* by the system. The queries can be reused* for further assignments (e.g. of additional random subsets in case enrollments are slow), and these queries will be part of the ORSEE recruitment report* (see below in Sect. 2.7). Any executed query can also be ‘activated permanently’*, such that if new participants who register for the system match the ‘permanently active’ query, then they are automatically assigned to the experiment and receive an invitation e-mail if there are ‘live’, unfilled sessions.
2.5 Session enrollment
Once participants are assigned, a (customizable) invitation e-mail can be sent to eligible participants for the experiment. E-mailing stops automatically* when all ‘live’ sessions have been filled. The invitation e-mail includes a list of upcoming sessions and a link to a session enrollment webpage, where the participant can sign up for an experiment session on a particular date and time. This webpage also has a mobile version* when accessed from a mobile device. The participant will receive a confirmation e-mail, and another reminder e-mail shortly before the session, both of which can also be customized to the experiment. At any time, participants can see a history of their prior experiment participation, update the data in their profile, or unsubscribe from receiving further invitations.
2.6 Monitoring enrollment and updating the database
The enrollment status of ‘live’ sessions (number of required and enrolled participants) can be monitored on the main page for the experiment or on the internal calendar. When the registration period for a session elapses, the experimenter receives an e-mail about the session’s registration status, including a list of enrolled participants in PDF form.
After the researcher has conducted an experiment session, information about subjects’ participation status*Footnote 11 (and, if enabled, payment information*) can be added to the system. The system also allows for ‘bulk completion’* of a session list. Once participation information has been filled in, the researcher changes the session status to ‘completed’ or ‘balanced’. Then the show-up data is transferred into the no-show reputation system, and the payoff data* is included in the budget reports. Participants may be automatically excluded from further invitations based on their participation history (i.e. number of no-shows).
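As an illustration of such an exclusion rule (not ORSEE’s actual implementation; table, column, and status names are hypothetical), a no-show threshold could be enforced with an update along the following lines.

```php
<?php
// Illustrative sketch of a no-show exclusion rule. Schema, column names and
// status values are hypothetical; ORSEE's actual implementation may differ.
$db = new PDO('mysql:host=localhost;dbname=orsee_example', 'user', 'pass');
$maxNoShows = 2; // e.g. exclude participants with more than two no-shows

$stmt = $db->prepare(
    "UPDATE participants
        SET status = 'excluded'
      WHERE participant_id IN (
            SELECT participant_id
              FROM participations
             WHERE participation_status = 'noshow'
             GROUP BY participant_id
            HAVING COUNT(*) > :max)"
);
$stmt->bindValue(':max', $maxNoShows, PDO::PARAM_INT);
$stmt->execute();
```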
2.7 ORSEE recruitment report
At any point, but most likely after running all experiment sessions, the researcher can generate an ORSEE recruitment report*. The report lists the main experiment parameters, the sessions and the number of participants in each session, as well as all participant assignment/deassignment queries and results. In addition, the report provides information on demographics and experiment experience at three levels side-by-side: (1) the whole ORSEE sub-subject pool from which participants were recruited, (2) the subset of participants eligible for the experiment, and (3) the subset of participants who eventually participated in the experiment. The report provides some basis to judge potential selection effects, and can be attached as supplementary material to a journal paper submission, providing information about recruitment and session organization.
2.8 Maintaining the subject pool
The researcher can manually add participants to the database, or search the current participant pool or past participants using the same query tool as described above for assigning participants to experiments. ‘Bulk actions’* (like mass e-mails, status changes, etc.) can be applied to all participants in the search query result set or a selected subset. The ORSEE administrator may also search for inactive participants (e.g. no experiment participation within the last 6 months) and send them an e-mail requesting that they update their profile page within a certain timeframe. Profiles that have not been updated can then be unsubscribed in one batch.
Participant profile pages include complete histories of experiment participations. A profile can be manually edited. Participants can be unsubscribed (upon request or due to inactivity) or officially excluded (e.g. based on no-show exclusion rules)Footnote 12 from further experiment invitations.Footnote 13
A recurring issue with subject pools is multiple profile registrations by the same participant. ORSEE allows experimenters to require any participant profile field to be unique* within the database, such that a new participant profile cannot be created when a profile with the same value in that field already exists. Examples include the student ID (which can also be easily verified at the lab check-in) or the e-mail address.
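Conceptually, such a uniqueness requirement corresponds to a unique index on the respective profile field. A minimal sketch at the database level, assuming a hypothetical participants table with a student_id column (ORSEE itself enforces the requirement within the application):

```php
<?php
// Sketch: enforcing uniqueness of a profile field with a unique index.
// Table and column names are hypothetical.
$db = new PDO('mysql:host=localhost;dbname=orsee_example', 'user', 'pass');
$db->exec("ALTER TABLE participants ADD UNIQUE KEY uniq_student_id (student_id)");

// Attempting to create a second profile with the same student ID now fails
// with a duplicate-entry integrity constraint violation.
```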
A comprehensive statistics section provides summary data about the subject pool. The statistics page includes tables and figures on the demographic data obtained from subjects, and the general experiment participation pattern. These statistics can be ‘browsed’*, that is, the tables and figures can be broken down for any subset of participants defined by particular demographics.
2.9 Other features and functions for researchers/administrators
Administrators (with the respective role and privileges) can at any time add new researcher/administrator accounts or disable accounts of inactive researchers. Researchers can be assigned to roles in the system, which are defined by detailed access rights to the functions of the system.
The ‘internal’ experiment calendar provides information about any laboratory bookings, including the times and enrollment status of laboratory sessions. Experiment and session edit pages as well as participant lists can also be directly accessed from the calendar. A researcher or administrator may also add any other laboratory bookings, or information about experiment sessions which are not organized within ORSEE (e.g. when sessions in the same laboratory are conducted with a separate course-credit subject pool).
When the feature is enabled, ORSEE can track payments to participants.* Various ‘currencies’ (such as cash, Amazon gift certificates, or course credit) can be created. Experiments and sessions can be charged to different budgets, and financial reports show all payments allocated to a particular budget.
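A financial report of this kind essentially aggregates payments by budget and currency. A minimal sketch against a hypothetical payments table (not ORSEE’s actual schema):

```php
<?php
// Sketch of a budget report: total payouts per budget and currency.
// The payments table and its columns are hypothetical.
$db = new PDO('mysql:host=localhost;dbname=orsee_example', 'user', 'pass');
$report = $db->query(
    "SELECT budget_id, currency, SUM(amount) AS total_paid, COUNT(*) AS n_payments
       FROM payments
      GROUP BY budget_id, currency"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($report as $row) {
    printf("Budget %s: %.2f %s across %d payments\n",
           $row['budget_id'], $row['total_paid'], $row['currency'], $row['n_payments']);
}
```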
The Downloads section allows access to general and experiment-specific files that have been uploaded by users. For example, general files can include the ORSEE user manual or the laboratory’s rules for experimenters, while experiment-specific files can include instructions, experiment software programs, or even raw data files.Footnote 14 All actions of participants and experimenters in ORSEE (and all regular tasks run by the system) are recorded to the database. These can be inspected in the statistics/logs section, which also includes access statistics from the webserver logs.
3 Customization options
ORSEE is intended to be a flexible system, adapting to different recruitment procedures and allowing customization at various levels. In terms of configuration and customization, upon installation or at a later time, one can:
- Add multiple laboratories;
- Define different sub-subject pools (e.g. students, professionals), which participants can select upon registration, and subsequently be presented with a pool-specific participant profile form;
- Define roles (administrator, researcher, research assistant, visitor, etc.) by assigning access rights to the various functions of the system to each role;
- Create ‘external’ experiment types that are mapped to the ‘internal’ experiment types of a laboratory experiment, online survey, and Internet experiment;Footnote 15
- Determine whether subjects access their account and enrollment page via a token included in the URL or by entering a username and password*;
- Customize the participant profile form and the data fields in that form;*
- Define different participant statuses and participation statuses;
- Enable/disable an add-on for the tracking of participant payments;*
- Enable/disable some automatic processes (e.g. exclusion from the database based on no-shows) and define rules for them;
- Add languages other than the pre-existing ones (English and German), import languages created by other users, and modify any expressions shown in the public or administration part of the webpage (including e-mail templates, etc.);
- Customize templates for all e-mails that the system sends out;
- Add or edit Frequently Asked Questions and the answers provided on the public webpage;
- Customize the layout of the webpage (both in simple ways by just changing colors* and in more sophisticated ways by editing style templates), the content of all public pages, as well as which pages are shown;
- Define default values for many form fields within the system; and
- Import data from old ORSEE versions into a new version.*
4 Data security
In order to protect the data collected in the application, ORSEE uses a variety of measures. To prevent SQL injection attacks, ORSEE uses a database/SQL framework with ‘parametrized queries’, in which queries and their parameters (which often depend on user input) are transferred separately to the database server. This way, queries cannot be manipulated in a harmful way. In addition, any user-provided input entered in the publicly exposed part of ORSEE is escaped and sanitized before being processed, in order to prevent any malicious code from being executed.
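For concreteness, the difference between concatenating user input into a query string and using a parametrized query can be sketched as follows (a generic PHP/PDO illustration; table and column names are hypothetical and the snippet is not taken from ORSEE’s code base).

```php
<?php
// Generic illustration of a parametrized (prepared) query with PDO.
// Table and column names are hypothetical, not ORSEE's actual schema.
$db = new PDO('mysql:host=localhost;dbname=orsee_example', 'user', 'pass');

// Unsafe: user input concatenated directly into the SQL string.
// $db->query("SELECT * FROM participants WHERE email = '" . $_POST['email'] . "'");

// Safer: the SQL and the user-supplied value travel to the server separately.
$stmt = $db->prepare('SELECT * FROM participants WHERE email = :email');
$stmt->execute(array(':email' => $_POST['email']));
$profile = $stmt->fetch(PDO::FETCH_ASSOC);
```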
Passwords of researchers and students are stored only as one-way cryptographic hashes, using current hashing technology. ORSEE allows administrators to define minimum password strength requirements for participant and researcher passwords using regular expressions. The researcher/administrator can set up ORSEE such that participants access their profile either using a randomly generated access token in a URL (ORSEE’s ‘traditional method’, similar to how airlines use booking codes), or only with a username and password. For previous users of ORSEE, the new version provides a ‘migration’ setting in which token-URLs still work for initial authentication, but participants are required to choose a password upon their next access.
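In PHP, one-way password storage and a regular-expression strength check can be sketched as follows; this is a generic illustration using PHP’s built-in password API, not a quote from ORSEE’s code, and the strength pattern is an example only.

```php
<?php
// Generic illustration of one-way password storage in PHP (not ORSEE's code).
$newPassword = 'correct-horse-42'; // e.g. submitted via the registration form

// Example strength rule (hypothetical): at least 8 characters, letters and digits.
if (!preg_match('/^(?=.*[A-Za-z])(?=.*\d).{8,}$/', $newPassword)) {
    die('Password does not meet the minimum strength requirements.');
}

// Store only a one-way hash, never the plain-text password.
$storedHash = password_hash($newPassword, PASSWORD_DEFAULT);

// At login, verify the submitted password against the stored hash.
$submittedPassword = 'correct-horse-42';
echo password_verify($submittedPassword, $storedHash) ? "login ok\n" : "login failed\n";
```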
Participants are responsible for protecting their own profile and data by keeping their individualized token-URL or their username and password (depending on which participant authentication method is used) secret. Administrators and experimenters have access to a much larger set of data, and should choose their passwords with care.
5 Technical details and license
ORSEE is written in PHP and uses a MySQL database backend.Footnote 16 ORSEE assumes some standard PHP packages to be installed (namely: php5-gd, php5-mysql, php5-mbstring), and needs Webalizer to be installed if web log analysis is required. Other than that, ORSEE comes shipped with all the packages that it may require (e.g. jQuery, functions for e-mail communication and PDF production, etc.).
For software bugs and feature requests, the preferred channels of communication are the respective trackers on sourceforge.net/projects/orsee. For other questions, please use e-mail.
The Online Recruitment System for Economic Experiments is ‘Citeware’ and is available under a proprietary open source license. ORSEE is free of charge. However, researchers using the software to organize their experiments must acknowledge the software’s use by an appropriate citation of this article.
Notes
At the time of writing, ORSEE is used by more than 150 universities world-wide. By now, there are a number of other web-based recruitment software packages, including eRecruit, exLab (inactive), hRoot, and MooreRecruiting/CASSELWeb3. SonaSystems is a very popular recruitment tool in experimental psychology.
With respect to selection into the pool, Krawczyk (2011) found that emphasizing monetary payments attracted more students to sign up in an ORSEE database. Other studies suggested that paying volunteering subjects might increase their representativeness (Rush et al. 1978; Wagner and Schubert 1976). Some experimental psychologists argued that using undergraduate subject pools based on course credit averts non-representativeness due to self-selection of volunteers (Dixon 1978; Jackson et al. 1989; Jung 1969; Rosenthal and Rosnow 1975).
Participation history and details of recruitment procedures may also play a role in selection. Casari et al. (2007) observed that participants who made more money in a common-value auction experiment were more likely to sign up for a further auction experiment. Some studies in psychology have shown that a description and even just the name of an experiment can have an impact on subjects’ self-selection to experiments (Jackson et al. 1989; Senn and Desmarais 2001; Saunders et al. 1985; Silverman and Margulis 1973). Coutts and Schneider (1975) found no overall effects of the gender of the experimenter or the subject. However, male volunteers were more likely to show up when recruited by a male experimenter, while Senn and Desmarais (2001) observed that both male and female subjects were more likely to sign up for a sex research study when recruited by a male person.
Definition of terms used throughout this paper: a ‘session’ is defined as running an experiment at a particular time at a particular location. A ‘researcher’ or ‘experimenter’ is a person who conducts and/or administrates an experiment. A ‘subject’/‘participant’ is a person who is recruited to participate in an experiment. Using ORSEE, experimenters schedule (often multiple) sessions for laboratory experiments and invite subjects to participate. Invited subjects may enroll for one of the experiments’ sessions in order to participate.
ORSEE allows the complete customization of the participant profile form. Any field can be added, in various question types. Fields can be conditional on the sub-subject pool. A form template allows the customization of the form layout. Any profile data fields are automatically included in participant query builders and in participant lists (see below).
Activities described below can be centralized by assigning the respective user privileges. For example, an administrator could take over tasks such as scheduling sessions in the laboratory, while experimenters are granted viewing rights only, without rights to add or edit.
ORSEE allows the tracking of ethics approval of studies to be enabled or disabled. If enabled, the administrator/researcher can enter the ethics approval details (e.g. approval number) and expiry date, and a warning will appear for upcoming sessions if no approval has been entered or the approval has expired. While ethics regulations in some laboratories/universities may allow experimenters to run the recruitment database without formal ethics approval (since enrolling merely implies an expression of interest in receiving invitations, and ‘recruitment’ only starts with sending invitations for a particular experiment), ethics regulations in other laboratories may require ethics approval for the establishment of the database itself. In any case, experience shows that the existence of a structured, standardized way to recruit participants with ORSEE makes it easier for researchers to communicate with ethics panels about subject recruitment.
In ORSEE, eligibility is defined at the experiment level. If researchers need to run different sessions with different participant characteristics (e.g. some with all male and some with all female subjects), they can create different experiments for those sessions and mutually exclude participants. If mixed sessions are needed (e.g., half males and half females in a session), the researcher can create two experiments (one for males, one for females), and then schedule parallel half-sized sessions for each experiment.
Example query: select female participants who have not yet participated in a ‘public good’ experiment.
Example query: select all participants who are [(business students AND male) OR (arts students AND female)] AND have zero no-shows AND participated in more than 1 experiment AND participated in less than 5 experiments previously.
Participation statuses can be freely defined. Examples include no-show, turned-away, late-show, participated, etc.
Further ‘participant statuses’* can be configured. For example, one could add a status ‘grey-listed’, the holder of which is not officially excluded but will also not be invited for further studies.
Profile records cannot be deleted from the system, as the internal ID and existence of an entry is necessary for database integrity. However, the fields in a participant’s profile form can be emptied if necessary.
After recent data-fabrication cases in social psychology and a wide discussion about data fabrication, data selection, and partial reporting issues in experimental communities both in psychology and in economics, some laboratories now require researchers to upload their raw data files to a data repository immediately after each session. ORSEE can facilitate such policies.
As an example, there could be three ‘external’ types of experiments for which participants can express their interest to be invited: laboratory experiments, eye-tracker experiments, and online studies. In the ORSEE configuration, the external types laboratory experiments and eye-tracker experiments are mapped to the internal type ‘laboratory’, because both are organized like laboratory experiments. The external type online studies is mapped to both ‘online-survey’ and ‘internet-experiment’, since participants who sign up for this external type may be invited to either internal type of experiment.
While ORSEE is developed on Linux, it has been successfully installed on Windows servers or MacOSX operating systems in a number of laboratories and universities. Windows may require some minor adaptations, e.g. a replacement for Linux’s cron daemon in order to execute regular tasks.
References
Casari, M., Ham, J., & Kagel, J. (2007). Selection bias, demographic effects and ability effects in common value auctions experiments. American Economic Review, 97(4), 1278–1304.
Cleave, B., Nikiforakis, N., & Slonim, R. (2013). Is there selection bias in laboratory experiments? The case of social and risk preferences. Experimental Economics, 16(3), 372–382.
Coutts, L. M., & Schneider, F. W. (1975). Recruitment of experimental subjects. Perceptual and Motor Skills, 41, 142.
Dixon, P. N. (1978). Subject recruitment incentives, personality factors, and attitudes towards experimentation in a simultaneous intentional-incidental learning task. Journal of General Psychology, 99, 99–105.
Falk, A., Meier, S., & Zehnder, C. (2013). Do lab experiments misrepresent social preferences? The case of self-selected student samples. Journal of the European Economic Association, 11(4), 839–852.
Harrison, G., Lau, M., & Rutström, E. (2009). Risk attitudes, randomization to treatment, and self-selection into experiments. Journal of Economic Behavior and Organization, 70, 498–507.
Jackson, J. M., Procidano, M. E., & Cohen, C. J. (1989). Subject pool sign-up procedures: a threat to external validity. Social Behavior and Personality, 17, 29–43.
Jung, J. (1969). Current practices and problems in the use of college students for psychological research. The Canadian Psychologist, 10, 280–290.
Krawczyk, M. (2011). What brings subjects to the lab? Experimental Economics, 14(4), 482–489.
Rosenthal, R., & Rosnow, R. (1975). The volunteer subject. New York: John Wiley & Sons.
Rush, M. C., Phillips, J. S., & Panek, P. E. (1978). Subject recruitment bias: the paid volunteer subject. Perceptual and Motor Skills, 47, 443–449.
Saunders, D. M., Fisher, W. A., Hewitt, E. C., & Clayton, J. P. (1985). A method for empirically assessing volunteer selection effects: recruitment procedures and responses to erotica. Journal of Personality and Social Psychology, 49, 1703–1712.
Senn, C. Y., & Desmarais, S. (2001). Are our recruitment practices for sex studies working across gender? The effect of topic and gender of recruiter on participation rates of university men and women. Journal of Sex Research, 38, 111–117.
Silverman, I., & Margulis, S. (1973). Experiment title as a source of sampling bias in commonly used ‘subject pool’ procedures. The Canadian Psychologist, 14, 197–201.
Slonim, R., Wang, C., Garbarino, E., & Merrett, D. (2013). Opting-in: participation bias in economic experiments. Journal of Economic Behavior and Organization, 90, 43–70.
Wagner, M. E., & Schubert, D. S. P. (1976). Increasing volunteer representativeness by recruiting for credit or pay. The Journal of General Psychology, 94, 85–91.
Acknowledgments
I thank Johannes Hoelzemann for research assistance, Ashneil Roy for support in the programming of some of the user interfaces, and ORSEE users all over the world for their comments and support. Financial support from the Australian Research Council and the UNSW Experimental Business Research Laboratory is gratefully acknowledged.