Abstract
Research has shown that the use of digital technologies in the personnel selection process can have both positive and negative effects on applicants’ attraction to an organization. We explain this contradiction by specifying its underlying mechanisms. Drawing on signaling theory, we build a conceptual model that applies two different theoretical lenses (the instrumental-symbolic framework and justice theory) to suggest that perceptions of innovativeness and procedural justice explain the relationship between an organization’s use of digital selection methods and employer attractiveness perceptions. We test our model in two studies: an experimental vignette study among potential applicants (N = 475) and a retrospective field study among actual job applicants (N = 335). With the exception of the assessment stage in Study 1, positive indirect effects in both studies indicated that applicants perceive digital selection methods as more innovative. Study 1 also revealed a negative indirect effect, with potential applicants perceiving digital selection methods in the interview stage as less fair than less digitalized methods; this effect was not significant for actual job applicants in Study 2. We discuss theoretical implications for the applicant reactions literature and offer recommendations for human resource managers to make use of positive signaling effects while reducing potential negative signaling effects linked to the use of digital selection methods.
Digital selection methods are playing an increasingly important role in human resource departments around the world (Ryan et al., 2015; Stone et al., 2013; van Esch et al., 2019; Williams et al., 2021; Woods et al., 2020). Many organizations screen and evaluate applicants’ social network profiles (e.g., LinkedIn) instead of asking them to send their curriculum vitae (CV), or they use web-based tests and video interviews instead of arranging on-site tests and face-to-face meetings (Tippins, 2015); some early-adopting organizations have even started experimenting with chatbots to replace human interviewers (Moran, 2018). The introduction of digital technologies in personnel selection processes has the potential to help organizations select the best talent from increasingly large and sometimes global pools of applicants (Stone et al., 2015). By facilitating the efficient processing of large numbers of applicants, digital technologies can potentially save both money and time for organizations, as well as applicants (McCarthy et al., 2017; Stone et al., 2015).
However, another, often unintended, effect of digital selection methods may be their influence on applicants’ perceptions of the organization itself (Ployhart, 2006; Stone et al., 2013), particularly their judgments of its attractiveness (Bauer et al., 2004). From a signaling theory perspective (Bangerter et al., 2012; Connelly et al., 2011; Spence, 1973), digital technologies in selection processes can be assumed to send signals about an organization during the pre-entry phase. In support of this notion, digital technologies in selection processes have been shown to influence applicants’ impressions and, as a result, their attraction to the organization as a potential employer (Chapman et al., 2005; McCarthy et al., 2017; Uggerslev et al., 2012). If applicants perceive these signals to be negative, they might lose interest in the organization and eventually self-select out of the recruitment process (Hausknecht et al., 2004). Illustrating this risk, a study by LinkedIn showed that 83% of interviewed applicants changed their minds about an organization that they once liked after they developed negative impressions during the selection process (Gager et al., 2015).
To date, we know that the signals that are sent by digital technologies in the personnel selection process (Roulin & Bangerter, 2013; Straus et al., 2001) can have both positive and negative effects on applicants’ attraction to an organization (Chapman et al., 2003; McCarthy et al., 2017). However, our understanding of the mechanisms that link the use of digital technologies in the selection process to applicants’ attraction to the organization is still limited (Breaugh, 2013; McCarthy et al., 2017). Signaling-based models in personnel selection research have been criticized for their lack of conceptual specifications and empirical testing regarding the specific inferences that people draw from digital technologies (Breaugh, 2008; Celani & Singh, 2011; Jones et al., 2014). Understanding these inferences is particularly important to explain why digital technologies simultaneously send both positive and negative signals.
Two theoretical perspectives offer indications of potential mechanisms. Regarding positive signals, (1) the instrumental-symbolic framework presents innovativeness as one of the most important signals for increasing employer attractiveness (Lievens & Highhouse, 2003). Digital technologies constitute recent innovations (Parasuraman, 2000), which implies that the use of digital selection methods can send a positive signal about the innovativeness of the organization. In contrast, previous research on personnel selection points to negative signals through the theoretical lens of (2) procedural justice (e.g., Gilliland, 1993, 1994). As digital technologies reduce personal interactions (McCarthy et al., 2017), are more standardized (Chapman & Webster, 2001), and raise issues regarding privacy (e.g., Bauer et al., 2006), applicants may perceive selection processes based on digital technologies to be less procedurally fair.
In sum, drawing on signaling theory (Bangerter et al., 2012; Connelly et al., 2011; Spence, 1973), this research aims to clarify the effect of digital selection methods on applicants’ perceptions of an organization’s attractiveness as a prospective employer by examining potentially positive effects via innovativeness and potentially negative effects via procedural justice. In doing so, we aim to contribute to the applicant reactions literature in three distinct ways. First, we address recurring calls to keep pace with the technological changes in personnel selection practices (Anderson, 2003; McCarthy et al., 2017; Woods et al., 2020) by comparing applicants’ preferences for selection methods that incorporate recent digital technologies with their preferences for traditional methods that are not digitalized or are characterized by a low degree of digitalization. Second, by applying signaling theory as a basis for explaining how digital selection methods affect applicants, we expand the theoretical lens of the applicant reactions literature (McCarthy et al., 2017), which has mainly focused on Gilliland’s (1993) organizational justice theory-based framework. Third, we also amend the lens of organizational justice theory (Gilliland, 1993) by addressing recent calls to examine the mechanisms that link the use of digital selection methods to applicants’ attraction to organizations (Harold et al., 2016). In deriving mechanisms based on two different theoretical lenses—the perspectives of innovativeness and procedural justice—to explain both positive and negative signals, we specifically introduce the instrumental-symbolic framework (Lievens & Highhouse, 2003) to extend research on applicant reactions to digital technologies in personnel selection methods.
In addition to these theoretical contributions, this research provides practitioners with a better understanding of the specific signals that they send by applying digital selection methods. Understanding these signals can help human resource managers limit the potential negative effects arising from digital selection methods while highlighting their positive effects on employer attractiveness perception to attract and retain the most talented applicants.
We test our model by analyzing potential applicants in an experimental vignette study and actual applicants in a field study, thereby combining the enhanced control of experimental designs with the greater generalizability of field studies (e.g., Anderson et al., 1999; Bauer et al., 2006; Ryan & Ployhart, 2000). In the vignette experiment, we followed the suggestion of Uggerslev et al. (2012) to separately examine the different stages (i.e., the application and screening, assessment, and interview stages) of the entire personnel selection process. We thereby aim to examine whether the proposed mechanisms are present in each of the three stages.
Theoretical Background and Hypotheses Development
Digital Selection Methods
Digital selection methods can be defined as personnel selection methods that are mediated by digital communication technologies (Woods et al., 2020), such as social media, mobile media, the Internet, analytics, cloud, artificial intelligence, or algorithmic decision making (Vial, 2019). Even though the use of personnel selection methods that are to some degree considered digital is now common for most companies around the world (Nikolaou et al., 2019), we can differentiate the degree to which these methods are digitalized. The degree of digitalization varies based on the number of digital communication technologies used in the selection method, as well as on whether digital technologies are only facilitators of a traditional selection method or are at the core of the design of a selection method (Landers & Marin, 2021). For instance, most companies already rely on Internet-based online application systems, where candidates can upload their resumes (Woods et al., 2020). However, this method can be considered less digital than a requirement to upload a link to a professional social media profile (e.g., LinkedIn), which includes similar content to a standard resume, because the latter incorporates two digital communication technologies (social media and Internet) instead of only one. Moreover, online application systems use digital technologies solely to facilitate the transfer of the resume from the applicant to the organization, while digital technologies are at the core of the design of social media profiles.
Generally, organizations use personnel selection methods to assess whether applicants have the knowledge, skills, abilities, and other characteristics required to perform well in the position for which they applied (Nikolaou et al., 2019; Ryan & Ployhart, 2000). Most organizations use more than one selection method in the selection process before they make their decisions. Hence, the typical personnel selection process covers several steps, including the application and screening stage, the assessment stage, and the interview stage (Stone et al., 2013). For each of these stages, organizations can choose from a variety of methods that differ not only in terms of their content but also in terms of their degree of digitalization. In the application and screening stage, for instance, traditional selection methods with low degrees of digitalization include CVs and cover letters (resumes), personal references, or biodata (Steiner & Gilliland, 1996). Selection methods with high degrees of digitalization include blockchain resumes, analyses of social media profiles (Hartwell & Campion, 2020; Ingold & Langer, 2021; Tews et al., 2020), or video resumes (Hiemstra et al., 2012). Traditional assessment methods with a low degree of digitalization include work sample tests conducted on the premises of the organization, written (paper-and-pencil) cognitive ability tests, personality tests, situational judgment tests, or assessment centers (Macan et al., 1994; Ryan & Ployhart, 2000; Steiner & Gilliland, 1996). On the other hand, online work sample simulations (Tippins, 2015), gamified online assessments (Armstrong et al., 2016; Buil et al., 2020), web-based cognitive ability tests (Potosky & Bobko, 2004), computational personality assessments (Stachl et al., 2021), or online-based situational judgment tests (Woods et al., 2020) can be considered assessment methods with a high degree of digitalization. 
Traditional structured or unstructured face-to-face interviews (Smither et al., 1993) are selection methods used at the interview stage with the lowest degree of digitalization. Telephone interviews (Bauer et al., 2004) can be considered more digital than face-to-face interviews but still have a low degree of digitalization. Interview selection methods with a higher degree of digitalization include videoconferences, which make use of the Internet and video processing software (Basch et al., 2020). More recently, organizations have increasingly used interview methods with an even higher degree of digitalization, such as asynchronous job interviews (Hiemstra et al., 2019) or digital interviews with a virtual chatbot interviewer (Langer et al., 2019).
A Signaling Perspective on Applicants’ Perceptions of Digital Selection Methods
Spence (1973) introduced signaling theory as a general framework to explain how two parties with imperfectly aligned interests and incomplete information cooperate with each other. The framework has been applied in various management disciplines, such as strategic management, entrepreneurship, organizational behavior (see Connelly et al., 2011), and human resource management (particularly in recruitment and selection; e.g., Jones et al., 2014; Roulin & Bangerter, 2013; Wilhelmy et al., 2018).
In the case of applicant reactions to selection processes, signaling theory suggests that applicants use the information they receive about an organization as indicators of organizational characteristics (Bangerter et al., 2012; Ehrhart & Ziegert, 2005; Ryan et al., 2000; Rynes et al., 1991). For example, Turban (2001) found that individuals use attributes of recruitment and selection activities, such as the design of or methods used in the selection process, as signals of overall organizational characteristics. Based on these signals, applicants, who typically have little information about the recruiting organization (Rynes et al., 1991), form impressions of the organization as a potential employer (Celani & Singh, 2011; Suazo et al., 2009). These impressions or inferences are signaling mechanisms that directly influence signaling outcomes, i.e., job seekers’ attitudes toward an organization, which in turn affect their choices (Cable & Turban, 2003; Jones et al., 2014; Rynes et al., 1991).
To determine the specific signaling mechanisms, i.e., how the signals provided by digital selection methods influence perceptions of employer attractiveness, we draw upon research on employer image and procedural justice. Applying these two theoretical lenses, we hypothesize that innovativeness is a positive signaling mechanism and procedural justice is a negative signaling mechanism. Figure 1 depicts the resulting theoretical model.
Innovativeness
Lievens and Highhouse (2003) introduced the instrumental-symbolic framework, which posits that applicants form an image of an organization as an employer based on two types of information conveyed to them during recruitment and selection: instrumental characteristics (i.e., factual information such as payment; Wilhelmy et al., 2018) and symbolic meanings (i.e., intangible characteristics such as personality traits; Slaughter et al., 2004; Wilhelmy et al., 2018). Researchers have shown that even though instrumental characteristics are important to potential applicants, symbolic meanings have a stronger influence on the image that applicants form about an organization (e.g., Lievens, 2007; Lievens & Highhouse, 2003).
One important symbolic value that applicants rely on when building an image of a potential employer is its innovativeness (Lievens & Highhouse, 2003; Slaughter et al., 2004). Innovativeness is an organization’s capability to continuously reinvent its systems, products, and services, and it is key to organizational success and long-term survival (Moss et al., 2015). Hence, by sending signals of innovativeness, organizations can show applicants that they are well-prepared for the future and therefore an attractive employer.
From the marketing and service literature, we know that consumers perceive organizations that make use of new (digital) technologies in their business processes as more innovative, which has a positive effect on their image (Parasuraman, 2000). In the context of employee selection, the use of digital technologies might likewise influence an employer’s image, as it signals that the organization keeps pace with technological innovations and uses novel and exciting methods (Tippins, 2015). As many applicants may not have experience with highly digitalized selection methods from previous selection procedures, they may perceive such methods as new and innovative.
We therefore argue that by using selection methods with high degrees of digitalization, organizations can send signals regarding their innovativeness.
Hypothesis 1: The use of selection methods with high degrees of digitalization has a positive effect on applicants’ perceptions of innovativeness.
Procedural Justice
Gilliland’s (1993) original applicant reactions model, which is based on organizational justice theory, posits that procedural justice or fairness mediates the relationship between characteristics of the selection system and applicant reactions. The concept of procedural justice refers to the fairness of rules and procedures that are used by organizations in making personnel selection decisions (Hausknecht et al., 2004). According to the theory of procedural justice, perceptions of the overall fairness of selection procedures can be impaired when they are not applied consistently across candidates and time, are not free from bias, do not ensure that decisions are based on accurate information, do not provide opportunities to correct inaccurate decisions, do not conform to ethical or moral standards, or do not ensure that the opinions of all groups affected by the decision have been considered (Colquitt et al., 2001; Leventhal, 1980).
Research shows that procedural justice perceptions of applicants might change throughout the selection process, as applicants have varying expectations in each stage (Konradt et al., 2020). Hence, an examination of the fairness perceptions of selection methods with a high degree of digitalization compared to those with a low degree of digitalization in consideration of the respective stage of the application process appears to be meaningful.
In the application and screening stage, selection methods with high degrees of digitalization offer applicants the opportunity to add more information about themselves due to the higher media richness of these methods compared to more traditional methods (Hiemstra et al., 2019). For instance, by providing a link to their social media profile, applicants provide information that exceeds the information conveyed by a traditional CV, such as social media posts or likes, which can be used to capture a more holistic picture of the applicant’s character (Youyou et al., 2015). Similarly, video applications can add more in-depth information about the applicant due to their supplemental visual and auditory information (Hiemstra et al., 2012). However, research indicates that applicants do not perceive adding more information as an additional opportunity in the application and screening stage but rather have concerns regarding the fairness of methods with high degrees of digitalization. Ingold and Langer (2021), for instance, found that social media resumes are perceived as less fair than traditional resumes. This finding is in line with Stoughton et al. (2015), who showed that social network screening decreases applicants’ fairness perceptions in the selection process. Similarly, Hiemstra et al. (2019) found that applicants perceived video applications as less fair than traditional application methods. Hence, it appears that the opportunity to provide more information in the application and screening stage is overshadowed by applicants’ concerns that the recruiting organization might also base their decisions on other non-job-related information, which is often revealed in methods with higher degrees of digitalization (Tews et al., 2020).
The use of assessment methods with high degrees of digitalization, such as Internet-based assessment tests with algorithmic decision making, has the advantage of being consistent in analyzing the data provided by applicants. However, without further explanation of how the algorithm makes its decisions, applicants might raise concerns that the data the algorithm is based on might not be free of biases (Cheng & Hackett, 2021). Moreover, assessment tests with high degrees of digitalization are mostly administered unproctored (Nikolaou et al., 2019) and therefore cannot guarantee that applicants will represent themselves honestly, which may lead to potential inaccuracies in decision making and therefore impair fairness perceptions. Research also indicates that procedural justice perceptions of assessment methods with high degrees of digitalization might suffer when there is the possibility of technical problems (e.g., network disruptions during web-based assessment tests; Harris et al., 2003).
Research investigating differences in applicants’ reactions to technology-mediated interviews in comparison to traditional interviews consistently reveals that applicants generally react more favorably to face-to-face interviews (Bauer et al., 2004; Blacksmith et al., 2016; Chapman et al., 2003). More specifically, studies show that applicants perceive interviews with high degrees of digitalization, such as asynchronous video or robot-mediated interviews, as less fair than traditional face-to-face interviews (Hiemstra et al., 2019; Nørskov et al., 2020). Even the inclusion of information explaining the procedure of the selection method does not necessarily lead to higher fairness perceptions for interview methods with high degrees of digitalization (Langer et al., 2018). These results might be due to the missing interpersonal exchange in digital interviews: Langer et al. (2017) and Suen et al. (2019) compared fairness perceptions across different forms of digital interviews (asynchronous digital interviews versus videoconference interviews, e.g., Zoom interviews) and found no significant differences. An explanation might be that applicants develop the perception that selection methods with high degrees of digitalization, especially in the interview stage, present them with fewer opportunities to leave positive impressions (Basch et al., 2020; Stone-Romero et al., 2003). Furthermore, researchers have proposed that the higher personal interaction involved in less digitalized interview methods might signal to applicants that the organization cares about them, whereas the application of highly digitalized interview methods might raise concerns that the organization is more interested in cutting costs and increasing efficiency (Acikgoz et al., 2020; Stone et al., 2013).
Overall, for many applicants, selection methods with high degrees of digitalization are unfamiliar; therefore, applicants might be more prone to question the fairness and equitability of these methods (Lukacik et al., 2020). Hence, we expect that organizations applying methods with high degrees of digitalization throughout the entire selection process send negative signals regarding the fairness of their selection procedures.
Hypothesis 2: The use of selection methods with high degrees of digitalization has a negative effect on applicants’ procedural justice perceptions.
Linking Innovativeness and Procedural Justice Perceptions of Digital Selection Methods to Employer Attractiveness
The theoretical model of applicant reactions to selection processes posits that applicants’ perceptions during the selection process have several predictors, such as procedural characteristics, which are in turn related to attitudes toward the organization (e.g., employer attractiveness) (Gilliland, 1993; Hausknecht et al., 2004; McCarthy et al., 2017; Ryan & Ployhart, 2000). Specifically, previous research has shown that the impression of an organization that applicants form during the selection process is one of the strongest predictors of applicants’ attraction to it (Chapman et al., 2005; Wehner et al., 2015). When applicants perceive a selection process as innovative, they might form the impression that the organization is not only a pioneer in its market but also has a highly innovation-oriented culture (Sommer et al., 2017). While an innovation-oriented culture might increase the attractiveness of the organization directly (Backhaus & Tikoo, 2004; Sommer et al., 2017), it might also signal the organization’s future prosperity. More specifically, applicants who perceive the selection process to be innovative may get the impression that the company is innovative in general and therefore (1) is capable of adapting to changing environments, which is a strong predictor of longevity (Piao, 2010), and (2) also provides opportunities for personal growth for its employees (Herman & Gioia, 2000; Tsai & Yang, 2010). In their research on applicants’ initial attraction to potential employers, Slaughter and Greguras (2009) suggested that “organizations would do well to portray images of their organization as being highly innovative” (p. 13) to be more attractive for applicants. Indeed, Sommer et al. (2017) found empirical evidence for this suggestion by showing that perceptions of organizational innovativeness have a positive effect on employer attractiveness perceptions among applicants.
Even though applicants might differ in their reactions to digital selection methods as well as innovation due to varying degrees of individual technical competence (Wiechmann & Ryan, 2003) or personality characteristics (e.g., openness to change), we suggest that, overall, the utilization of selection methods with high degrees of digitalization has a positive effect on employer attractiveness for the following reasons. First, previous research supports the argument that innovativeness perceptions predict employer attractiveness perceptions in different contexts (Highhouse et al., 2003; Lievens & Highhouse, 2003; Slaughter et al., 2004; Sommer et al., 2017). Second, while previous research has shown that personality characteristics can moderate the relationship between innovativeness perceptions and employer attractiveness, signals of innovativeness are also strong positive predictors of organizational attractiveness independent of personality characteristics (Sommer et al., 2017). Hence, we expect that innovativeness is a key driver of employer attractiveness and therefore indirectly affects the relationship between the use of selection methods with high degrees of digitalization and employer attractiveness.
Hypothesis 3: The use of selection methods with high degrees of digitalization has a positive indirect effect on employer attractiveness via applicants’ innovativeness perceptions.
Concerning procedural justice, we know that employer attractiveness perceptions are positively related to procedural justice perceptions (Ababneh et al., 2014; Hausknecht et al., 2004; Uggerslev et al., 2012). Moreover, previous research on applicants’ reactions to technology-mediated personnel selection methods indicates that procedural justice perceptions might mediate the relationship between the degree of digitalization of selection methods and employer attractiveness perceptions (e.g., Acikgoz et al., 2020; Langer et al., 2019). We build on this research and expect that applicants’ negative procedural justice perceptions of selection methods with high degrees of digitalization lead to negative perceptions about the fairness of an organization in general, which consequently dampens their attraction to the organization (Bauer et al., 1998; Macan et al., 1994).
In sum, we expect that applicants’ negative procedural justice perceptions of selection methods with high degrees of digitalization indirectly affect the relationship between selection methods and employer attractiveness.
Hypothesis 4: The use of selection methods with high degrees of digitalization has a negative indirect effect on employer attractiveness via applicants’ perceptions of procedural justice.
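Hypotheses 3 and 4 posit indirect (mediated) effects: the product of the path from the selection condition to the mediator (a) and the path from the mediator to employer attractiveness controlling for condition (b). A common way to test such an indirect effect (not necessarily the authors' exact analysis) is a percentile bootstrap of the a·b product. The following sketch uses simulated, purely illustrative data; all names and parameter values are assumptions for demonstration:

```python
import random
import statistics

def indirect_effect(x, m, y):
    """a*b indirect effect for a binary condition x (0/1),
    mediator m, and outcome y (lists of equal length)."""
    # a-path: with a binary predictor, the OLS slope equals the group mean difference
    a = (statistics.mean([mi for xi, mi in zip(x, m) if xi == 1])
         - statistics.mean([mi for xi, mi in zip(x, m) if xi == 0]))
    # b-path: partial slope of y on m controlling for x; with binary x this is
    # the pooled within-group cov(m, y) divided by the within-group var(m)
    num = den = 0.0
    for g in (0, 1):
        mg = [mi for xi, mi in zip(x, m) if xi == g]
        yg = [yi for xi, yi in zip(x, y) if xi == g]
        mbar, ybar = statistics.mean(mg), statistics.mean(yg)
        num += sum((mi - mbar) * (yi - ybar) for mi, yi in zip(mg, yg))
        den += sum((mi - mbar) ** 2 for mi in mg)
    return a * (num / den)

def bootstrap_ci(x, m, y, reps=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for the indirect effect."""
    rng = random.Random(seed)
    n, est = len(x), []
    while len(est) < reps:
        idx = [rng.randrange(n) for _ in range(n)]
        xs = [x[i] for i in idx]
        if len(set(xs)) < 2:  # resample must contain both conditions
            continue
        est.append(indirect_effect(xs, [m[i] for i in idx], [y[i] for i in idx]))
    est.sort()
    return est[int(alpha / 2 * reps)], est[int((1 - alpha / 2) * reps) - 1]

# Illustrative data only: the "digital" condition raises the mediator
# (e.g., innovativeness), which in turn raises attractiveness
data_rng = random.Random(1)
x = [i % 2 for i in range(200)]
m = [2.0 + 1.0 * xi + data_rng.gauss(0, 1) for xi in x]
y = [1.0 + 0.5 * mi + data_rng.gauss(0, 1) for mi in m]
lo, hi = bootstrap_ci(x, m, y)
print(lo, hi)  # for this seeded data, the interval lies above zero
```

A confidence interval excluding zero is the usual criterion for a significant indirect effect; a negative interval would correspond to the pattern predicted by Hypothesis 4.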
Study 1: Experimental Vignette Study
We applied an online experimental vignette study, which allowed us to make causal inferences about applicants’ perceptions of digital methods in the selection process (Aguinis & Bradley, 2014; Atzmüller & Steiner, 2010). At the same time, we ensured that all participants were provided with a realistic description of the selection process and sufficient contextual information, which is essential when employing a between-subjects design in a vignette study (Aguinis & Bradley, 2014; Atzmüller & Steiner, 2010).
Method
Design and Procedure
After giving their consent to participate in the study, participants were provided with a scenario and the accompanying contextual information. We told the participants that we were interested in their first impression of a hypothetical selection process composed of three steps: (1) application (submission) and screening, (2) assessment test, and (3) job interview. We employed a 2 × 2 × 2 between-subjects design and randomly assigned participants to one of the resulting eight hypothetical scenarios. The three factors were the level of digitalization (high, low) in each of the three stages of the selection process. By checking for interactions between factors (Atzmüller & Steiner, 2010; Dülmer, 2016), we were able to additionally test for accumulative and consistency effects of selection methods with high degrees of digitalization. After reading the scenario, participants answered a short survey that included our dependent variables. We provided the scenario descriptions and the questionnaire in German and English. We designed all materials in English and translated them using translation and back-translation (Brislin, 1970). Of all participants, 23.81% chose to answer in English.
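The 2 × 2 × 2 design yields eight scenario cells (high vs. low digitalization crossed over the three stages). A minimal sketch of how such cells can be enumerated and a participant randomly assigned (the labels are illustrative, not the authors' materials):

```python
import itertools
import random

STAGES = ("application_screening", "assessment", "interview")
LEVELS = ("low", "high")

# Eight cells: every combination of digitalization level across the three stages
scenarios = [dict(zip(STAGES, combo)) for combo in itertools.product(LEVELS, repeat=3)]

def assign(participant_id, rng=random):
    """Randomly assign one participant to one of the eight scenario cells."""
    return participant_id, rng.choice(scenarios)

pid, cell = assign("P001", random.Random(7))
print(len(scenarios), pid, cell)  # 8 cells; the chosen cell is one of them
```

Checking for interactions between the three factors then amounts to crossing these cell assignments in the analysis model.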
Sample
Participants were potential job applicants (N = 504), i.e., adults who were in an application process during the time of data collection or who considered applying for a new job in the near future. All participants were recruited online by posting the survey link on social networks (LinkedIn, Facebook, Reddit), sending it to the researchers’ contacts, and using snowball sampling. Due to logical inconsistencies between the reported number of applications in the last two years and the reported timing of the last application, we excluded 22 participants from our sample prior to conducting the analyses. Additionally, following the recommendations of Meade and Craig (2012), we applied two careless response detection methods. After examining outliers in response time, as well as response patterns in which participants consistently indicated the same answer, we removed another seven respondents from the sample prior to analysis. We used an online survey tool that randomly assigned participants to one of the eight scenarios.
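The two careless-response checks described above (response-time outliers and invariant answer patterns) can be sketched as follows. The flagging rules and cutoffs here are illustrative assumptions, not the authors' exact procedure:

```python
import statistics

def straight_liner(responses):
    """Flag a respondent who gave the identical answer to every item."""
    return len(set(responses)) == 1

def time_outliers(times, k=3.0):
    """Flag completion times more than k median absolute deviations from
    the median (a robust outlier rule; the cutoff k is illustrative)."""
    med = statistics.median(times)
    mad = statistics.median([abs(t - med) for t in times])
    if mad == 0:
        return [False] * len(times)
    return [abs(t - med) / mad > k for t in times]

# Illustrative data: respondent 2 straight-lines; the last completion
# time is implausibly fast
answers = {1: [4, 5, 3, 4], 2: [7, 7, 7, 7], 3: [2, 3, 2, 5]}
times = [310, 295, 280, 305, 40]  # seconds

flagged_ids = [rid for rid, resp in answers.items() if straight_liner(resp)]
print(flagged_ids, time_outliers(times))
```

Flagged cases would then be inspected and, as in the study, removed before analysis.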
Our final sample (N = 475) consisted of 57% women. Of these respondents, 319 were students and 156 were professionals. The mean age was 26.26 (SD = 9.97). Respondents with German nationality made up 73.05% of the sample; 6.95% were Polish, 3.79% Singaporean, and 2.95% Austrian; the rest held another nationality. In terms of educational achievements, 78.32% of the sample had a university degree. Among respondents, 88.42% indicated that they participated in at least one selection process in the last two years (Mdn = 3).
Manipulations
We developed manipulations for the treatment conditions by using a prestudy. With this prestudy, we aimed to select one digital (high degree of digitalization) and one nondigital (no or low degree of digitalization) selection method for each of the three stages of the selection process, based on participants' ratings of the degree of digitalization of 21 presumably digital and nondigital selection methods via an online survey. First, we selected nondigital personnel selection methods from previous studies (Smither et al., 1993; Steiner & Gilliland, 1996). These also include methods with a very low degree of digitalization (e.g., upload of a written CV to a company's career portal). Then, we added methods that apply digital technologies and have been increasingly used in practice in the last few years. As a result, the final questionnaire included twelve digital and nine nondigital personnel selection methods. For each nondigital selection method, the prestudy included at least one digital selection method that, apart from making use of digital technologies, was comparable to the nondigital selection method. In sum, we analyzed twelve pairs of personnel selection methods, namely four pairs for the application and screening stage, three pairs for the assessment stage, and five pairs for the interview stage (see Table 7 in the Appendix for a description of the 21 personnel selection methods).
Participants were students (N = 105) who had already taken part in a personnel selection process or were planning to apply for a job in the near future. The mean age was 25.82 (SD = 5.61), and 56% were women. Participants rated the degree of digitalization (i.e., "This selection process is very digital") of each personnel selection method on scales ranging from 1 (completely disagree) to 7 (completely agree). We analyzed differences within the pairs of selection methods by applying paired-samples t-tests. As expected, the means of the degree of digitalization differed significantly (p < 0.001) for all pairs of personnel selection methods, with large effect sizes (Cohen's d ranging between 1.18 and 4.43). Based on the effect sizes, we chose one pair of selection methods for each stage: social media profile (digital) versus written CV (nondigital) for the application and screening stage; online-based work sample simulation (digital) versus work sample test at one of the facilities of an organization (nondigital) for the assessment stage; and online interview with an animated video chatbot without a prescribed structure (digital) versus personal face-to-face interview without a prescribed structure (nondigital) for the interview stage (see Table 8 in the Appendix for all verbal descriptions of these selection methods).
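A paired-samples comparison of this kind, including Cohen's d for paired data (mean difference divided by the standard deviation of the differences), can be sketched with SciPy. The ratings below are made up for illustration and are not the prestudy data.

```python
import numpy as np
from scipy import stats

def paired_comparison(digital, nondigital):
    """Paired-samples t-test plus Cohen's d for paired observations."""
    digital = np.asarray(digital, dtype=float)
    nondigital = np.asarray(nondigital, dtype=float)
    t, p = stats.ttest_rel(digital, nondigital)
    diff = digital - nondigital
    d = diff.mean() / diff.std(ddof=1)   # Cohen's d for paired samples
    return t, p, d

# Hypothetical digitalization ratings by five raters for one method pair
t, p, d = paired_comparison([6, 7, 6, 7, 7], [3, 4, 3, 2, 4])
```

With real data, each element of the two arrays would be one participant's rating of the digital and the matched nondigital method.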
Manipulation Checks
In addition to conducting the prestudy, we asked participants in the main study to rate the degree of digitalization of each of the three stages in their scenario to verify whether the manipulations worked. The response format ranged from 1 (not digital at all) to 7 (very digital). The results of independent t-tests showed that the manipulation was successful in all three investigated stages of the selection process. For the application and screening stage, participants rated the social media profile (M = 5.98, SD = 1.28) as more digital than the written CV (M = 4.65, SD = 1.78), t(423) = 9.28, p < 0.001, Cohen’s d = 0.86. Regarding the assessment test stage, participants rated the online-based work sample tests (M = 5.69, SD = 1.55) as more digital than the work sample tests at one of an organization’s facilities (M = 3.07, SD = 1.71), t(473) = 17.49, p < 0.001, Cohen’s d = 1.60. In the case of the job interview stage, participants rated the unstructured chatbot interviews (M = 6.20, SD = 1.46) as more digital than the unstructured face-to-face interviews (M = 2.34, SD = 1.54), t(472) = 28.03, p < 0.001, Cohen’s d = 2.57.
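Cohen's d for two independent groups can be computed from the reported summary statistics alone (pooled-SD version). The group sizes below are an assumption for illustration (the paper reports only total N and degrees of freedom); with roughly equal groups, the reported d ≈ 0.86 for the application and screening stage is reproduced.

```python
import math

def cohens_d_independent(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d for two independent groups using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Application and screening stage: social media profile vs. written CV
# (means and SDs from the text; group sizes of 237 each are hypothetical)
d = cohens_d_independent(5.98, 1.28, 237, 4.65, 1.78, 237)
```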
Measures
All measures employed in Study 1 applied 7-point Likert scales.
Innovativeness
To measure perceptions of the innovativeness of the selection process, we adapted three items from Zhao et al. (2012). Participants indicated if they perceived the described selection process as very innovative, very novel, and very original (Cronbach’s α = 0.84).
Procedural Justice
To measure perceptions of overall procedural justice, which is also frequently termed procedural fairness, we adapted three items from Bauer et al. (2001). A sample item was “I think that the selection process is a fair way to select people for the respective job” (Cronbach’s α = 0.88).
Employer Attractiveness
We measured applicants’ perceptions of an organization’s attractiveness as a potential employer by adapting the 4-item organizational attractiveness measure from Ployhart et al. (1999). Specifically, we provided participants with a prompt stating, “In my opinion, based on this selection process, the company as an employer is…”, followed by four semantic differential items: bad − good, unfavorable − favorable, unattractive − attractive, unappealing − appealing (Cronbach’s α = 0.93).
To assess the distinctiveness of our mechanism and outcome variables, we conducted a confirmatory factor analysis. Following the recommendations of Hair et al. (2010), we determined the following: the chi-squared value (χ2); the comparative fit index (CFI), for which values above 0.95 indicate a good fit; the Tucker Lewis index (TLI), with values above 0.95 indicating good fit; and the root mean square error of approximation (RMSEA), for which values that are lower than or equal to 0.08 indicate a reasonable fit. Our hypothesized three-factor model yielded a satisfactory fit to the data: χ2 [32] = 140.42, p < 0.001, CFI = 0.97, TLI = 0.96, RMSEA = 0.08. Moreover, the hypothesized three-factor model fit the data better than a two-factor model with both mediators loading on one common factor (χ2 [34] = 1133.85, p < 0.001, CFI = 0.67, TLI = 0.56, RMSEA = 0.26; Δχ2 [2] = 993.43, p < 0.001), as well as a single-factor model (χ2 [35] = 1085.24, p < 0.001, CFI = 0.68, TLI = 0.59, RMSEA = 0.25; Δχ2 [3] = 944.82, p < 0.001). Additionally, we tested for convergent and discriminant validity of these constructs. The standardized loading estimates and average variance extracted (AVE) estimates of each construct exceeded 0.50, indicating convergent validity (Hair et al., 2010), and the AVE estimates were larger than the shared variance (squared interconstruct correlation) with any other construct, supporting discriminant validity (Fornell & Larcker, 1981).
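The nested-model comparisons above rest on the chi-square difference test: the difference in χ² values between two nested models is itself χ²-distributed with the difference in degrees of freedom. The reported Δχ² for the three-factor versus two-factor comparison can be checked directly (values taken from the text; this is a verification sketch, not the authors' analysis code).

```python
from scipy.stats import chi2

def chi_square_difference(chi2_restricted, df_restricted, chi2_full, df_full):
    """Chi-square difference (likelihood-ratio) test between nested CFA models."""
    delta_chi2 = chi2_restricted - chi2_full
    delta_df = df_restricted - df_full
    p = chi2.sf(delta_chi2, delta_df)   # survival function = upper-tail p-value
    return delta_chi2, delta_df, p

# Study 1: two-factor model (chi2 = 1133.85, df = 34) vs.
# hypothesized three-factor model (chi2 = 140.42, df = 32)
d_chi2, d_df, p = chi_square_difference(1133.85, 34, 140.42, 32)
```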
Results
Table 1 shows the means, standard deviations, and correlations among the study variables.
First, we estimated the direct effects of our theoretical model. To do so, we examined the main effects of the use of selection methods with high versus those with low degrees of digitalization on perceptions of innovativeness (Hypothesis 1) and procedural justice (Hypothesis 2). Table 2 shows the regression results for these direct effects.
According to Hypothesis 1, we expected that potential applicants would perceive selection methods with high degrees of digitalization to be more innovative than selection methods with low degrees of digitalization. This hypothesis was supported for the application and screening stage (b = 0.45, p < 0.001), as well as the interview stage (b = 1.26, p < 0.001), but not for the assessment stage (b = 0.12, p = 0.354).
According to Hypothesis 2, we anticipated that potential applicants would perceive selection methods with high degrees of digitalization to be less fair than methods with low degrees of digitalization. We found support for this hypothesis for the interview stage (b = − 0.95, p < 0.001) but not the application and screening stage (b = − 0.21, p = 0.078) or the assessment stage (b = − 0.15, p = 0.215).
Additionally, we checked for interaction effects between factors. The two-way interaction of the application and screening method and the interview method on innovativeness perceptions was significant (β = − 0.18, p < 0.001). The use of methods with a high degree of digitalization in the application and screening stage had a significant positive influence on innovativeness perceptions when the interview method showed a low degree of digitalization (β = 0.33, p < 0.001) but was not significantly related to innovativeness perceptions when the interview method showed high degrees of digitalization (β = − 0.03, p = 0.585). Thus, the digital interview method appeared to overshadow the digital method in the application and screening stage, removing the positive effect of the latter on innovativeness perceptions. Moreover, the two-way interaction of the assessment method and the interview method on procedural justice perceptions was significant (β = 0.09, p = 0.032). In this case, the use of methods with a high degree of digitalization in the assessment stage had a significant negative influence on procedural justice perceptions when the interview method showed a low degree of digitalization (β = − 0.15, p = 0.016) but was not significantly related to procedural justice perceptions when the interview method showed high degrees of digitalization (β = 0.04, p = 0.544).
To proceed with our hypothesis testing, we examined whether the degree of digitalization of the selection methods indirectly affects employer attractiveness perceptions via perceptions of innovativeness and procedural justice (Hypotheses 3 and 4). We tested the indirect effects by applying the PROCESS macro for SPSS (Hayes, 2018). When testing each of the indirect effects, we controlled for the others; this procedure allows adequate testing of theory and explanatory models (Hayes, 2018; Jones et al., 2014; Preacher & Hayes, 2008). We followed the recommendation of Preacher and Hayes (2008) and used bootstrapping to test for the significance of indirect effects. We report bootstrap estimates based on 5000 bootstrap samples with bias-corrected 95% confidence intervals. Table 3 shows the regression results for the test of the indirect effects.
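The bootstrap logic for testing an indirect effect a × b can be sketched with NumPy on simulated data. This is a minimal percentile-CI illustration, not the PROCESS macro (which uses bias-corrected intervals), and the simulated path coefficients are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a simple mediation: X -> M (path a), M -> Y controlling for X (path b)
n = 500
x = rng.integers(0, 2, n).astype(float)      # e.g., low vs. high digitalization
m = 0.5 * x + rng.normal(0.0, 1.0, n)        # mediator (e.g., innovativeness)
y = 0.6 * m + 0.2 * x + rng.normal(0.0, 1.0, n)  # outcome (e.g., attractiveness)

def indirect_effect(x, m, y):
    """OLS estimates of a (X -> M) and b (M -> Y controlling for X)."""
    a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m, rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones_like(x), m, x]), y, rcond=None)[0][1]
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)              # resample cases with replacement
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
# The indirect effect is deemed significant when the 95% CI excludes zero
```

The same resampled cases must be used for both regressions in each bootstrap draw, which is why a single index vector is drawn per iteration.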
We found support for Hypothesis 3, which posits that the use of selection methods with high degrees of digitalization indirectly affects employer attractiveness via innovativeness perceptions in a positive way for the application and screening stage (a × b = 0.05, 95% CI [0.011, 0.106]) and the interview stage (a × b = 0.14, 95% CI [0.040, 0.253]) but not for the assessment stage (a × b = 0.01, 95% CI [− 0.015, 0.049]). According to Hypothesis 4, we expected that the use of selection methods with high degrees of digitalization indirectly affects employer attractiveness via procedural justice perceptions in a negative way. We found support for this hypothesis for the interview stage (a × b = − 0.57, 95% CI [− 0.746, − 0.413]) but not for the application and screening stage (a × b = − 0.13, 95% CI [− 0.272, 0.010]) or the assessment stage (a × b = − 0.09, 95% CI [− 0.234, 0.051]).
To provide a more complete picture of these effects, we additionally followed the recommendations of MacKinnon et al. (2012) by also testing and reporting total effects. Having demonstrated opposite indirect effects via innovativeness and procedural justice, we found overall indirect effects that were significantly negative in the interview stage (a × b = − 0.43, 95% CI [− 0.654, − 0.217]) but not in the application and screening stage (a × b = − 0.08, 95% CI [− 0.234, 0.081]) or the assessment stage (a × b = − 0.08, 95% CI [− 0.223, 0.076]). These overall indirect effects constitute partial mediations of the effects on employer attractiveness in the interview stage (c = − 1.15, p < 0.001, c′ = − 0.72, p < 0.001) but not in the application and screening stage (c = − 0.27, p = 0.031, c′ = − 0.20, p = 0.055) or in the assessment stage (c = − 0.22, p = 0.082, c′ = − 0.14, p = 0.152).
Discussion
The results of Study 1 confirm that selection methods with high degrees of digitalization do indeed send signals that influence applicants’ perceptions of employer attractiveness in the application and screening stage, as well as in the interview stage, but not in the assessment stage. Signaling innovativeness led to selection methods with high degrees of digitalization showing a positive indirect effect on potential applicants’ attitudes toward an organization in the application and screening and interview stages. However, lower procedural justice perceptions resulted in a negative indirect effect of digital selection methods on employer attractiveness perceptions, but only in the interview stage.
The lack of significant differences regarding innovativeness in the assessment stage might simply be explained by the fact that many organizations already use online-based assessment tests (Stone et al., 2013; Tippins, 2015). Consequently, potential applicants may no longer perceive this procedure to be an innovative method for selecting new employees. This possibility is also corroborated by the fact that the effects of digital methods in the interview stage overshadowed the effects of digital methods in the application and screening stage. Moreover, applicants may not perceive any significant differences in the fairness of online and offline assessments, as the goal of these assessment tests is generally clear and the procedure is closely related to the job function and consistent for every applicant (Roth et al., 2005), which provides less room for unfair treatment in either case. In contrast to Ingold and Langer (2021), our analysis does not provide evidence that applicants perceive links to social media profiles as less fair than the traditional, less digitalized CV. These inconsistencies might point to a potential moderator (e.g., privacy concerns), which should be investigated in future studies.
While Study 1 was experimental and therefore provided high internal validity for inferring that selection methods with high degrees of digitalization cause potential applicants’ perceptions, vignette studies remain hypothetical and prospective in nature (Aguinis & Bradley, 2014). Hence, in Study 2, we aimed to examine job applicants’ reactions to real selection processes that they experienced when searching for a job. Examining job applicants’ reactions to real selection processes allows us to extend our findings by assessing whether the effects of selection methods with high degrees of digitalization also apply in retrospect, i.e., after applicants have participated in a selection process.
Study 2: Field Survey
In examining job applicant reactions to digital selection methods in the field, we asked participants to consider the last selection process in which they had reached at least the interview stage. This approach is based on the critical incident technique, in which respondents are asked to recall a salient situation (Aquino et al., 2001, 2006; Flanagan, 1954). While a broad range of different digital methods is used in all stages of selection processes in the field, the diffusion of each individual method is relatively low (see Spar et al., 2018; Weitzel et al., 2018). Furthermore, as we had the same hypotheses for each stage, we opted to examine digital selection methods in aggregate. As in Study 1, we used an online survey for Study 2. The survey in Study 2 was conducted in the German language only.
Method
Sample
We collected data from 342 people who had participated in (at least) one selection process in which they had reached the interview stage and asked them to refer to this process when answering the questionnaire. As in Study 1, we distributed the link to our study via social networks (LinkedIn, Facebook, Reddit) and personal networks. To determine our final sample, we used the same tests for logical inconsistencies and careless responses as in Study 1. Seven participants failed these tests and were removed from the sample before we conducted our analyses.
Our final sample (N = 335) consisted of 166 students and 169 professionals. Of these participants, 50% were women, and the mean age was 26.31 (SD = 5.36). Most of the sample consisted of people of German nationality (92.83%), followed by those of Austrian (1.49%) and Swiss (1.19%) nationality; the rest held other nationalities or did not indicate any nationality. In terms of highest educational achievement, 61.19% of the respondents were holders of a university degree, 12.54% had completed an apprenticeship, 12.84% had a high school diploma, and the rest indicated another educational achievement or did not provide this information. The majority of respondents (91.04%) indicated that they had participated in at least one selection process in the last two years (Mdn = 4). Most participants referred to a selection process that was carried out by an established organization (88.36%); 5.37% of the respondents referred to a selection process at a start-up company (age of firm: less than 3 years); the rest indicated another organization type or did not provide any answer. The selection processes to which applicants referred were conducted by organizations from various industries, with the service industry being the most frequently mentioned (22.39%), followed by banking and finance (13.13%). A large proportion of the respondents (84.78%) indicated that they received a job offer from the organization after participating in the selection process. Among these respondents, 91.55% accepted the job offer.
Measure of Digital Selection Process
We provided participants with descriptions of the 12 selection methods with high degrees of digitalization and nine selection methods with low degrees of digitalization that were used in the prestudy (see Table 7 in the Appendix) and asked them to mark those that had been applied in the respective selection process. We also included the option to add other selection methods that the given list did not cover. Overall, participants indicated 55 additional selection methods (e.g., recruiting event, link to applicant's website, case study assessment test, assessment center, telephone interview). To use that information for further analysis, the first author of this paper coded these methods according to their degree of digitalization (i.e., digital versus nondigital), and another researcher from a German university who is familiar with the field of personnel selection validated the codes. Initial interrater reliability measured via Cohen's kappa was 76.11%, indicating a good level of agreement (Weathington et al., 2012). All disagreements were resolved by discussing the respective methods and by reaching a joint decision.
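Interrater agreement of binary digital/nondigital codes of this kind can be quantified with Cohen's kappa, which corrects observed agreement for the agreement expected by chance. A minimal computation, with made-up codes rather than the study's actual coding, looks like this:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning binary codes
    (0 = nondigital, 1 = digital)."""
    n = len(rater1)
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    p1 = sum(rater1) / n                        # rater 1's rate of 'digital' codes
    p2 = sum(rater2) / n                        # rater 2's rate of 'digital' codes
    p_expected = p1 * p2 + (1 - p1) * (1 - p2)  # chance agreement for binary codes
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codes for six participant-supplied selection methods
kappa = cohens_kappa([1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 1])
```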
As expected, all digital selection methods were at least somewhat used, and 45.37% of all participants indicated the use of at least one digital selection method in the whole selection process. However, as was also expected, the frequencies of the individual digital selection methods were still relatively low, with digital ability tests being used most often (18.21%), followed by online-based work simulations (12.54%) and links to social media profiles (12.54%). Due to the low diffusion of digital methods in each stage, we analyzed applicants' reactions to the use of selection methods with high degrees of digitalization for the entire process rather than conducting stage-specific analyses, as this procedure allowed us to interpret the results in a meaningful and reliable way. In addition, there were differences in the number of selection methods that participants experienced per stage, as well as in the overall application process (ranging from two to twelve selection methods per selection process). Consequently, in contrast to Study 1, participants did not simply experience either digital or nondigital selection methods; they could encounter methods with low and high degrees of digitalization concurrently, even within the same stage of the application process. To account for this circumstance, we operationalized our predictor as the overall share (percentage) of selection methods with high degrees of digitalization used in the selection process, calculated by dividing the number of selection methods with high degrees of digitalization by the total number of selection methods used in the entire selection process.
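The predictor operationalization reduces to a simple ratio; the function and variable names below are illustrative.

```python
def digital_share(methods):
    """Share of digital methods in one applicant's selection process.
    `methods`: list of booleans, True if a method was coded as digital."""
    return sum(methods) / len(methods)

# Example: a process with five methods, two of them coded as digital
share = digital_share([True, False, True, False, False])
```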
Mechanism and Outcome Measures
We used the same measures as in Study 1 for perceptions of innovativeness (Cronbach’s α = 0.80), procedural justice (Cronbach’s α = 0.83), and employer attractiveness (Cronbach’s α = 0.93).
Similar to Study 1, we conducted a confirmatory factor analysis to assess the distinctiveness of our mechanism and outcome variables. Our hypothesized three-factor model yielded a good fit to the data: χ2 [32] = 56.01, p = 0.004, CFI = 0.99, TLI = 0.98, RMSEA = 0.05. The hypothesized three-factor model fit the data better than a two-factor model with both mediators loading on one common factor (χ2 [34] = 502.21, p < 0.001, CFI = 0.75, TLI = 0.67, RMSEA = 0.20; Δχ2 [2] = 446.20, p < 0.001), as well as a single-factor model (χ2 [35] = 694.50, p < 0.001, CFI = 0.65, TLI = 0.55, RMSEA = 0.24; Δχ2 [3] = 638.49, p < 0.001). Additionally, we tested the convergent and discriminant validity of the mechanism and outcome measures. The standardized loading estimates and average variance extracted (AVE) estimates of each construct exceeded 0.50, indicating convergent validity (Hair et al., 2010), and the AVE estimates were larger than the shared variance with any other construct, supporting discriminant validity (Fornell & Larcker, 1981).
Controls
We included outcome favorability as a control variable. Outcome favorability refers to whether applicants received a job offer from the organization (Wilhelmy et al., 2018). It is an important determinant of applicants’ perceptions of and attitudes toward an organization after they participate in its selection process (Hausknecht et al., 2004; Ryan & Ployhart, 2000). We coded outcome favorability as 0 for no job offer and 1 for a job offer.
Additionally, we collected information about age, gender, and educational achievement. However, in line with previous research in the applicant reactions literature (e.g., Ababneh et al., 2014; Bauer et al., 2006), we do not report these variables in our regression analyses, as they are uncorrelated with the dependent variables (see Table 4). Their inclusion may reduce power or increase the possibility of Type I errors, i.e., concluding that effects exist when in reality they do not (Aguinis & Vandenberg, 2014). However, analyses that included these control variables revealed similar patterns of results.
Results
Table 4 presents the means, standard deviations, and correlations among variables used in Study 2. We also examined means, standard deviations, and correlations for the relationships of each of the 21 digital and nondigital selection methods with innovativeness, as well as procedural justice, to provide a more in-depth picture (see Table 9 in the Appendix). However, due to the small sample sizes for most digital selection methods and the fact that applicants' perceptions are based on the entire selection process, we conducted the analyses and interpreted the results only with the aggregated measures of digital selection processes.
Following the same analytical steps as Study 1, we first examined whether the share of digital selection methods has a positive direct effect on innovativeness perceptions (Hypothesis 1) and a negative direct effect on procedural justice perceptions (Hypothesis 2). Table 5 shows the regression results for these direct effects. Due to the variation in the measurement scales of the independent and dependent variables in this study, we report standardized estimates.
We found support for Hypothesis 1, which anticipates a positive relationship between the share of digital selection methods and the innovativeness perceptions of applicants (β = 0.35, p < 0.001). Hypothesis 2 anticipates a negative relationship between the share of digital selection methods and the procedural justice perceptions of applicants. Despite a significant negative correlation between the share of digital selection methods and applicants' procedural justice perceptions (r = − 0.12, p = 0.029), we did not find support for Hypothesis 2, as this relationship became nonsignificant in the regression analysis (β = − 0.06, p = 0.251).
Next, we examined indirect effects in the relationship between the share of digital selection methods and employer attractiveness via perceptions of innovativeness (Hypothesis 3) and procedural justice (Hypothesis 4). Table 6 shows the regression results for the tests of the indirect effects.
In line with the positive relation between the share of digital methods and innovativeness, we found an indirect effect of the share of digital methods on employer attractiveness via innovativeness that supported Hypothesis 3 (a × b = 0.06, 95% CI [0.015, 0.102]). In line with the nonsignificant relation between the share of digital methods and procedural justice, we did not find a significant indirect effect via procedural justice (a × b = − 0.02, 95% CI [− 0.064, 0.017]); thus, Hypothesis 4 was not supported. The total indirect effect was not significant (a × b = 0.03, 95% CI [− 0.024, 0.094]) and did not constitute a partial or full mediation of the overall effect of the share of digital selection methods on employer attractiveness (c = − 0.02, p = 0.691, c′ = − 0.06, p = 0.304).
Discussion
The results of Study 2 show that the share of digital methods in the selection process was related to innovativeness such that greater usage of digital methods in the selection process was associated with higher innovativeness perceptions, which indirectly affected the organization’s attractiveness to applicants. In contrast, the share of digital selection methods did not affect applicants’ procedural justice perceptions in Study 2.
One explanation of why applicants did not use the degree of digitalization of selection methods to draw inferences about fairness when surveyed retrospectively might stem from the operationalization of the independent variable as an overall share of digital selection methods used in the selection process in Study 2. While Study 1 showed significant differences in fairness perceptions for the interview stage, the nonsignificant effect in Study 2 could be explained by the consideration of the overall selection process (including the application and screening stage, as well as the assessment stage).
Another explanation might be that applicants had collected more information throughout the selection process that allowed them to make a more detailed assessment of the organization and its selection procedures. Before submitting an application, applicants base their impressions regarding the fairness of an organization on readily available information that the organization conveys to applicants (Celani & Singh, 2011). Throughout the selection process, however, applicants gain more information about the organization from further interactions (Klotz et al., 2013), which might decrease their uncertainty regarding the procedural fairness of the selection process and their perception of an organization’s attractiveness and explain why digital methods are no longer used as proxies to form these perceptions. In this regard, our findings are generally in line with the results of Uggerslev et al.’s (2012) meta-analysis, which shows that applicants’ perceptions of selection process characteristics and the impact of those characteristics on applicant attraction vary according to the applicant’s current stage in the recruitment and selection process.
Overall Discussion
Drawing on an experimental vignette study among potential applicants and a retrospective field study among actual job applicants, we uncovered the signaling mechanisms of digital selection methods by examining their effects on applicants' employer attractiveness perceptions via innovativeness and procedural justice. With the exception of the assessment stage in Study 1, both studies supported perceptions of innovativeness as a mechanism behind positive signals of digital selection methods on applicants' employer attractiveness judgments. In contrast, procedural justice perceptions were supported as a mechanism of negative signals only in the interview stage in the vignette study involving potential applicants but not in the retrospective field study that involved actual job applicants.
Theoretical Implications
This research contributes to the applicant reactions literature (Chapman et al., 2005; Gilliland, 1993; Hausknecht et al., 2004; Smither et al., 1993; Uggerslev et al., 2012) in three ways. First, previous research has repeatedly called on scholars to keep up with technological developments in personnel selection (McCarthy et al., 2017; Ployhart, 2006; Ryan & Ployhart, 2014). We responded to these calls by specifically investigating the attitudes of potential applicants toward new forms of selection methods with high degrees of digitalization in comparison to traditional selection methods with low degrees of digitalization. In investigating applicants’ perceptions of digital methods in all three typical stages of the selection process in the experimental vignette study, we also extended the current state of research, which has mainly focused on applicant reactions to selection methods in only one phase (see McCarthy et al., 2017; Stone et al., 2013).
Second, we answered recent calls to expand the theoretical lens of the applicant reactions literature by drawing on signaling theory (McCarthy et al., 2017). While previous research in the field of applicant reactions to selection procedures mainly focused on Gilliland’s (1993) framework based on organizational justice theory, we broadened the scope of the examination by demonstrating effects on perceptions of innovativeness. Independent of these specific dimensions, the mere expansion of applicants’ perceptions provides support for the incorporation of signaling theory as a necessary theoretical extension. In this vein, our findings suggest that applicants make inferences based on signals that they receive from an organization’s utilization of digital methods.
Third, by broadening the theoretical lens of the applicant reactions literature (Hausknecht et al., 2004; McCarthy et al., 2017; Ryan & Ployhart, 2000), we demonstrate that potential applicants use symbolic attributes, specifically their perceptions of the innovativeness of the organization's selection methods (Lievens & Highhouse, 2003), to make inferences about a potential employer's attractiveness, which is an essential outcome variable in the applicant reactions literature (Hausknecht et al., 2004; McCarthy et al., 2017). While previous studies have indicated that innovativeness perceptions predict the organizational attractiveness perceptions of applicants (Lievens & Highhouse, 2003; Slaughter et al., 2004), we lacked an understanding of the drivers of innovativeness perceptions. By identifying digital selection methods as a key driver, we extend the previous research in the applicant reactions literature, which has mainly focused on situationally based (e.g., fairness of the procedures) and dispositionally based (e.g., anxiety or motivation of applicants) perceptions of selection procedures (McCarthy et al., 2017).
Limitations and Future Research
Even though we applied two methods that combine the advantages of internal validity (experimental vignette study) and generalizability (field study), there are limitations that we recommend future research address. First, the results were not fully consistent, as a negative signal on procedural justice was revealed in Study 1 but not in Study 2. Although this difference may be explained by the different samples—we examined perceptions of potential applicants in Study 1 but those of actual applicants who retrospectively reflected on previous selection processes in Study 2—and different operationalizations of the independent variables, additional research is needed to confirm this difference. In particular, we do not know how applicants’ innovativeness and procedural justice perceptions of digital selection methods might change over time. Therefore, additional research would benefit from longitudinal studies that investigate changes in applicants’ perceptions of digital selection methods through the various stages of the selection process (see Barber, 1998).
Second, our sample in Study 2 is characterized by a large proportion of participants who accepted a job offer after the selection process they reported on. Consequently, even though we asked participants to indicate their perceptions of innovativeness, procedural justice, and employer attractiveness as they stood directly after participating in the selection process, we cannot rule out that subsequently working at the organization influenced these perceptions. Hence, future research might benefit from replicating Study 2 with a sample that comprises a larger proportion of applicants who did not receive or accept a job offer.
Third, even though the results of our prestudy underscore applicants’ concerns with regard to the degree of digitalization of different selection methods, we cannot determine which part of each selection method is actually driving the effects of digitalization on innovativeness and procedural justice perceptions. Future research might investigate whether these differences are due to the digitalization of the method itself (e.g., chatbot interview vs. face-to-face interview) or the evaluation system (e.g., algorithmic decision making vs. human evaluator).
Fourth, while we derived mechanisms from two different theoretical perspectives, future research might examine whether our conceptual model should be extended by integrating other mechanisms that might influence the relationship between the utilization of digital selection methods and employer attractiveness assessments. As our research shows, innovativeness, as one symbolic attribute (Lievens & Highhouse, 2003), is a useful signaling mechanism that links the utilization of digital selection methods and employer attractiveness. Future research might investigate whether other symbolic attributes, such as cheerfulness or sincerity (Lievens, 2007), influence the relationship between digital selection methods and outcomes such as employer attractiveness. Moreover, in addition to perceptions of procedural justice, applicants’ perceptions of the distributive justice of selection methods with different degrees of digitalization may differ and, as a result, indirectly affect employer attractiveness. Many digital selection methods, such as different forms of online assessments, provide the opportunity to give applicants instant feedback on their performance and the outcome (Footnote 2). Hence, future research might investigate whether this instant feedback has a positive effect on distributive justice perceptions and consequently on applicants’ employer attractiveness perceptions.
Fifth, similar to previous studies in the applicant reactions literature (e.g., Bauer et al., 2006), we were mainly interested in applicants’ perceptions of the overall procedural justice of selection methods. However, the overall procedural justice construct covers eleven rules (Bauer et al., 2001), which capture the three key dimensions of “perceived job relatedness,” “opportunity to perform,” and “interpersonal treatment” (O’Leary et al., 2017). Future research might investigate applicants’ perceptions of these subdimensions when organizations use selection methods with high degrees of digitalization.
Sixth, the results from Study 1 reveal that potential applicants do not perceive highly digital online-based work sample tests as more innovative than offline work sample tests, which might be explained by the fact that many organizations already frequently use these digital methods of assessment. However, this potential explanation has not yet been tested empirically. Hence, future research might investigate whether the widespread use of digital selection methods, and of technology in general, moderates applicants’ perceptions of innovativeness and procedural justice.
Finally, by operationalizing the independent variable in Study 2 as the overall share of digital selection methods used in a selection process, we were able to capture a realistic picture of real-world selection processes, in which applicants go through varying processes and therefore experience different degrees of digitalization per stage and per process. However, with this operationalization, we were not able to investigate applicants’ reactions to the presence versus the absence of digital selection methods, as we did in Study 1. Hence, future research might benefit from replicating our research in a real-world setting by operationalizing the independent variable as a dummy variable with two levels representing the presence and absence of digital selection methods for each stage.
Practical Implications
While organizations save time and money by using digital technologies in their personnel selection processes (McCarthy et al., 2017), these technologies also shape applicants’ perceptions of the organization. Our research clarifies previous mixed results showing that these perceptions can be both positive and negative by shedding light on the specific signals sent regarding innovativeness and procedural justice. When organizations know which signals they are sending by using different selection methods, they can proactively adapt their recruitment communications (Wilhelmy et al., 2017). In this vein, the identification of these signals allows us to provide concrete recommendations for organizations, and particularly for human resource managers, who aim to keep up with the latest technologies in their selection processes.
Our results demonstrate that potential applicants and applicants who have already gone through a selection process perceive the utilization of digital technologies in selection processes to be innovative. As applicants might also express their impressions of the selection process to others (Smither et al., 1993), innovativeness perceptions can enhance an organization’s overall reputation and employer image (Cable & Turban, 2003; Highhouse et al., 1999). This broad image, which can be built through word of mouth (van Hoye & Lievens, 2009), can help organizations attract and retain potential employees (Backhaus & Tikoo, 2004). Hence, by using digital technologies in the personnel selection process, organizations can underscore their innovativeness, which can in turn help them win the race for high-potential candidates in a highly competitive labor market (Sommer et al., 2017).
However, as we also identified a negative effect of the signal provided by digital technologies on the perceptions of potential applicants, organizations must take great care in selecting and implementing digital technologies in their selection processes. Otherwise, organizations may fail to attract the best talent, as the potential applicants who are the very target of selection processes might be discouraged from applying. Specifically, organizations should address potential concerns regarding procedural justice. Applicants might perceive digital interview methods as less fair than nondigital methods because they have the impression that digital methods cannot provide sufficient information (Dineen et al., 2004). Furthermore, many digital selection technologies are based on machine learning and algorithmic decision making, which can contain biases, such as those related to race or gender (Caliskan et al., 2017), an issue that is also discussed intensively in the public debate (Dastin, 2018). Therefore, applicants might be concerned about their ability to make a positive impression when digital interview methods are applied (Stone-Romero et al., 2003). Organizations could address this issue by clearly communicating which information is needed from applicants and used to make selection decisions. However, organizations should also note that merely describing how the selection method works does not necessarily lead to higher fairness perceptions (Langer et al., 2018). Furthermore, organizations should communicate openly to applicants that any information collected and digitally stored through video interviews is not used for other purposes. These measures address concerns about procedural justice by highlighting how participants can provide all necessary information about themselves in the selection process, while simultaneously reducing potential data privacy concerns (Bauer et al., 2006).
In sum, while reaping the positive effects of digital selection methods on innovativeness perceptions, organizations can and should address issues about procedural justice in multiple ways in their communication efforts to improve potential applicants’ attitudes toward the organization.
Notes
1. We thank one of our anonymous reviewers for this remark.
2. We thank one of our anonymous reviewers for this remark.
References
Ababneh, K. I., Hackett, R. D., & Schat, A. C. H. (2014). The role of attributions and fairness in understanding job applicant reactions to selection procedures and decisions. Journal of Business and Psychology, 29(1), 111–129. https://doi.org/10.1007/s10869-013-9304-y
Acikgoz, Y., Davison, K. H., Compagnone, M., & Laske, M. (2020). Justice perceptions of artificial intelligence in selection. International Journal of Selection and Assessment, 28(4), 399–416. https://doi.org/10.1111/ijsa.12306
Aguinis, H., & Bradley, K. J. (2014). Best practice recommendations for designing and implementing experimental vignette methodology studies. Organizational Research Methods, 17(4), 351–371. https://doi.org/10.1177/1094428114547952
Aguinis, H., & Vandenberg, R. J. (2014). An ounce of prevention is worth a pound of cure: Improving research quality before data collection. Annual Review of Organizational Psychology and Organizational Behavior, 1(1), 569–595. https://doi.org/10.1146/annurev-orgpsych-031413-091231
Anderson, N. (2003). Applicant and recruiter reactions to new technology in selection: A critical review and agenda for future research. International Journal of Selection and Assessment, 11(2–3), 121–136. https://doi.org/10.1111/1468-2389.00235
Anderson, C. A., Lindsay, J. J., & Bushman, B. J. (1999). Research in the psychological laboratory: Truth or triviality? Current Directions in Psychological Science, 8(1), 3–9. https://doi.org/10.1111/1467-8721.00002
Aquino, K., Tripp, T. M., & Bies, R. J. (2001). How employees respond to personal offense: The effects of blame attribution, victim status, and offender status on revenge and reconciliation in the workplace. Journal of Applied Psychology, 86(1), 52–59. https://doi.org/10.1037/0021-9010.86.1.52
Aquino, K., Tripp, T. M., & Bies, R. J. (2006). Getting even or moving on? Power, procedural justice, and types of offense as predictors of revenge, forgiveness, reconciliation, and avoidance in organizations. Journal of Applied Psychology, 91(3), 653–668. https://doi.org/10.1037/0021-9010.91.3.653
Armstrong, M. B., Ferrell, J. Z., Collmus, A. B., & Landers, R. N. (2016). Correcting misconceptions about gamification of assessment: More than SJTs and badges. Industrial and Organizational Psychology, 9(3), 671–677. https://doi.org/10.1017/iop.2016.69
Atzmüller, C., & Steiner, P. M. (2010). Experimental vignette studies in survey research. Methodology, 6(3), 128–138. https://doi.org/10.1027/1614-2241/a000014
Backhaus, K., & Tikoo, S. (2004). Conceptualizing and researching employer branding. Career Development International, 9(5), 501–517. https://doi.org/10.1108/13620430410550754
Bangerter, A., Roulin, N., & König, C. J. (2012). Personnel selection as a signaling game. Journal of Applied Psychology, 97(4), 719–738. https://doi.org/10.1037/a0026078
Barber, A. E. (1998). Recruiting employees: Individual and organizational perspectives (Vol. 8). Sage Publications.
Basch, J. M., Melchers, K. G., Kurz, A., Krieger, M., & Miller, L. (2020). It takes more than a good camera: Which factors contribute to differences between face-to-face interviews and videoconference interviews regarding performance ratings and interviewee perceptions? Journal of Business and Psychology, 1–20. https://doi.org/10.1007/s10869-020-09714-3
Bauer, T. N., Maertz, C. P., Jr., Dolen, M. R., & Campion, M. A. (1998). Longitudinal assessment of applicant reactions to employment testing and test outcome feedback. Journal of Applied Psychology, 83(6), 892–903. https://doi.org/10.1037/0021-9010.83.6.892
Bauer, T. N., Truxillo, D. M., Sanchez, R. J., Craig, J. M., Ferrara, P., & Campion, M. A. (2001). Applicant reactions to selection: Development of the selection procedural justice scale (SPJS). Personnel Psychology, 54(2), 387–419. https://doi.org/10.1111/j.1744-6570.2001.tb00097.x
Bauer, T. N., Truxillo, D. M., Paronto, M. E., Weekley, J. A., & Campion, M. A. (2004). Applicant reactions to different selection technology: Face-to-face, interactive voice response, and computer-assisted telephone screening interviews. International Journal of Selection and Assessment, 12(1–2), 135–148. https://doi.org/10.1111/j.0965-075X.2004.00269.x
Bauer, T. N., Truxillo, D. M., Tucker, J. S., Weathers, V., Bertolino, M., Erdogan, B., & Campion, M. A. (2006). Selection in the information age: The impact of privacy concerns and computer experience on applicant reactions. Journal of Management, 32(5), 601–621. https://doi.org/10.1177/0149206306289829
Blacksmith, N., Willford, J., & Behrend, T. (2016). Technology in the employment interview: A meta-analysis and future research agenda. Personnel Assessment and Decisions, 2(1). https://doi.org/10.25035/pad.2016.002
Breaugh, J. A. (2008). Employee recruitment: Current knowledge and important areas for future research. Human Resource Management Review, 18(3), 103–118. https://doi.org/10.1016/j.hrmr.2008.07.003
Breaugh, J. A. (2013). Employee recruitment. Annual Review of Psychology, 64, 389–416. https://doi.org/10.1146/annurev-psych-113011-143757
Brislin, R. W. (1970). Back-translation for cross-cultural research. Journal of Cross-Cultural Psychology, 1(3), 185–216. https://doi.org/10.1177/135910457000100301
Buil, I., Catalán, S., & Martínez, E. (2020). Understanding applicants’ reactions to gamified recruitment. Journal of Business Research, 110, 41–50. https://doi.org/10.1016/j.jbusres.2019.12.041
Cable, D. M., & Turban, D. B. (2003). The value of organizational reputation in the recruitment context: A brand-equity perspective. Journal of Applied Social Psychology, 33(11), 2244–2266. https://doi.org/10.1111/j.1559-1816.2003.tb01883.x
Caliskan, A., Bryson, J. J., & Narayanan, A. (2017). Semantics derived automatically from language corpora contain human-like biases. Science, 356(6334), 183–186. https://doi.org/10.1126/science.aal4230
Celani, A., & Singh, P. (2011). Signaling theory and applicant attraction outcomes. Personnel Review, 40(2), 222–238. https://doi.org/10.1108/00483481111106093
Chapman, D. S., & Webster, J. (2001). Rater correction processes in applicant selection using videoconference technology: The role of attributions. Journal of Applied Social Psychology, 31(12), 2518–2537. https://doi.org/10.1111/j.1559-1816.2001.tb00188.x
Chapman, D. S., Uggerslev, K. L., & Webster, J. (2003). Applicant reactions to face-to-face and technology-mediated interviews: A field investigation. Journal of Applied Psychology, 88(5), 944–953. https://doi.org/10.1037/0021-9010.88.5.944
Chapman, D. S., Uggerslev, K. L., Carroll, S. A., Piasentin, K. A., & Jones, D. A. (2005). Applicant attraction to organizations and job choice: A meta-analytic review of the correlates of recruiting outcomes. Journal of Applied Psychology, 90(5), 928–944. https://doi.org/10.1037/0021-9010.90.5.928
Cheng, M. M., & Hackett, R. D. (2021). A critical review of algorithms in HRM: Definition, theory, and practice. Human Resource Management Review, 31(1), 100698. https://doi.org/10.1016/j.hrmr.2019.100698
Colquitt, J. A., Conlon, D. E., Wesson, M. J., Porter, C. O. L. H., & Ng, K. Y. (2001). Justice at the millennium: A meta-analytic review of 25 years of organizational justice research. Journal of Applied Psychology, 86(3), 425–445. https://doi.org/10.1037//0021-9010.86.3.425
Connelly, B. L., Certo, S. T., Ireland, R. D., & Reutzel, C. R. (2011). Signaling theory: A review and assessment. Journal of Management, 37(1), 39–67. https://doi.org/10.1177/0149206310388419
Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. Retrieved from https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
Dineen, B. R., Noe, R. A., & Wang, C. (2004). Perceived fairness of web-based applicant screening procedures: Weighing the rules of justice and the role of individual differences. Human Resource Management, 43(2–3), 127–145. https://doi.org/10.1002/hrm.20011
Dülmer, H. (2016). The factorial survey: Design selection and its impact on reliability and internal validity. Sociological Methods & Research, 45(2), 304–347. https://doi.org/10.1177/0049124115582269
Ehrhart, K. H., & Ziegert, J. C. (2005). Why are individuals attracted to organizations? Journal of Management, 31(6), 901–919. https://doi.org/10.1177/0149206305279759
Flanagan, J. C. (1954). The critical incident technique. Psychological Bulletin, 51(4), 327–358. https://doi.org/10.1037/h0061470
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50. https://doi.org/10.2307/3151312
Gager, S., Sittig, A., & Batty, R. (2015). 2015 talent trends: Insights for the modern recruiter on what talent wants around the world. Retrieved from LinkedIn Talent Solutions website: https://business.linkedin.com/content/dam/business/talent-solutions/global/en_us/c/pdfs/global-talent-trends-report.pdf
Gilliland, S. W. (1993). The perceived fairness of selection systems: An organizational justice perspective. Academy of Management Review, 18(4), 694–734. https://doi.org/10.5465/AMR.1993.9402210155
Gilliland, S. W. (1994). Effects of procedural and distributive justice on reactions to a selection system. Journal of Applied Psychology, 79(5), 691–701. https://doi.org/10.1037/0021-9010.79.5.691
Hair, J. F., Jr., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis (7th ed.). Upper Saddle River, NJ: Pearson Prentice Hall.
Harold, C. M., Holtz, B. C., Griepentrog, B. K., Brewer, L. M., & Marsh, S. M. (2016). Investigating the effects of applicant justice perceptions on job offer acceptance. Personnel Psychology, 69(1), 199–227. https://doi.org/10.1111/peps.12101
Harris, M. M., Van Hoye, G., & Lievens, F. (2003). Privacy and attitudes towards internet-based selection systems: A cross-cultural comparison. International Journal of Selection and Assessment, 11(2–3), 230–236. https://doi.org/10.1111/1468-2389.00246.
Hartwell, C. J., & Campion, M. A. (2020). Getting social in selection: How social networking website content is perceived and used in hiring. International Journal of Selection and Assessment, 28(1), 1–16. https://doi.org/10.1111/ijsa.12273
Hausknecht, J. P., Day, D. V., & Thomas, S. C. (2004). Applicant reactions to selection procedures: An updated model and meta-analysis. Personnel Psychology, 57(3), 639–683. https://doi.org/10.1111/j.1744-6570.2004.00003.x
Hayes, A. F. (2018). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach (2nd ed.). New York, NY: The Guilford Press.
Herman, R. E., & Gioia, J. L. (2000). How to become an employer of choice. Winchester, VA: Oakhill Press.
Hiemstra, A. M. F., Derous, E., Serlie, A. W., & Born, M. P. (2012). Fairness perceptions of video resumes among ethnically diverse applicants. International Journal of Selection and Assessment, 20(4), 423–433. https://doi.org/10.1111/ijsa.12005
Hiemstra, A. M. F., Oostrom, J. K., Derous, E., Serlie, A. W., & Born, M. P. (2019). Applicant perceptions of initial job candidate screening with asynchronous job interviews. Journal of Personnel Psychology, 18(3), 138–147. https://doi.org/10.1027/1866-5888/a000230
Highhouse, S., Zickar, M. J., Thorsteinson, T. J., Stierwalt, S. L., & Slaughter, J. E. (1999). Assessing company employment image: An example in the fast food industry. Personnel Psychology, 52(1), 151–172. https://doi.org/10.1111/j.1744-6570.1999.tb01819.x
Highhouse, S., Lievens, F., & Sinar, E. F. (2003). Measuring attraction to organizations. Educational and Psychological Measurement, 63(6), 986–1001. https://doi.org/10.1177/0013164403258403
Ingold, P. V., & Langer, M. (2021). Resume = Resume? The effects of blockchain, social media, and classical resumes on resume fraud and applicant reactions to resumes. Computers in Human Behavior, 114, 106573. https://doi.org/10.1016/j.chb.2020.106573
Jones, D. A., Willness, C. R., & Madey, S. (2014). Why are job seekers attracted by corporate social performance? Experimental and field tests of three signal-based mechanisms. Academy of Management Journal, 57(2), 383–404. https://doi.org/10.5465/amj.2011.0848
Klotz, A. C., da Motta Veiga, S. P., Buckley, M. R., & Gavin, M. B. (2013). The role of trustworthiness in recruitment and selection: A review and guide for future research. Journal of Organizational Behavior, 34(S1), S104–S119. https://doi.org/10.1002/job.1891
Konradt, U., Oldeweme, M., Krys, S., & Otte, K.-P. (2020). A meta-analysis of change in applicants’ perceptions of fairness. International Journal of Selection and Assessment, 28(4), 365–382. https://doi.org/10.1111/ijsa.12305
Landers, R. N., & Marin, S. (2021). Theory and technology in organizational psychology: A review of technology integration paradigms and their effects on the validity of theory. Annual Review of Organizational Psychology and Organizational Behavior, 8(1), 235–258. https://doi.org/10.1146/annurev-orgpsych-012420-060843
Langer, M., König, C. J., & Krause, K. (2017). Examining digital interviews for personnel selection: Applicant reactions and interviewer ratings. International Journal of Selection and Assessment, 25(4), 371–382. https://doi.org/10.1111/ijsa.12191
Langer, M., König, C. J., & Fitili, A. (2018). Information as a double-edged sword: The role of computer experience and information on applicant reactions towards novel technologies for personnel selection. Computers in Human Behavior, 81, 19–30. https://doi.org/10.1016/j.chb.2017.11.036
Langer, M., König, C. J., & Papathanasiou, M. (2019). Highly automated job interviews: Acceptance under the influence of stakes. International Journal of Selection and Assessment, 27(3), 217–234. https://doi.org/10.1111/ijsa.12246
Leventhal, G. S. (1980). What should be done with equity theory? Springer US. https://doi.org/10.1007/978-1-4613-3087-5_2
Lievens, F. (2007). Employer branding in the Belgian Army: The importance of instrumental and symbolic beliefs for potential applicants, actual applicants, and military employees. Human Resource Management, 46(1), 51–69. https://doi.org/10.1002/hrm.20145
Lievens, F., & Highhouse, S. (2003). The relation of instrumental and symbolic attributes to a company’s attractiveness as an employer. Personnel Psychology, 56(1), 75–102. https://doi.org/10.1111/j.1744-6570.2003.tb00144.x
Lukacik, E.-R., Bourdage, J. S., & Roulin, N. (2020). Into the void: A conceptual model and research agenda for the design and use of asynchronous video interviews. Human Resource Management Review, 100789. https://doi.org/10.1016/j.hrmr.2020.100789
Macan, T. H., Avedon, M. J., Paese, M., & Smith, D. E. (1994). The effects of applicants’ reactions to cognitive ability tests and assessment. Personnel Psychology, 47(4), 715–738. https://doi.org/10.1111/j.1744-6570.1994.tb01573.x
MacKinnon, D. P., Coxe, S., & Baraldi, A. N. (2012). Guidelines for the investigation of mediating variables in business research. Journal of Business and Psychology, 27(1), 1–14. https://doi.org/10.1007/s10869-011-9248-z
McCarthy, J. M., Bauer, T. N., Truxillo, D. M., Anderson, N. R., Costa, A. C., & Ahmed, S. M. (2017). Applicant perspectives during selection: A review addressing “so what?”, “what’s new?”, and “where to next?” Journal of Management, 43(6), 1693–1725. https://doi.org/10.1177/0149206316681846
Meade, A. W., & Craig, S. B. (2012). Identifying careless responses in survey data. Psychological Methods, 17(3), 437–455. https://doi.org/10.1037/a0028085
Moran, G. (2018). How to nail an interview with a chatbot. Retrieved from https://www.fastcompany.com/90216307/how-to-nail-an-interview-with-a-chat-bot
Moss, T. W., Neubaum, D. O., & Meyskens, M. (2015). The effect of virtuous and entrepreneurial orientations on microfinance lending and repayment: A signaling theory perspective. Entrepreneurship Theory and Practice, 39(1), 27–52. https://doi.org/10.1111/etap.12110
Nikolaou, I., Georgiou, K., Bauer, T. N., & Truxillo, D. M. (2019). Applicant reactions in employee recruitment and selection. In R. N. Landers (Ed.), The Cambridge Handbook of Technology and Employee Behavior (pp. 100–130). Cambridge University Press. https://doi.org/10.1017/9781108649636.006
Nørskov, S., Damholdt, M. F., Ulhøi, J. P., Jensen, M. B., Ess, C., & Seibt, J. (2020). Applicant fairness perceptions of a robot-mediated job interview: A video vignette-based experimental survey. Frontiers in Robotics and AI, 7, 586263. https://doi.org/10.3389/frobt.2020.586263
O’Leary, R. S., Forsman, J. W., & Isaacson, J. A. (2017). The role of simulation exercises in selection. In H. W. Goldstein, E. D. Pulakos, J. Passmore, & C. Semedo (Eds.), Wiley Blackwell handbooks in organizational psychology. The Wiley Blackwell handbook of the psychology of recruitment, selection and employee retention (pp. 247–270). Chichester, West Sussex: Wiley Blackwell.
Parasuraman, A. (2000). Technology readiness index (TRI): A multiple-item scale to measure readiness to embrace new technologies. Journal of Service Research, 2(4), 307–320. https://doi.org/10.1177/109467050024001
Piao, M. (2010). Thriving in the new: Implication of exploration on organizational longevity. Journal of Management, 36(6), 1529–1554. https://doi.org/10.1177/0149206310378367
Ployhart, R. E. (2006). Staffing in the 21st century: New challenges and strategic opportunities. Journal of Management, 32(6), 868–897. https://doi.org/10.1177/0149206306293625
Ployhart, R. E., Ryan, A. M., & Bennett, M. (1999). Explanations for selection decisions: Applicants’ reactions to informational and sensitivity features of explanations. Journal of Applied Psychology, 84(1), 87–106. https://doi.org/10.1037/0021-9010.84.1.87
Potosky, D., & Bobko, P. (2004). Selection testing via the internet: Practical considerations and exploratory empirical findings. Personnel Psychology, 57(4), 1003–1034. https://doi.org/10.1111/j.1744-6570.2004.00013.x
Preacher, K. J., & Hayes, A. F. (2008). Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models. Behavior Research Methods, 40(3), 879–891. https://doi.org/10.3758/BRM.40.3.879
Roth, P. L., Bobko, P., & McFarland, L. A. (2005). A meta-analysis of work sample test validity: Updating and integrating some classic literature. Personnel Psychology, 58(4), 1009–1037. https://doi.org/10.1111/J.1744-6570.2005.00714.X
Roulin, N., & Bangerter, A. (2013). Social networking websites in personnel selection: A signaling perspective on recruiters’ and applicants’ perceptions. Journal of Personnel Psychology, 12(3), 143–151. https://doi.org/10.1027/1866-5888/a000094
Ryan, A. M., & Ployhart, R. E. (2000). Applicants’ perceptions of selection procedures and decisions: A critical review and agenda for the future. Journal of Management, 26(3), 565–606. https://doi.org/10.1016/S0149-2063(00)00041-6
Ryan, A. M., & Ployhart, R. E. (2014). A century of selection. Annual Review of Psychology, 65, 693–717. https://doi.org/10.1146/annurev-psych-010213-115134
Ryan, A. M., Sacco, J. M., McFarland, L. A., & Kriska, S. D. (2000). Applicant self-selection: Correlates of withdrawal from a multiple hurdle process. Journal of Applied Psychology, 85(2), 163–179.
Ryan, A. M., Inceoglu, I., Bartram, D., Golubovich, J., Grand, J., Reeder, M., . . . Yao, X. (2015). Trends in testing: Highlights of a global survey. In I. Nikolaou & J. K. Oostrom (Eds.), Employee recruitment, selection, and assessment: Contemporary issues for theory and practice (1st ed., pp. 148–165). London: Psychology Press.
Rynes, S. L., Bretz, R. D., & Gerhart, B. (1991). The importance of recruitment in job choice: A different way of looking. Personnel Psychology, 44(3), 487–521.
Slaughter, J. E., & Greguras, G. J. (2009). Initial attraction to organizations: The influence of trait inferences. International Journal of Selection and Assessment, 17(1), 1–18. https://doi.org/10.1111/j.1468-2389.2009.00447.x
Slaughter, J. E., Zickar, M. J., Highhouse, S., & Mohr, D. C. (2004). Personality trait inferences about organizations: Development of a measure and assessment of construct validity. Journal of Applied Psychology, 89(1), 85–103. https://doi.org/10.1037/0021-9010.89.1.85
Smither, J. W., Reilly, R. R., Millsap, R. E., Pearlman, K., & Stoffey, R. W. (1993). Applicant reactions to selection procedures. Personnel Psychology, 46(1), 49–76. https://doi.org/10.1111/j.1744-6570.1993.tb00867.x
Sommer, L. P., Heidenreich, S., & Handrich, M. (2017). War for talents: How perceived organizational innovativeness affects employer attractiveness. R&D Management, 47(2), 299–310. https://doi.org/10.1111/radm.12230
Spar, B., Pletenyuk, I., Reilly, K., & Ignatova, M. (2018). Global recruiting trends 2018: The 4 ideas changing how you hire. Retrieved from LinkedIn Talent Solutions website: https://news.linkedin.com/2018/1/global-recruiting-trends-2018
Spence, M. (1973). Job market signaling. The Quarterly Journal of Economics, 87(3), 355–374. https://doi.org/10.2307/1882010
Stachl, C., Boyd, R. L., Horstmann, K. T., Khambatta, P., Matz, S., & Harari, G. M. (2021). Computational personality assessment - An overview and perspective. https://doi.org/10.31234/osf.io/ck2bj
Steiner, D. D., & Gilliland, S. W. (1996). Fairness reactions to personnel selection techniques in France and the United States. Journal of Applied Psychology, 81(2), 134–141. https://doi.org/10.1037/0021-9010.81.2.134
Stone, D. L., Lukaszewski, K. M., Stone-Romero, E. F., & Johnson, T. L. (2013). Factors affecting the effectiveness and acceptance of electronic selection systems. Human Resource Management Review, 23(1), 50–70. https://doi.org/10.1016/j.hrmr.2012.06.006
Stone, D. L., Deadrick, D. L., Lukaszewski, K. M., & Johnson, R. (2015). The influence of technology on the future of human resource management. Human Resource Management Review, 25(2), 216–231. https://doi.org/10.1016/j.hrmr.2015.01.002
Stone-Romero, E. F., Stone, D. L., & Hyatt, D. (2003). Personnel selection procedures and invasion of privacy. Journal of Social Issues, 59(2), 343–368. https://doi.org/10.1111/1540-4560.00068
Stoughton, J. W., Thompson, L. F., & Meade, A. W. (2015). Examining applicant reactions to the use of social networking websites in pre-employment screening. Journal of Business and Psychology, 30(1), 73–88. https://doi.org/10.1007/s10869-013-9333-6
Straus, S. G., Miles, J. A., & Levesque, L. L. (2001). The effects of videoconference, telephone, and face-to-face media on interviewer and applicant judgments in employment interviews. Journal of Management, 27(3), 363–381. https://doi.org/10.1177/014920630102700308
Suazo, M. M., Martínez, P. G., & Sandoval, R. (2009). Creating psychological and legal contracts through human resource practices: A signaling theory perspective. Human Resource Management Review, 19(2), 154–166. https://doi.org/10.1016/j.hrmr.2008.11.002
Suen, H.-Y., Chen, M.Y.-C., & Lu, S.-H. (2019). Does the use of synchrony and artificial intelligence in video interviews affect interview ratings and applicant attitudes? Computers in Human Behavior, 98, 93–101. https://doi.org/10.1016/j.chb.2019.04.012
Tews, M. J., Stafford, K., & Kudler, E. P. (2020). The effects of negative content in social networking profiles on perceptions of employment suitability. International Journal of Selection and Assessment, 28(1), 17–30. https://doi.org/10.1111/ijsa.12277
Tippins, N. T. (2015). Technology and assessment in selection. Annual Review of Organizational Psychology and Organizational Behavior, 2(1), 551–582. https://doi.org/10.1146/annurev-orgpsych-031413-091317
Tsai, W.-C., & Yang, I. W.-F. (2010). Does image matter to different job applicants? The influences of corporate image and applicant individual differences on organizational attractiveness. International Journal of Selection and Assessment, 18(1), 48–63. https://doi.org/10.1111/j.1468-2389.2010.00488.x
Turban, D. B. (2001). Organizational attractiveness as an employer on college campuses: An examination of the applicant population. Journal of Vocational Behavior, 58(2), 293–312. https://doi.org/10.1006/jvbe.2000.1765
Uggerslev, K. L., Fassina, N. E., & Kraichy, D. (2012). Recruiting through the stages: A meta-analytic test of predictors of applicant attraction at different stages of the recruiting process. Personnel Psychology, 65(3), 597–660. https://doi.org/10.1111/j.1744-6570.2012.01254.x
Van Esch, P., Black, J. S., & Ferolie, J. (2019). Marketing AI recruitment: The next phase in job application and selection. Computers in Human Behavior, 90, 215–222. https://doi.org/10.1016/j.chb.2018.09.009
Van Hoye, G., & Lievens, F. (2009). Tapping the grapevine: A closer look at word-of-mouth as a recruitment source. Journal of Applied Psychology, 94(2), 341–352. https://doi.org/10.1037/a0014066
Vial, G. (2019). Understanding digital transformation: A review and a research agenda. The Journal of Strategic Information Systems, 28(2), 118–144. https://doi.org/10.1016/j.jsis.2019.01.003
Weathington, B. L., Cunningham, C. J. L., & Pittenger, D. J. (2012). Understanding business research (1st ed.). Wiley. https://doi.org/10.1002/9781118342978
Wehner, M. C., Giardini, A., & Kabst, R. (2015). Recruitment process outsourcing and applicant reactions: When does image make a difference? Human Resource Management, 54(6), 851–875. https://doi.org/10.1002/hrm.21640
Weitzel, T., Maier, C., Oehlhorn, C., Weinert, C., Wirth, J., & Laumer, S. (2018). Digitalisierung der Personalgewinnung: Ausgewählte Ergebnisse der Recruiting Trends 2018 [Digitalization of recruitment: Selected results of the Recruiting Trends 2018]. Monster Worldwide Deutschland GmbH. https://arbeitgeber.monster.de/recruiting/studien.aspx
Wiechmann, D., & Ryan, A. M. (2003). Reactions to computerized testing in selection contexts. International Journal of Selection and Assessment, 11(2–3), 215–229. https://doi.org/10.1111/1468-2389.00245
Wilhelmy, A., Kleinmann, M., Melchers, K. G., & Lievens, F. (2018). What do consistency and personableness in the interview signal to applicants? Investigating indirect effects on organizational attractiveness through symbolic organizational attributes. Journal of Business and Psychology, 34(5), 671–684. https://doi.org/10.1007/s10869-018-9600-7
Wilhelmy, A., Kleinmann, M., Melchers, K. G., & Götz, M. (2017). Selling and smooth-talking: Effects of interviewer impression management from a signaling perspective. Frontiers in Psychology, 8. https://doi.org/10.3389/fpsyg.2017.00740
Williams, P., McDonald, P., & Mayes, R. (2021). Recruitment in the gig economy: Attraction and selection on digital platforms. The International Journal of Human Resource Management, 1–27. https://doi.org/10.1080/09585192.2020.1867613
Woods, S. A., Ahmed, S., Nikolaou, I., Costa, A. C., & Anderson, N. R. (2020). Personnel selection in the digital age: A review of validity and applicant reactions, and future research challenges. European Journal of Work and Organizational Psychology, 29(1), 64–77. https://doi.org/10.1080/1359432X.2019.1681401
Youyou, W., Kosinski, M., & Stillwell, D. (2015). Computer-based personality judgments are more accurate than those made by humans. Proceedings of the National Academy of Sciences of the United States of America, 112(4), 1036–1040. https://doi.org/10.1073/pnas.1418680112
Zhao, M., Hoeffler, S., & Dahl, D. W. (2012). Imagination difficulty and new product evaluation. Journal of Product Innovation Management, 29(S1), 76–90. https://doi.org/10.1111/j.1540-5885.2012.00951.x
Acknowledgements
We thank Christina Angele, Tamara Smak, Tamaris Stürzenhofecker, Ann Sophie Wild, and Yaoliang Yuan for their assistance in the data collection process.
Funding
Open Access funding enabled and organized by Projekt DEAL.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Folger, N., Brosi, P., Stumpf-Wollersheim, J. et al. Applicant Reactions to Digital Selection Methods: A Signaling Perspective on Innovativeness and Procedural Justice. J Bus Psychol 37, 735–757 (2022). https://doi.org/10.1007/s10869-021-09770-3