The VAP framework describes the discovery phase as: “locating the values relevant to a given project and defining those values within the context of the game” (Flanagan and Nissenbaum 2014). Game designers identify values and sources of values that influence the game. These include the key actors, values, ethical challenges, and any potential technical constraints (Flanagan and Nissenbaum 2014).
To find values levers related to privacy in mobile application design, two members of our team (Greene and Shilton 2018) conducted a critical discourse analysis of conversations about privacy in mobile development forums. Critical discourse analysis is a qualitative method for analyzing how individuals talk about and justify their practices (van Leeuwen 2008). We found that values reflection during application development is influenced by both the work practices of an individual or team and the politics and development culture of the larger platform ecosystem. Practices that opened values conversations included interacting with user analytics, as developers grappled with the (sometimes invasive) meaning of data about their users. Navigating platform approval processes was another lever for privacy conversations, as developers had to debate what kinds of data collection Apple or Google did or did not allow. Confronting technical constraints such as not being able to collect data continuously from phone cameras or microphones also spurred values conversations about why these constraints might exist.
As these examples suggest, analyzing privacy conversations in the mobile ecosystem illustrated the power of platforms to deploy values levers. Through both technical and policy means, Apple encourages frequent iOS developer conversations about privacy, while simultaneously enforcing narrow and problematic “notice and consent” privacy definitions. Google, on the other hand, exerts less overt technical and policy agency, and developers therefore engage in less frequent conversations about privacy. But Android developers responded to privacy problems with a wider and more creative range of solutions, because privacy requirements are not pre-defined by the platform (Greene and Shilton 2018). Based on this research, our simulation models both the politics and development culture of a platform ecosystem and the work practices of the team.
Translation is the process of developing game play elements that raise, enact, or help students question values within the game (Flanagan and Nissenbaum 2014). Our translation process focused on constructing simulation elements that would encourage participants to particularize and make decisions about the ethical challenge of what user data to collect.
First, we translated the ethical challenge into an online roleplaying simulation. We created a scenario in which participants are members of a fictional health application development team. The team is charged with porting an existing application from the permissive “Robot” platform (modeled after Android) to the more restrictive “Fruit” platform (modeled after iOS). Participants are tasked with creating two outputs describing a set of (1) policy changes and (2) associated technical changes needed for the transition. This set-up evoked the privacy decisions that real-world developers must make when moving their product between platforms, and also engaged the tensions between the two platforms and their differing privacy policies that we observed in our observational research.
We also assigned participants contrasting team roles, such as software manager, software developer, or user experience designer, to experiment with team diversity, which had been shown to be an important values lever in previous research (Shilton 2013). Participants received short descriptions of their roles as well as subtle hints (shown in bold below) about what that role might particularly value. The Software Manager is told to “lead the team to successful completion of a software project.” The Software Developer “collaborates on the designs and development of technical solutions.” Finally, the User Experience Designer “advocates for the user during the design and development of projects.” By giving each role slightly different priorities, we hoped to seed explicit values conversations.
Next, we created injects—messages from non-player characters that would be deployed throughout the online simulation—based on factors found in our empirical research. An inject from a fictional friend introduces possible policy constraints by emailing a blog article about HIPAA to participants. An inject from a fictional marketing director introduces third-party data access by asking participants to consider allowing partnership—and user data sharing—with an insurance company. We also experimented with the impact of leader advocacy, an important lever for encouraging values conversations in earlier research (Shilton 2013), by having the head of the legal department express concerns about data breaches.
Finally, we used real-world developer privacy discussions as resources for student participants. Students were directed to forum discussions where software developers had negotiated consensus on the meaning of privacy. We also gave participants other resources to guide the online simulation: a design document specifying the current workings of the app, including how and when it collects personal data; and the “Robot” and “Fruit” application store policy guidelines.
After developing the scenario, roles, injects, and resources, we brought the online roleplaying simulation to life using the ICONS platform (https://www.icons.umd.edu/): a web-based environment that facilitates roleplaying, discussion and deliberation, and decision-making processes. Students had individual ICONS accounts, and when they logged in, were given a role and assigned to a team. A welcome email from a fictional manager described the online simulation task, and students could email each other within the platform to discuss the assignment and their goals. The students could also author proposals on the platform (in this case, describing both policy changes and technical changes), and could vote on others’ proposals. Injects appeared as emails from fictional characters alongside students’ email communication with their team. Table 2 summarizes values levers we found in the empirical research and how we translated them into simulation elements.
Iteration: From Simulation to Board Game
The online roleplaying simulation’s injects are replaced by event cards, which are drawn after every set of privacy decisions. Event cards incorporate values levers discovered during the earlier fieldwork-based discovery phase of the project, including interacting with app store guidelines (see Fig. 2), receiving feedback from users, or following changes in law. Event cards mimic actual events observed during the earlier fieldwork.