Ethics and Information Technology, Volume 17, Issue 4, pp 249–265

Breaking the filter bubble: democracy and design

Open Access
Original Paper

Abstract

It has been argued that the Internet and social media increase the number of viewpoints, perspectives, ideas and opinions available, leading to a very diverse pool of information. However, critics have argued that the algorithms used by search engines, social networking platforms and other large online intermediaries actually decrease information diversity by forming so-called “filter bubbles”. This may pose a serious threat to our democracies. In response to this threat, others have developed algorithms and digital tools to combat filter bubbles. This paper first provides examples of different software designs that try to break filter bubbles. Secondly, we show how the norms required by two democracy models dominate the tools that are developed to fight filter bubbles, while the norms of other models are completely missing from those tools. The paper concludes by arguing that democracy itself is a contested concept that points to a variety of norms. Designers of diversity-enhancing tools must therefore be exposed to diverse conceptions of democracy.

Keywords

Democracy · Filter bubble · Selective exposure · Design · Value sensitive design · Diversity · Viewpoint diversity

Introduction

Cyberbalkanization refers to the segregation of the Internet into small political groups with similar perspectives, to a degree that members show a narrow-minded approach to those with contradictory views. For instance, Sunstein (2007) argued that thanks to the Internet, people could join groups that share their own views and values, and cut themselves off from any information that might challenge their beliefs. This, according to Sunstein, will have a negative effect on the democratic dialogue. More recently, others have argued that personalization algorithms used by online services such as Facebook and Google show users similar perspectives and ideas and remove opposing viewpoints on their behalf, without their consent (Pariser 2011). According to Pariser (2011), users might get different search results for the same keyword, and those with the same friend lists can receive different updates. This is because information can be prioritized, filtered and hidden depending on a user’s previous interaction with the system and other factors (Bozdag 2013; Diakopoulos 2014). This might lead to a situation in which the user receives biased information. In the case of political information, the user might never see contrasting viewpoints on a political or moral issue. Users will be placed in a “filter bubble” and will not even know what they are missing (Pariser 2011). As a consequence, the epistemic quality of information and the diversity of perspectives will suffer, and the civic discourse will be eroded.

After Pariser’s book was published, the danger of filter bubbles received wide attention in the media, in academia and in industry. Empirical studies have been conducted to confirm or debunk the bubble’s existence. While algorithms and online platforms in general have been criticized for causing filter bubbles, some designers have developed algorithms and tools to actually combat those bubbles. However, as we will show in this paper, the methods and goals of these tools differ fundamentally. Some try to give users full control and even allow them to increase their bubble. Others modify users’ search results for viewpoint diversity without notifying the user. This is because the filter bubble has become a term that encompasses various criticisms. These criticisms differ because democracy is an essentially contested concept and different democracy models require different norms. As this paper will show, some criticize the filter bubble for its negative effect on user autonomy and choice, while others emphasize the diminishing quality of information and deliberation. In this paper we will show that, while there are many different democracy theories, only the diversity-related norms of a few of them are implemented in the tools that are designed to fight filter bubbles. We will also show that some norms (e.g., the inclusion of minorities in the public debate) are completely missing. We will argue that if we want to fully use the potential of the Internet to support democracy, all of these diversity-related norms should be discussed and designed for, not just the popular or most dominant ones.

In this paper, we first present different models of democracy and discuss why the filter bubble poses a problem for each of them. Next, we provide a list of tools and algorithms that designers have developed in order to fight filter bubbles, discussing the benchmarks these tools use and the democracy model the tools exemplify. We will show that not all relevant democracy models are represented in the overview of available diversity-enhancing tools. Finally, we discuss our findings and provide some recommendations for future work.

Democracy and filter bubbles: different theories, different benchmarks

Democracy refers very roughly to a method of group decision making characterized by equality among the participants at an essential stage of the collective decision making (Christiano 2006). While some models of democracy emphasize the autonomy and individual preferences of those who take part in this collective decision making, others highlight the inclusion of free and equal citizens in the political community and the independence of a public sphere that operates as a middle layer between state and society (Habermas 1998). Some emphasize the need of an informed (online) debate and the epistemic quality of information before decisions are made (Hardin 2009). Others point out the need to increase the reach of minorities and other marginalized groups in the public debate (Young 2002).

While the filter bubble has been a concern for many, there are different answers to the question as to why filter bubbles are a problem for our democracy. The answer one gives to the question depends on one’s understanding of the nature and value of democracy, on one’s conception of democracy. Different democracy theories exist and they have different normative implications and informational requirements. A tool that implements one particular norm will be quite different in its form and goals than another tool which implements a different norm. Before we provide examples of different tools, we will provide a framework of some basic conceptions of democracy and the relevant norms for each model.

Liberal view of democracy

The classical liberal view of democracy attempts to uphold the values of freedom of choice, reason, and freedom from tyranny, absolutism and religious intolerance (Dunn 1979; Held 2006). Liberalism started as a way to challenge the powers of “despotic monarchies” and tried to define a political sphere independent of church and state. Once liberalism achieved victory over these old “absolute powers”, many liberal thinkers began to express fear about the rising power of the “demos” (Madison 1787; Mill 1859; Held 2006). They were concerned by the new dangers to liberty posed by majority rule against minorities, and by the risk of the majority ‘tyrannizing over itself’, leading to a need for people to ‘limit their power over themselves’.

Bentham (1780) argues that, since those who govern will not act the same way as the governed, the government must always be accountable to an electorate called upon frequently and this electorate should be able to decide whether their objectives have been met. Next to voting, ‘competition’ between potential political representatives, ‘separation of powers’, ‘freedom of the media, speech and public association’ should be ensured to sustain ‘the interest of the community in general’ (Bentham 1780). Individuals must be able to pursue their interests without the risk of arbitrary political interference, to participate freely in economic transactions, to exchange labor and goods on the market and to appropriate resources privately.

The liberal view of democracy is often criticized, because it construes democracy as an aggregation of individual preferences through a contest (in the form of voting), so that the preferences of the majority win the policy battle. However, this model has no way of distinguishing normatively legitimate outcomes from the preferences and the desires of the powerful, and makes no distinction between purely subjective preferences and legitimate and shared (quasi objective) judgments (Cohen 1997, 2009; Young 2002).

Filter bubbles are a problem according to the liberal view, because the non-transparent filters employed by online algorithms limit the freedom of choice. In addition, the liberal view states that citizens must be aware of different opinions and options, in order to make a reasonable decision. A filter imposed on users—unbeknownst to them—will violate their autonomy, as it will interfere with their ability to choose freely, and to be the judge of their own interests. Further, the principle of separation of powers and the freedom of the media can also be in danger, if the algorithms are designed in such a manner as to serve the interests of certain individuals or groups. Finally, filters might damage the “liberty of thought”. Liberty of thought, discussion and action are the necessary conditions for the development of independence of mind and autonomous judgment. Liberty of thought creates reason and rationality, and in turn the cultivation of reason stimulates and sustains liberty. If one is ‘coerced’ by the filters, reason will also diminish. While some thinkers such as Mill (1859) also emphasize the diversity of opinion, most liberal thinkers do not mention this as a requirement. Liberal citizens must be ‘potentially’ informed so that the elected act accountably, but deliberation according to the liberal view is not necessary. Loss of autonomy caused by filters seems to be the main issue, according to the liberal view, while diversity of opinions and perspectives is not a concern.

Deliberative democracy

Elster (1997) characterizes deliberative democracy as “decision making by discussion among free and equal citizens”. Deliberative democrats propose that citizens address societal problems and matters of public concern by reasoning together about how to best solve them. This can be made possible by deliberative procedures, which help to reach a moral consensus that satisfies both rationality (defense of liberal rights) and legitimacy (as represented by popular sovereignty) (Gutmann and Thompson 2004). Individuals participating in the democratic process can change their minds and preferences as a result of reflection. According to Cohen (2009), deliberative democracy can be seen (1) as a matter of forming a public opinion through open public discussion and translating that opinion into legitimate law; (2) as a way to ensure elections are themselves infused with information and reasoning; (3) as a way to bring reasoning by citizens directly to bear on addressing regulatory issues. In all cases the goal is to use the common reason of equal citizens who are affected by decisions, policies or laws, instead of having them enter into bargaining processes or represent them by means of the aggregation of their individual preferences. Democracy, no matter how fair, no matter how informed, no matter how participatory, does not qualify as deliberative unless reasoning is central to the process of collective decision-making.

There are different versions of deliberative democracy. Rawls’ (1971, 1997) conception of deliberation is based on the idea of public reason, which is defined as “the basic moral and political values that are to determine a constitutional democratic government’s relation to its citizens and their relation to one another”. By means of public deliberation, people settle their disputes with respect and mutual recognition towards each other. Habermas (1998) provides similar conditions in his concept of the “ideal speech situation”. The Rawlsian approach aims at ‘accommodation’ of differences in a pluralistic society without criticizing people’s fundamental views of life, their so-called ‘comprehensive doctrines’ or ‘bringing them into deliberative discussion’. Habermas’ approach does the opposite, by also making moral or philosophical ideas and ideals part of the deliberative challenge. Both Rawls and Habermas advocate a ‘rational consensus’ rather than ‘mere agreement’ in political deliberation. For this purpose, Rawls uses the term ‘reasonable’, and Habermas introduces the notion of ‘communicative rationality’.

Deliberative democrats argue that deliberation (1) enlarges the pool of ideas and information (Cohen 2009), (2) helps us discover truths (Manin 1997; Talisse 2005), (3) can lead us to a better grasp of facts (Hardin 2009), (4) can lead us to discover diverse perspectives, practical stances towards the social world that are informed by the experiences agents have (Bohman 2006), (5) can help us discover the seriousness of our disagreements, and that there is a disagreement after all (Cohen 1986), (6) can lead to a consensus on the better or more “reasonable” solution (Landemore 2012), (7) promotes justice, as it requires full information and equal standing, (8) leads to better epistemic justification and legitimacy than simply voting (Hardin 2009), because political decisions based on deliberation are not simply a product of power and interest but involve public reasons to justify decisions, policies or laws, (9) leads to better arguments, since citizens have to defend their proposals with reasons that are capable of being acknowledged as such by others (Cohen 2009), (10) allows citizens to reflect on their own arguments, which leads to self-discovery and refined arguments (Cohen 1986), and (11) promotes respect, as it requires people to consider the opinions of others despite fundamental differences of outlook (Hardin 2009).

Critics of deliberative democracy argue that full-fledged deliberation is difficult to attain because (1) there is inequality in the deliberative capabilities of citizens, which gives advantages to the rhetorically gifted and to those who possess the cultural capital and argumentative confidence to lead discussions (Ahlström 2012), (2) there is widespread incompetence and political ignorance among the masses (Ahlström 2012), (3) voters are not interested in the common good, but only in their self-interest (Caplan 2008), (4) people are biased and may hold beliefs without investigation, and majority rule will amplify these mistakes and make democratic decisions worse (Caplan 2008), (5) while participation of citizens is possible in small nations, vast numbers of people will inevitably entail a deterioration of participation (Held 2006), as past a certain threshold deliberation turns into a chaotic mess (Landemore 2012), (6) most citizens cannot spend the time to master the issues well enough to take meaningful stands on major issues, since the information-processing and transaction costs are too high (den Hoven 2005), (7) deliberation among like-minded users can cause polarization: when people deliberate on a relatively homogenous argument pool, they consolidate fairly easily, which is bad for outsiders. Evidence from social psychology suggests that it is the viewpoints of the majority, not of the informed minorities, that can be expected to drive the relevant group judgments (Ahlström 2012), and the informed minorities may refrain from disclosing what they know due to social pressure and be reluctant to dissent, thus not submitting their information to deliberation (Sunstein 2007), and (8) forcing participants into deliberation while limiting their arguments to commonly shared rational premises, public reason or the common good will prevent dissenting voices from sharing their perspectives and identities on their own terms (Young 2002).

Filter bubbles are a problem for deliberative democrats mainly because of the low quality of information and the diminishing of information diversity. If bubbles exist, the pool of available information and ideas will be less diverse, and discovering new perspectives, ideas or facts will be more difficult. If we only get to see the things we already agree with on the Internet, discovering disagreement and the unknown will be quite difficult, considering the increasing popularity of the Internet and social media as a source of political information and news (Mitchell et al. 2014). Our arguments will not be refined, as they are not challenged by opposing viewpoints. We will not contest our own ideas and viewpoints and, as a result, will only receive confirming information. This will keep us unaware of disagreements. As a consequence, the quality of arguments and information, and respect toward one another, will suffer.

Republicanism and contestatory democracy

In contemporary political theory and philosophy, republicanism focuses on political liberty, understood as non-domination or independence from arbitrary power. The republican conception of political liberty defines freedom as a sort of structural independence—as the condition of not being subject to arbitrary or uncontrolled power. Pettit (1999) argues that people are free to the extent that no other group has “the capacity to interfere in their affairs on an arbitrary basis”. To ensure that, according to Pettit (1999), there must be an “active, concerned citizenry who invigilate the exercise of government power, challenge its abuses and seek office where necessary”. In this theory, freedom as non-domination supports a conception of democracy where contestability takes the place usually given to consent. The most important implication is not that the government does what the people want, but that people can always contest whatever decision the government has taken. While the republican tradition does not overlook the importance of democratic participation, the primary focus is clearly on avoiding the evils associated with interference and oppression.

Pettit (1999) argues that the media has a major role in forming the public opinion, ensuring non-domination and the possibility of effective contestation. However, Pettit argues, the media often fail badly in performing these roles. According to Pettit, at every site of decision-making (legislative, administrative and judicial), there must be procedures in place to identify and display the considerations relevant to the decision. The citizens should be able to contest these decisions if they find that the considerations did not actually determine the outcome. The decisions must be made under transparency, under threat of scrutiny, and under freedom of information. A group, even if they are a minority, should be able to voice contestation and must be able to speak out in a way that is liable to affect the proposed legislation. They must be able to contest in an effective manner, and they must be able to make themselves heard in decision-making quarters. To provide this, there must be reliable channels of publicity and information in place, so that the performance of the governing parties is systematically brought to attention.

If we apply these norms to the design of online platforms, we can argue that online information platforms must (1) make the right information available to citizens and allow them to track when something important or relevant happens. In this way, citizens can become aware of possible oppression and can become active when they feel there is a need to. This can for instance be achieved by human curation that aims at including important events that might affect the whole of society in everyone’s information diet. It can also be achieved by means of personalization, so that an event that is particularly important for a user can be highlighted for that user. Platforms must also (2) provide effective methods of contestation, so that citizens can make themselves heard with their contestations and affect the proposed legislation or policy. This means that people should not only be able to contest, but also that the contestation should reach a large public, so that it can result in an effective and inclusive discussion.

Filter bubbles are a problem for advocates of contestatory democracy, because they interfere with the realization of both conditions mentioned above. Bubbles block both the incoming and the outgoing information channels. In order to raise critical questions, one must be aware of something that is a candidate for contestation. People cannot protest if they do not know that things relevant to them are happening. A filter bubble can block the reliable channels of publicity and information and may increase the risk that citizens are unaware of important news. Filter bubbles prevent awareness both of the items that people could disagree with and of the information on the basis of which they could justify their reasons for disagreeing. Furthermore, it may turn out to be much more difficult to communicate and share ideas with potentially like-minded others outside one’s filter bubble. For not every post or comment on Facebook will reach your followers, and a website with key information might never make it to the top of one’s Google search results.

Agonism/inclusive political communication

While most deliberative democracy models aim for consensus concerning a ‘common interest’, agonists see politics as a realm of conflict and competition and argue that disagreement is inevitable even in a well-structured deliberative democratic setting, and even if the ideal of consensus regulates meaningful dialogues (Mouffe 2009). According to these critics, different and irreconcilable views will coexist and an overlapping final consensus can never be achieved. Having consensus as the main goal, and refusing a vibrant clash of democratic but opposing political positions, will lead to “apathy and disaffection with political participation” (Mouffe 1999; Young 2002). According to Mouffe (2009), the aim of democratic politics on this agonistic conception should not be seen as overcoming conflict and reaching consensus, because such a consensus would actually be a consensus of the hegemony.

The aim of ‘agonistic pluralism’ then, is to construct the ‘them’ (opposing viewpoint) in such a way that it is no longer perceived as an enemy to be destroyed, but as an ‘adversary’. Thus, conflict must be in center stage in politics and it must only be contained by democratic limits.

An adversary is “somebody whose ideas we combat but whose right to defend those ideas we do not put into question” (Mouffe 2009). Agonistic pluralism requires providing channels through which collective passions will be given ways to express themselves over issues which, while allowing enough possibility for identification, will not construct the opponent as the enemy. The difference with “deliberative democracy” is that ‘agonistic pluralism’ does not eliminate passions from the sphere of the public, in order to reach a consensus, but mobilizes those passions towards democratic designs. Democracy should then be designed so that conflict is accommodated and unequal power relations and hegemony in the society is revealed.

Mouffe (1999) argues that although the advocates of deliberative democracy claim to address pluralism and the complexity of the society, their reference to reason and rationality tends to exclude certain groups from the political arena; therefore, they are essentially not pluralistic.

Similarly, Young (2002) argues that if consensus becomes the ultimate goal, some difficult issues or issues that only concern a minority might be removed from discussion for the sake of agreement and preservation of the common good (Young 2002). The idea of a generalized and impartial public interest that transcends all difference, diversity and division is problematic, because the participants in a political discussion most likely differ in social position or culture. Our democracies contain structural inequalities (e.g., wealth, social and economic power, access to knowledge, status). Some groups have greater material privilege than others, or there might be socially or economically weak minorities. Therefore in such settings “the common good” is likely to express the interests and perspectives of the dominant groups (Young 2002). The perspectives and demands of the less privileged may be asked to be put aside for the sake of a common good whose definition is biased against them.

Young (2002) argues that when structural inequalities generate deep conflicts of interest, processes of political communication are more about struggle than about agreement. However, according to Young, the field of struggle is not equal; some groups and sectors are often at a disadvantage. Fair, open, and inclusive democratic processes should then attend to such disadvantages and institutionalize compensatory measures for exclusion. Democratic institutions and practices must take explicit measures to include the representation of social groups, relatively small minorities, or socially or economically disadvantaged ones. Disorderly, disruptive, annoying, or distracting means of communication are often necessary or effective elements in such efforts to engage others in debate over issues and outcomes. Christiano (2006) argues that, due to cultural differences in society, deep cognitive biases make individuals fallible in understanding their own and others’ interests and in comparing the importance of others’ interests with their own. By default, people will fail to realize the equal advancement of interests in society. Thus, special measures must be taken to make sure that equality is satisfied.

Filter bubbles are a problem for agonists and supporters of inclusive political communication, because they hide or remove channels through which opposing viewpoints can clash vibrantly. Minorities, and those who are disadvantaged due to structural inequalities need special exposure to be able to reach out with their voice to larger publics. However, filters that show us what we already agree with usually do not include such minority voices. If filters only show us what they consider “relevant” for us, then, the only way to reach a large public will be through advertisements or by gaming the filters. This will violate the inclusion norm of modern democracies, as only the wealthy who can afford such advertisements, or technologically advanced minds who can use algorithms to their own advantage will be able to express themselves.

Conclusion

Table 1 summarizes the democracy models we have introduced, the benchmarks they require, and the points of critique they imply concerning the filter bubble. Liberal democrats stress the importance of self-determination, awareness, being able to make choices, and respect for individuals. Filter bubbles are a problem for liberal democrats especially due to restrictions on individual liberty, restrictions on choice and the increase in unawareness. Deliberative democracy attempts to increase information quality, discover the truth, discover facts, discover perspectives and discover disagreements. This in the end leads to better epistemic justifications and better arguments, and it increases legitimacy and respect towards one another. The filter bubble, according to deliberative democrats, hurts civic discourse, mutual understanding and sense-making. Contestatory democracy, on the other hand, focuses on channels that allow citizens to contest effectively if there is a need. It does not aim for deliberation, but it requires citizens to have key information on important issues and to be aware of the oppressors. In contestatory democracy, the media should thus provide reliable channels of publicity, so that the performance of the governing parties is systematically brought to attention and can be contested. The filter bubble is a problem for contestatory democracy because it removes these reliable channels, so that key information on both the topics and the grounds of contestation cannot be sent and received. Agonists criticize the consensus goal of deliberative democrats and argue that other norms, such as inclusion, should also be the goal of democracy. They argue that special attention must be paid to the voice of minorities and other disadvantaged members of society, and that dissent must be continuously present. The filter bubble is a problem for agonists because it will silence radical voices, will only reflect the viewpoints and perspectives of the mainstream, and will change agonism into antagonism.
Table 1

Models of democracy and design criteria

Liberal
Norms: awareness of available preferences; self-determination; autonomy; adaptive preferences; free media; respect for human dignity.
Criticism of the filter bubble: the user is unaware of the availability of options; the user is restrained and individual liberty is curtailed; the media is not free and serves the interests of certain parties (e.g., advertisers); powers are not separated (advertiser and information provider are the same).

Deliberative
Norms: discover facts, perspectives and disagreements; determine common interests; construct identity by self-discovery; refine arguments and provide better epistemic justifications; consensus; respect towards each other’s opinions; a collective spirit; free and equal participants; rationality.
Criticism of the filter bubble: the epistemic quality of information suffers; civic discourse is undermined; there is no need to have better epistemic justifications; respect for other opinions is decreased; legitimacy is more difficult to achieve and there is a loss of a sense of an informational commons; communication suffers as gaining mutual understanding and sense-making is undermined.

Republican and contestatory
Norms: freedom from domination by oppressors; contest matters effectively; be aware of the oppressors.
Criticism of the filter bubble: diminishes one’s ability to contest; diminishes one’s awareness of the oppressors and their potentially manipulative interventions.

Agonistic/inclusive political communication
Norms: conflict rather than consensus; passions rather than rationality; struggle rather than agreement; inclusion (measures must be taken to explicitly include the representation of social groups, relatively small minorities, or socially or economically disadvantaged ones); measures must be taken so that antagonism is transformed into agonism.
Criticism of the filter bubble: the adversary becomes the enemy; minorities are excluded from the democratic process and their voices are lost.

Software design to combat filter bubbles

Many activists, including Pariser (2011), have suggested that users sabotage personalization systems by erasing web history, deleting cookies, using the incognito option, trying other search engines and fooling the personalization system, either by entering fake queries or by liking everything ever produced by their friends. However, these options are not only tedious; they are bad for the user as well. As we will show in this section, personalization algorithms and other tools can actually also be designed and used to broaden a user’s worldview.

As we have seen in “Democracy and filter bubbles: different theories, different benchmarks” section, while filter bubbles should be seen as worrying developments in the digital world from the point of view of democracy, different conceptions and models of democracy point to different undesired consequences of such bubbles, ranging from loss of autonomy to the diminishing epistemic quality of information. In recent years, various tools have been developed by computer scientists either in the industry or in academia to fight filter bubbles. However, as designers hold different values and are assuming different models of democracy either implicitly or explicitly, the tools they develop will reflect those values and democracy models. As has become sufficiently clear in recent studies of ethics of technology (Friedman et al. 2006), technology is not neutral and the values and biases that designers hold will manifest themselves in the end product.

In order to identify the state-of-the-art tools and designs and analyze which criteria and methods they employ, we have created a carefully curated list. To compile this list, between January 2014 and June 2014 we performed the following inquiries: (1) we checked the academic articles in the HCI community that cite Munson and Resnick (2010), one of the first papers to design an experiment and create a tool to fight the filter bubble; (2) we followed HCI researchers on Twitter and included the tools/experiments on the filter bubble that they mentioned; (3) we used the Google search engine with specific keywords, including “filter bubble”, “design” and “selective exposure”, to find non-academic tools. This gave us 15 tools/designs in total.

In this section, we will show that different interpretations of the filter bubble phenomenon have led to different designs, tools and empirical studies. These tools differ in their goals, ranging from personal fulfillment and development of cultural taste to promotion of tolerance and intercultural understanding. We will show that some of the tools even allow the user to strengthen their filter bubble. The tools also differ in their methods, ranging from modifying users’ newsfeeds or search results without notice to visualizing bubbles to increase user awareness. We will show that, while their methods differ, the benchmarks they use to break the filter bubble can be the same, and that a single design can include criteria from multiple of the democracy conceptions discussed in the previous section.

Liberal/user autonomy enhancing

As we have stated in “Liberal view of democracy” section, in the liberal view of democracy, filter bubbles can be seen as a form of market failure that diminishes user control and hence autonomy, hides available options and coerces people in such a way that they cannot get what they want. Users will not get the search results they were looking for, or will not receive the updates they want from friends on a social networking platform. Designers who take this view will develop tools that aim to promote awareness of filter bubbles and attempt to give users some sense of control. User satisfaction and awareness of options and choice seem to be the most common goals. As we will show in this subsection, this view of the filter bubble can be addressed by giving users control over the filters, by increasing awareness of their own biases, or by increasing awareness of the filters that are implemented in common web services.

Munson et al. (2013) developed a browser tool called Balancer that tracks users’ reading activities and shows their reading behavior and biases, in order to increase awareness (see also Fig. 1b). Munson et al. argue that, while many people agree that reading a diverse set of news is good, many do not realize how skewed their own reading behavior is. Balancer therefore shows an approximate histogram of the user’s liberal and conservative page visits, with the hope that this feedback will nudge users to make their reading behavior more balanced. Munson et al. (2013) found that only a very small number of users changed their reading habits (conservatives consuming more liberal items and liberals more conservative ones); the majority did not change their reading habits at all. While Balancer aims for users to reflect on their preferences and, in the long term, to increase the epistemic quality of the incoming information, the primary goal is to increase user awareness. Hence this tool belongs to the user-autonomy-enhancing technologies that are motivated by a liberal conception of democracy.
Fig. 1

a Scoopinion (2014), a browser add-on that displays the user’s news consumption habits. Larger circles represent news outlets from which the user consumed the most items. b Balancer (Munson et al. 2013) is a browser add-on that shows users their biases. In this picture the user is biased towards reading from liberal news outlets
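The kind of feedback Balancer gives can be approximated by a simple tally over visited pages. The following sketch is ours, not Balancer's actual implementation; the outlet-to-leaning table and the naive domain parsing are illustrative assumptions.

```python
from collections import Counter

# Hypothetical outlet->leaning table; Balancer's real outlet
# classification is more elaborate than this illustrative stub.
OUTLET_LEANING = {
    "examplelibnews.com": "liberal",
    "exampleconnews.com": "conservative",
}

def reading_balance(visited_urls):
    """Approximate a Balancer-style histogram of political reading behavior."""
    counts = Counter()
    for url in visited_urls:
        domain = url.split("/")[2]  # naive "https://host/path" parsing
        leaning = OUTLET_LEANING.get(domain)
        if leaning:
            counts[leaning] += 1
    total = sum(counts.values()) or 1  # avoid division by zero
    return {side: counts[side] / total for side in ("liberal", "conservative")}

history = [
    "https://examplelibnews.com/a",
    "https://examplelibnews.com/b",
    "https://exampleconnews.com/c",
]
print(reading_balance(history))  # roughly 2/3 liberal, 1/3 conservative
```

A real add-on would additionally have to decide how to classify outlets missing from its list and how to present the histogram without alienating the user.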

Scoopinion1 is a browser add-on that tracks news sites and the type of stories one reads while using the browser. Scoopinion (see Fig. 1a) provides a visual summary of one’s reading habits by displaying the user’s media fingerprint. The tool also personalizes recommended stories based upon the user’s reading habits, but by displaying the media fingerprint, it assumes that the user will choose to read more diversely. It works with a whitelist of news sites and does not make diverse recommendations. It provides a visualization of users’ information consumption habits to increase their autonomy, but it has no explicit goals such as tolerance or better information quality. Again, this fits a liberal conception of democracy and prioritizes the value of choice autonomy.

Xing et al. (2014) developed a browser add-on called Bobble that allows users to compare their Google search results with those of other profiles worldwide. The tool (see Fig. 2) uses hundreds of nodes to distribute a user’s Google search queries worldwide each time the user performs a Google search. For example, when a user searches Google for the keyword “Obamacare”, this keyword is distributed to 40+ worldwide Bobble clients that perform the same Google search and return the corresponding results. Users can then see which results are displayed in their browser but not in others, and vice versa. It is a tool for users to get an idea of the extent of personalization taking place, and it aims to increase users’ awareness of Google’s filters. However, it does not aim by design to increase deliberation or provide challenging information.
Fig. 2

Bobble (Xing et al. 2014) displays the Google search results that only the user received (in yellow) and results that they missed but others received (in red). (Color figure online)
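The core comparison Bobble performs can be expressed as two set differences over result lists collected from different vantage points. This is a simplified sketch under our own assumptions, not Bobble's code.

```python
def bobble_compare(own_results, other_results_by_client):
    """Compare a user's search results with those seen by other clients.

    Returns the results only this user saw ("yellow" in Fig. 2) and the
    results other clients saw but this user missed ("red" in Fig. 2).
    """
    seen_elsewhere = set()
    for results in other_results_by_client.values():
        seen_elsewhere.update(results)
    own = set(own_results)
    return {"only_mine": own - seen_elsewhere, "missed": seen_elsewhere - own}

own = ["a.example", "b.example", "c.example"]
others = {
    "client_us": ["a.example", "d.example"],
    "client_de": ["a.example", "b.example", "e.example"],
}
diff = bobble_compare(own, others)
print(sorted(diff["only_mine"]))  # ['c.example']
print(sorted(diff["missed"]))     # ['d.example', 'e.example']
```

The real system must additionally match result rankings and deal with clients seeing the same page under different URLs, which this sketch ignores.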

Nagulendra and Vassileva (2014) developed a visualization design to show users their filter bubbles (Fig. 3). This “control and visualization” tool helps users understand how information filtering works in an online peer-to-peer social network. The tool shows users which categories and friends are in their bubble and which ones are not. Further, it allows them to control the algorithm by manipulating the visualization to “escape” the bubble, namely by adding friends on a certain topic to the filters or removing them. The tool aims to maximize users’ control over their filter bubbles, increase awareness of the filter bubble, promote understandability of the filtering mechanism and ultimately increase user satisfaction. It does not, however, make an attempt to expose users to challenging information: if users want to remain in a bubble, the tool will allow them to do so. In this case too, a liberal notion of democracy with an emphasis on user autonomy lies behind the development of the tool.
Fig. 3

Nagulendra and Vassileva (2014)’s software allows users to control their filter bubbles

Deliberative/enhancing epistemic quality of information

As we have mentioned in “Deliberative democracy” section, filter bubbles can be seen as a problem not because they prevent users from getting what they want, but because they diminish the quality of public discussion. Deliberative democracy assumes that users are, or should be, exposed to diverse viewpoints, so that they can discover disagreements, truths and perspectives, and finally make better decisions. Polarization, or exposure to low-quality (but agreeable and relevant) information, will have bad consequences. In order to increase the epistemic quality of information, a wide range of opinions and perspectives on a particular topic may be made more visible, and users can compare their opinions with those of others, even ones opposing their own views. In the end, respect, legitimacy and consensus can be reached. In this subsection, we will list some of the tools that allow users to discover different viewpoints by visualization, by showing pro/con arguments for a controversial topic, by nudging users to listen to others, or by diversifying search results through modifying them for political search queries.

Microsoft’s search engine Bing studied the effect of language use in nudging Bing search engine users (Yom-Tov et al. 2013). In this study (which we will simply refer to as “the Bing Study”), a sample of 179,195 people who used news-related queries was selected, and their political behavior and link click patterns were observed. The researchers found that while 81 % (76 %) of Republicans (Democrats) click on items from one of the most polarized outlets of their own view, they rarely clicked on polarized outlets of the other side (4 and 6 % respectively), suggesting a filter bubble in action. The researchers then modified the Bing search engine’s results page. They manually matched Democratic- to Republican-leaning queries on the same topic (e.g., Obamacare and affordable health care). They then modified the results for the queries for a subset of people who issued them (the treatment group), resulting in a diversified set of results: the results contained items from both sides of the political spectrum, regardless of what the user had searched for. This did not increase the number of clicks on items from the opposing political news outlets. However, when the authors chose websites that use a language similar to the user’s own, they observed a change of 25 % toward the center. The authors thus conclude that when the language model of a document is closer to an individual’s language model, it has a higher chance of being read despite describing an opposite viewpoint. The researchers aimed at “increasing exposure to varied political opinions with a goal of improving (and enhancing) civil discourse” (Yom-Tov et al. 2013).
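The mechanism the Bing Study exploits, preferring opposing-view documents whose language resembles the user's own, can be sketched with a bag-of-words cosine similarity. The scoring below is our illustrative stand-in for the study's language models, not the method actually deployed.

```python
import math
from collections import Counter

def language_model(text):
    """A crude bag-of-words 'language model': lowercase word counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_opposing(user_texts, opposing_docs):
    """Rank opposing-viewpoint documents by closeness to the user's language."""
    user_lm = language_model(" ".join(user_texts))
    return sorted(opposing_docs,
                  key=lambda d: cosine(user_lm, language_model(d)),
                  reverse=True)

clicked = ["affordable health care coverage for families"]
opposing = ["repeal the affordable health care act",
            "socialized medicine destroys markets"]
print(rank_opposing(clicked, opposing)[0])  # the doc sharing the user's vocabulary
```

The study's finding suggests that surfacing the top-ranked documents from such a list, rather than arbitrary opposing items, is what moved readers toward the center.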

ConsiderIt (Kriplean et al. 2012; Freelon et al. 2012) is a deliberation (pro/con) tool developed with the aims of (1) helping people learn about political topics and possible tradeoffs between different opinions, (2) nudging them toward reflective consideration of other voters’ thoughts, and (3) enabling users to see how others weigh tradeoffs. ConsiderIt (Fig. 4) provides an interface where users can create pro/con lists by including existing arguments others have contributed, contribute new points themselves, and use the results of these personal deliberations to expose salient points by summarizing their stance rather than casting a yes/no vote. Users can see ranked lists of items that were popular among full opposers, firm opposers, slight opposers, neutrals, slight supporters, firm supporters and full supporters. In a pilot study called “The Living Voters Guide” (LVG), the system was put to the test during the 2010 Washington state elections, which included proposals on taxes, the sale of alcohol, candy or bottled water, state debt, bail and other political topics. In LVG, 8823 unique visitors browsed the site and 468 people submitted a position on at least one item. In a small follow-up survey, 46.3 % of respondents reported that they had changed their stance on at least one measure, with 56 % of those saying they switched from support to oppose or vice versa; 32 % reported that they moderated their stance and 12 % that they strengthened it (Kriplean et al. 2012).
Fig. 4

ConsiderIt (Kriplean et al. 2012; Freelon et al. 2012) helps people learn about political topics and possible tradeoffs between different opinions
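ConsiderIt's stance summary replaces a yes/no vote with a position on a spectrum, bucketed into the seven groups named above. A minimal sketch of such bucketing follows; the equal-width cutoffs are our assumption, not ConsiderIt's actual thresholds.

```python
BUCKETS = ["full opposer", "firm opposer", "slight opposer", "neutral",
           "slight supporter", "firm supporter", "full supporter"]

def stance_bucket(stance):
    """Map a stance slider value in [-1, 1] to one of seven groups.

    Equal-width cutoffs are assumed here purely for illustration.
    """
    idx = min(int((stance + 1) / 2 * len(BUCKETS)), len(BUCKETS) - 1)
    return BUCKETS[idx]

print(stance_bucket(-1.0))  # full opposer
print(stance_bucket(0.0))   # neutral
print(stance_bucket(0.9))   # full supporter
```

Grouping users this way is what lets the interface rank pro/con points separately for, say, firm opposers versus slight supporters.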

OpinionSpace (Faridani et al. 2010) plots the individual comments in a web forum on a two-dimensional map, based on the commenters’ responses to a short value-based questionnaire. By navigating this space, readers are better able to seek out a diversity of comments as well as prime themselves for engaging with the perspective of someone with different values (Fig. 5). When users inspect an individual comment, they are prompted to rate how much they agree with and respect it. The size of the comment’s dot on the map then grows when people with values different from the speaker’s respect and/or agree with it, helping users seek out comments that resonate widely.
Fig. 5

OpinionSpace (Faridani et al. 2010) allows users to browse a diverse set of ideas, see responses from like-minded participants or responses from participants who differ in opinion
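OpinionSpace positions commenters by their questionnaire responses, and the distance between two response vectors then indicates how different their values are. The sketch below uses plain Euclidean distance to surface the most different voices; the real system projects responses onto a two-dimensional map via dimensionality reduction, which we omit here.

```python
import math

def distance(p, q):
    """Euclidean distance between two questionnaire-response vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def most_different_first(my_answers, comments):
    """Order comments from the most to the least distant in opinion space."""
    return sorted(comments,
                  key=lambda c: distance(my_answers, c["answers"]),
                  reverse=True)

me = [0.9, 0.1, 0.8]  # answers to three value questions, scaled to 0..1
comments = [
    {"text": "I largely agree", "answers": [0.8, 0.2, 0.9]},
    {"text": "I see it very differently", "answers": [0.1, 0.9, 0.2]},
]
print(most_different_first(me, comments)[0]["text"])  # the dissenting comment
```

Browsing by distance rather than by popularity is what lets a reader deliberately step toward unfamiliar value positions.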

Reflect (Kriplean et al. 2011) modifies the comment sections of webpages in order to encourage listening and perspective taking. It adds a listening box next to every comment, where other users are encouraged to succinctly restate the points the commenter is making, even if they disagree (Fig. 6). This is a nudge to listen to other users. Other readers can afterwards read the original comment and the listeners’ interpretations of what was said, supporting broader understanding of the discussion. In this way, users do not have to “like” or “recommend” a comment to recognize or appreciate the speaker. By nudging towards listening and reflecting, an empathetic and constructive normative environment is formed, positively affecting not only those who speak and reflect but also those who read. In mid-September 2011, the popular online discussion platform Slashdot enabled Reflect on four stories. During the trial, 734 reflections were written by 247 discussants, an average of 1.0 reflection per comment. While flaming and pure replies were present (31 %), the majority of the reflections were neutral restatements, different neutral interpretations or meta observations. The tool also allowed the community to rate reflections, making reflections under a certain threshold invisible. After users downvoted flaming or cheeky replies, almost 80 % of all the visible reflections were neutral.
Fig. 6

Reflect (Kriplean et al. 2011) nudges users to listen to each other by making them restate the points that the commenter is making, even if there is disagreement

Rbutr2 is a community-driven Chrome add-on that informs users when the webpage they are viewing has been disputed, rebutted or contradicted elsewhere on the Internet (Fig. 7). Users can add opposing viewpoints for an item, so that future readers will see that an opposing viewpoint exists for the item they are reading. Rbutr aims to increase information quality and informed opinion by promoting fact- and logic-checking.
Fig. 7

Rbutr is a Chrome add-on that informs a user when the webpage they are visiting has been disputed

There are other tools and studies that aim to increase the epistemic quality of information. Liao and Fu (2013, 2014) studied the effects of perceived threat, level of topic involvement, and expertise and position indicators. Munson and Resnick (2010) studied the effect of nudging by sorting or highlighting agreeable news items and experimenting with the ratio of challenging to agreeable news items. Newscube (Park et al. 2009, 2011) is a tool that detects different aspects of a news story using keyword analysis and displays news items with different perspectives to users in order to decrease media bias. Hypothes.is3 is a community peer-review tool that allows users to highlight text and add comments and sentence-level critique. Political Blend (Doris-Down et al. 2013) is a mobile application that matches people with different political views and nudges them to have a cup of coffee face to face and discuss politics.

Table 2 summarizes our analysis of the studied tools.
Table 2

Tools that are developed to combat filter bubbles, the benchmarks they use and the models they belong to

Model

Examples

Design criteria (benchmarks)

Liberal

Balancer, Scoopinion, Bobble, Nagulendra and Vassileva’s control and visualization tool

Allow users to be aware of their own (and the platform’s) biases

Understand biases

Allow the user to control incoming information and filters

Deliberative

Bing Study, ConsiderIt, OpinionSpace, Rbutr, Newscube, Political Blend

Discover diverse facts, perspectives and disagreements

Reflection on own (and others’) arguments

Aim for informed debate with epistemic justifications

Increase the epistemic quality of information

Discussion

One of the key findings of our analysis is that the norms specified by agonistic and contestatory models of democracy are completely missing from all of the tools that aim to fight the filter bubble. While it is possible to come across critical voices, disadvantaged views or contestation using tools such as OpinionSpace or ConsiderIt, it is also highly likely that these voices and views get lost among the “popular” items that interest the majority of the audience. However, as McQuail and van Cuilenburg (1983) have argued, media should not only proportionally reflect differences in politics, religion, culture and social conditions, but also provide equal access to their channels for all people and all ideas in society. If population preferences were uniformly distributed over society, then satisfying the first condition (reflection) would also satisfy the second (equal access). However, this is seldom the case (Van Cuilenburg 1999): population preferences often tend toward the middle and the mainstream. In such cases, the media will not satisfy the openness norm, and the views of minorities will not reach a larger public. This is undesirable, because social change usually begins with minority views and movements (van Cuilenburg 1999).
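The distinction between the reflection norm and the openness (equal access) norm can be made concrete with two simple gap measures: how far a platform's supply of viewpoints deviates from population preferences, and how far it deviates from a uniform distribution. The numbers and the mean-absolute-deviation measure below are illustrative assumptions, not van Cuilenburg's own operationalization.

```python
def deviation(supply, target):
    """Mean absolute deviation between two viewpoint distributions (0 = match)."""
    return sum(abs(supply[k] - target[k]) for k in target) / len(target)

viewpoints = ["mainstream", "minority_a", "minority_b"]
population = {"mainstream": 0.80, "minority_a": 0.15, "minority_b": 0.05}
media = {"mainstream": 0.90, "minority_a": 0.08, "minority_b": 0.02}
uniform = {v: 1 / len(viewpoints) for v in viewpoints}

reflection_gap = deviation(media, population)  # distance from mirroring society
openness_gap = deviation(media, uniform)       # distance from equal access
print(round(reflection_gap, 3), round(openness_gap, 3))
# the supply mirrors society fairly well but is far from granting equal access
```

With mainstream-heavy preferences, a supply that scores well on reflection can still score badly on openness, which is exactly the gap the tools surveyed above leave unaddressed.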

In modern democracies, some citizens are able to buy enough media time to dominate public discussion, while others are excluded. If political outcomes result from an exclusive process, in which those with greater power or wealth dominate, then from the point of view of democratic norms those outcomes are illegitimate. However, even if people are formally included in the democratic process, inclusion issues arise if they are not taken seriously or treated with respect. The dominant party may not find their arguments worthy of consideration. People then, while formally having a chance to express their ideas, actually lack an effective opportunity to influence the thinking of others. Van Cuilenburg (1999) argues that the Internet has to be assessed in terms of its ability to give open access to new and creative ideas, opinions and knowledge that the old media do not yet cover; otherwise it will only be more of the same. Recent research shows that equal access might be a problem on the Internet as well. Bozdag et al. (2014) studied the diversity of political information for Dutch and Turkish Twitter users, analyzing about 2000 users for each country and around 10 million tweets. They found that, while minorities in both countries produce roughly the same number of tweets, minority voices cannot reach a significant number of users in the Turkish Twittersphere, whereas they can in the Dutch one.

Several agonistic design attempts have been developed in industry over the years to reveal hegemony (one of the requirements of agonistic design). Most of these tools perform social network analysis to identify actors and their connections (networks of force) and to represent the multifaceted nature of hegemony. For instance, the project Mulksuzlestirme (“dispossession” in Turkish) compiles data collectively and then uses mapping and visualization techniques to show the relations between capital and power within urban transformations in Turkey. The interactive map (see Fig. 8) displays the established partnerships between the government and private developers and shows to which investors collected taxes have been channeled through the redevelopment/privatization of public spaces.4 For instance, it shows that one corporation that is involved in many government projects also owns major news organizations in the country, including the Turkish version of CNN. By means of visualization, the designer allows users to browse and discover relationships between the media and corporations, revealing hegemony.
Fig. 8

Screenshot from Mulksuzlestirme (dispossession) project. The map shows the connections between a corporation, several media outlets that it owns and urban transformation projects that it has received

While tools such as Mulksuzlestirme might reveal key information for political debates and elections, many of these tools are not widely known. Tools like these can spread on unfiltered platforms such as Twitter if powerful actors and opinion leaders spread them through their followers (Jürgens et al. 2011). However, Twitter has stated that it plans to deploy a personalized algorithmic timeline in the future (Panzarino 2014). If one wants one's message to spread on a filtered/personalized platform, it has to bypass the filters or perhaps trick them. To accomplish this, one either has to pay for advertisements (and must hence possess the necessary financial means) or must have the technical skills (such as search engine optimization). Many people have neither of these means, yet they might hold key information that is vital for contestation. Further, we could not find designs or tools that implement other benchmarks of agonism, such as special attention to minority voices.

We do not know why only the norms of the liberal and deliberative democracy models are represented in the tools developed to break filter bubbles. This might be due to a lack of exposure of designers to different democracy theories. It can also be the case that designers are aware of all the models and their implied norms, but choose to implement only certain ones; we have no evidence of reasoned choices to this effect on the part of the designers. Future work, such as interviewing the designers, could shed some light on this issue. However, the body of literature on democratic theory shows that there is great variety in conceptions of democracy, as one would expect with central philosophical notions that we use to think about and order society, such as equity, justice, property, privacy and freedom. These are essentially contested concepts. As John Dewey observed long before the Internet, social media and other platforms were invented, democracy is a central concept and implies an ongoing cooperative social experimentation process (Anderson 2006). Dewey was of the opinion that we live in an ever-evolving world that requires the continuous reconstruction of ideas and ideals to survive and thrive; the idea of democracy is no exception in this respect (Garrison 2008). Therefore, online intermediaries that fulfill a public role must take the necessary measures to be open to, and ready to experiment with, a plurality of democracy models, including ones that propagate agonistic and contestatory elements. It is possible that these two models of democracy are simply not very popular and that this explains why designers are unaware of the norms and benchmarks they imply. It would be beneficial if designers were exposed to a variety of conceptions and models of democracy, in order to realize that each model has strengths and weaknesses.

An information intermediary could include agonistic and contestatory elements in its design by: (1) ensuring that minorities and other marginalized groups receive special attention, so that they can reach a larger audience. This must be designed carefully, as research shows that minority views are usually ignored by the majority and the alternative voice then has only a formal, not a meaningful, place in the debate (Witschge 2008). (2) Providing mechanisms and channels of publicity, so that the performance of the relevant parties (e.g., the government) is known. This would include highlighting information on important political issues and placing it in users’ newsfeeds/search results, even if the algorithm would normally not do so, in order to make users aware of the oppressors. (3) Designing platforms for effective contestation. If key information is present, it must ideally reach the relevant users, so that they too can contest the decision makers. (4) Allowing people to be notified or alerted when something important or relevant happens, not only commercially but also politically. (5) Designing the tools in such a way that opposing viewpoints are actually considered and reflected upon; simply showing contradictory views might otherwise lead to flaming (Diakopoulos and Naaman 2011). (6) Emphasizing to the user that algorithmic selection is always a contest, one that chooses from contrary perspectives. This could be done by showing that the selected viewpoint is a selection out of many possible ones (Crawford 2013). (7) Always offering the ability to choose between real alternative viewpoints, not just the dominant ones.
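Measure (1), giving underrepresented viewpoints special attention, could for instance be prototyped as a re-ranking step on top of an existing relevance score. The boost factor and exposure decay below are arbitrary illustrative choices, not a tested design.

```python
def agonistic_rerank(items, exposure_counts, boost=2.0):
    """Re-rank (relevance, viewpoint) pairs, lifting rarely seen viewpoints.

    items: list of (relevance_score, viewpoint) tuples.
    exposure_counts: how often the user has already seen each viewpoint.
    """
    def adjusted(item):
        score, viewpoint = item
        seen = exposure_counts.get(viewpoint, 0)
        novelty = boost if seen == 0 else 1.0   # extra lift for unseen viewpoints
        return score * novelty / (1 + 0.1 * seen)  # dampen over-exposed views
    return sorted(items, key=adjusted, reverse=True)

feed = [(0.9, "mainstream"), (0.5, "minority")]
print(agonistic_rerank(feed, {"mainstream": 20, "minority": 0}))
# the minority item now outranks the higher-scored mainstream item
```

Such a step would need to be combined with measure (5) above, since merely surfacing the minority item does not guarantee it is considered rather than flamed.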

Recent studies indicate that most people are unaware of filters in social media (Eslami et al. 2015; Rader and Gray 2015). We can thus expect that the tools we have mentioned in this paper are not widely known. Major online platforms such as Google and Facebook often argue that they are not news platforms and do not have an editorial role, and that they therefore will not design algorithms to promote diversity. For instance, Facebook’s project management director for News Feed states: “there’s a line that we can’t cross, which is deciding that a specific piece of information–be it news, political, religious, etc.—is something we should be promoting. It’s just a very, very slippery slope that I think we have to be very careful not go down.” (Luckerson 2015). However, research shows that these platforms are increasingly used to receive diverse opinions. According to a recent study in the US, nearly half (48 %) of 10,000 panelists said they had accessed news about politics and government on Facebook alone in the past week (Mitchell et al. 2014). A more recent study indicates that 86 % of Millennials usually turn to social media to receive diverse opinions, more than to any other medium (American Press Institute 2014). Between 2010 and 2012, traffic to news sites from various social media platforms grew by 57 %, and leading news organizations now get around a quarter of their site visits from social networking platforms, some even 40 % (Pentina and Tarafdar 2014; Lafrance 2015; Meyer 2015). If we also consider the dominant position of these platforms in the search and social media markets worldwide (White 2015; Rosoff 2015; Sterling 2015; Whittaker 2015), we can argue that these platforms are indeed important news and opinion sources.

If we consider these platforms important news and opinion sources, then we can argue that they should aim to increase viewpoint diversity, a value deemed important by almost all democracy models. They could adapt and experiment with the tools we have listed in “Software design to combat filter bubbles” section. Experimenting seems unavoidable, as the current design attempts to break the bubbles are all experimental. Breaking bubbles requires an interdisciplinary approach, as several disciplines, including human–computer interaction, multimedia information retrieval, media and communication studies and computer ethics, all have something to contribute to the design of diversity-sensitive algorithms. More experiments in different contexts will need to be conducted in order to find which techniques work and which do not. Once we have more concrete results, the systems could apply different strategies for different types of users. While these different designs to fight the filter bubble are very valuable for understanding how users’ attitudes can be changed to remedy polarization, the actual goal must be made more explicit and must be better supported by theory and public deliberation. Otherwise, user autonomy might be diminished, and in turn the honesty and trustworthiness of the platforms could be questioned.

Conclusion

In this paper, we have pointed out that different democracy theories emphasize different aspects of the filter bubble, whether it is the loss of autonomy, the decrease in the epistemic quality of information, the loss of the ability for effective contestation or the loss of effective channels that display the performance of governing bodies. Most tools that aim to fight the bubbles do not define the filter bubble explicitly. They also do not reveal their goals explicitly, or simply define them as “hearing the other side”. Further, most of these studies concern US politics. As some democracy theorists and communication scholars argue, viewpoint diversity is improved not only by aiming for consensus and hearing pro/con arguments, but also by allowing minorities and marginal groups to reach a larger public and by ensuring that citizens are able to contest effectively. As we have mentioned earlier, minority reach can be a problem in social media for certain political cultures.

Our findings indicate that the majority of the tools we have studied to combat filter bubbles are designed with the norms required by liberal or deliberative models of democracy in mind. More work is needed to reveal designers’ understanding of democracy and to see whether they are aware of other norms. As we have shown in this paper, all models have their weaknesses. It would thus be beneficial if designers were exposed to other conceptions of democracy, to realize that there is not just one model. As democracy itself is an ongoing cooperative social experimentation process, it would be beneficial for all to experiment with the norms of different conceptions and theories of democracy, and not just the popular ones.


Compliance with ethical standards

Conflict of interest

The authors have no conflict of interest to declare.

References

  1. Ahlström, K. (2012). Why deliberative democracy is (still) untenable. Public Affairs Quarterly, 26(3). http://philpapers.org/rec/AHLWDD-3.
  2. American Press Institute. (2014). The personal news cycle: How Americans choose to get their news. http://www.americanpressinstitute.org/publications/reports/survey-research/personal-news-cycle/.
  3. Anderson, E. (2006). The epistemology of democracy. Episteme, 3(1–2), 8–22. doi:10.3366/epi.2006.3.1-2.8.CrossRefGoogle Scholar
  4. Bentham, J. (1780). An introduction to the principles of morals and legislation. http://www.econlib.org/library/Bentham/bnthPML.html.
  5. Bohman, J. (2006). Deliberative democracy and the epistemic benefits of diversity. Episteme, 3(03), 175–191. doi:10.3366/epi.2006.3.3.175.CrossRefGoogle Scholar
  6. Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209–227.CrossRefGoogle Scholar
  7. Bozdag, E., Gao, Q., Houben, G. J., & Warnier, M. (2014). Does offline political segregation affect the filter bubble? An empirical analysis of information diversity for Dutch and Turkish Twitter users. Computers in Human Behavior, 41, 405–415.CrossRefGoogle Scholar
  8. Caplan, B. (2008). The myth of the rational voter: Why democracies choose bad policies. New edition. Princeton, NJ; Woodstock: Princeton University Press. http://www.amazon.com/The-Myth-Rational-Voter-Democracies/dp/0691138737.
  9. Christiano, T. (2006). Democracy. Stanford Encyclopedia of Philosophy, July 27. http://plato.stanford.edu/entries/democracy/.
  10. Cohen, J. (1986). An epistemic conception of democracy. Ethics, 97(1), 26–38. http://www.jstor.org/stable/2381404.
  11. Cohen, J. (1997). Deliberation and democratic legitimacy. In J. Bohman & W. Rehg (Eds.), Deliberative democracy: Essays on reason and politics. Cambridge: MIT Press.
  12. Cohen, J. (2009). Reflections on deliberative democracy. In T. Christiano & J. Christman (Eds.), Contemporary debates in political philosophy. West-Sussex: Blackwell.
  13. Crawford, K. (2013). Can an algorithm be agonistic? Ten scenes about living in calculated publics. In Governing algorithms 2013.
  14. den Hoven, J. (2005). E-democracy, E-contestation and the monitorial citizen. Ethics and Information Technology, 7(2), 51–59. doi:10.1007/s10676-005-4581-4.
  15. Diakopoulos, N. (2014). Algorithmic accountability reporting: On the investigation of black boxes. Tow Center for Digital Journalism Brief, Columbia University.
  16. Diakopoulos, N., & Naaman, M. (2011). Towards quality discourse in online news comments. In Proceedings of the ACM 2011 conference on computer supported cooperative work (pp. 133–142). doi:10.1145/1958824.1958844.
  17. DiSalvo, C. (2012). Adversarial design. Cambridge: MIT Press.
  18. Doris-Down, A., Husayn, V., & Gilbert, E. (2013). Political blend: An application designed to bring people together based on political differences. In Proceedings of international conference on communities and technologies (C&T) (pp. 120–130). doi:10.1145/2482991.2483002.
  19. Dunn, J. (1979). Western political theory in the face of the future (Vol. 3). Cambridge: Cambridge University Press.
  20. Elster, J. (1997). The market and the forum: Three varieties of political theory. In J. Bohman & W. Rehg (Eds.), Deliberative democracy: Essays on reason and politics (pp. 3–34). Cambridge: The MIT Press. doi:10.1177/019145370102700505.
  21. Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., et al. (2015). ‘I always assumed that I wasn’t really that close to [her]’: Reasoning about invisible algorithms in the news feed. In Human factors in computing systems conference (CHI), 2015. Seoul, Korea.
  22. Faridani, S., Ephrat, B., Kimiko, R., & Ken, G. (2010). Opinion space: A scalable tool for browsing online comments. In Proceedings of the 28th international conference on human factors in computing systems (pp. 1175–1184). doi:10.1145/1753326.1753502.
  23. Freelon, D. G., Kriplean, T., Morgan, J., Bennett, W. L., & Borning, A. (2012). Facilitating diverse political engagement with the living voters guide. Journal of Information Technology & Politics, 9(3), 279–297. doi:10.1080/19331681.2012.665755.
  24. Friedman, B., Kahn Jr., P. H., & Borning, A. (2006). Value sensitive design and information systems: Three case studies. In Human-computer interaction and management information systems: Foundations. New York: M. E. Sharpe.
  25. Garrison, J. (2008). Reconstructing democracy and recontextualizing Deweyan pragmatism. In J. Garrison (Ed.), Reconstructing democracy, recontextualizing Dewey: Pragmatism and interactive constructivism in the twenty-first century (pp. 1–17). Albany: State University of New York Press.
  26. Gutmann, A., & Thompson, D. (2004). Why deliberative democracy? Princeton, NJ: Princeton University Press.
  27. Habermas, J. (1998). Between facts and norms: Contributions to a discourse theory of law and democracy. Cambridge: MIT Press.
  28. Hardin, R. (2009). Deliberative democracy. In T. Christiano & J. Christman (Eds.), Contemporary debates in political philosophy. West-Sussex: Blackwell.
  29. Held, D. (2006). Models of democracy (3rd ed.). Stanford: Stanford University Press.
  30. Jürgens, P., Jungherr, A., & Schoen, H. (2011). Small worlds with a difference: New gatekeepers and the filtering of political information on Twitter. In WebSci’11.
  31. Kriplean, T., Morgan, J., Freelon, D., Borning, A., & Bennett, L. (2012). Supporting reflective public thought with ConsiderIt. In Proceedings of the ACM 2012 conference on computer supported cooperative work (CSCW’12) (p. 265). New York, NY: ACM Press. doi:10.1145/2145204.2145249.
  32. Kriplean, T., Toomim, M., Morgan, J. T., Borning, A., & Ko, A. J. (2011). REFLECT: Supporting active listening and grounding on the web through restatement. In Computer supported cooperative work (CSCW).
  33. Lafrance, A. (2015). Facebook is eating the Internet. The Atlantic.
  34. Landemore, H. (2012). Democratic reason: The mechanisms of collective intelligence in politics. In H. Landemore & J. Elster (Eds.), Collective wisdom: Principles and mechanisms. Cambridge: Cambridge University Press.
  35. Liao, Q. V., & Fu, W. T. (2013). Beyond the filter bubble: Interactive effects of perceived threat and topic involvement on selective exposure to information. In Proceedings of the SIGCHI conference on human factors in computing systems. doi:10.1145/2470654.2481326.
  36. Liao, Q. V., & Fu, W. T. (2014). Expert voices in echo chambers: Effects of source expertise indicators on exposure to diverse opinions. In Proceedings of the 32nd annual ACM conference on human factors in computing systems (pp. 2745–2754).
  37. Luckerson, V. (2015). Here’s how Facebook’s news feed actually works. Time. http://time.com/3950525/facebook-news-feed-algorithm/.
  38. Madison, J. (1787). Federalist 10. The Federalist Papers, no. 10 (pp. 1–7). http://www.brucesabin.com/pdf_files/readings/Federalist_10.pdf.
  39. Manin, B. (1997). The principles of representative government. Cambridge: Cambridge University Press. http://www.cambridge.org/us/academic/subjects/politics-international-relations/political-theory/principles-representative-government.
  40. McQuail, D., & van Cuilenburg, J. J. (1983). Diversity as a media policy goal: A strategy for evaluative research and a Netherlands case study. International Communication Gazette, 31(3), 145–162.
  41. Meyer, R. (2015). Facebook as a press baron. The Atlantic.
  42. Mitchell, A., Gottfried, J., Kiley, J., & Matsa, K. (2014). Political polarization & media habits. http://www.journalism.org/files/2014/10/Political-Polarization-and-Media-Habits-FINAL-REPORT-11-10-14-2.pdf.
  43. Mouffe, C. (1999). Deliberative democracy or agonistic pluralism? Social Research, 66(3), 745–758.
  44. Mouffe, C. (2009). The democratic paradox. London: Verso.
  45. Munson, S. A., Lee, S. Y., & Resnick, P. (2013). Encouraging reading of diverse political viewpoints with a browser widget. In International conference on weblogs and social media (ICWSM), Boston.
  46. Munson, S. A., & Resnick, P. (2010). Presenting diverse political opinions: How and how much. In Proceedings of the SIGCHI conference on human factors in computing systems, CHI 2010 (pp. 1457–1466). New York, NY, USA.
  47. Nagulendra, S., & Vassileva, J. (2014). Understanding and controlling the filter bubble through interactive visualization: A user study. In HT 14 (pp. 107–115).
  48. Panzarino, M. (2014). Twitter’s timeline could get (more) algorithmic. TechCrunch.
  49. Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York: Penguin Press.
  50. Park, S., Kang, S., Chung, S., & Song, J. (2009). NewsCube: Delivering multiple aspects of news to mitigate media bias. In Proceedings of CHI’09, the SIGCHI conference on human factors in computing systems (pp. 443–453). doi:10.1145/1518701.1518772.
  51. Park, S., Lee, K., & Song, J. (2011). Contrasting opposing views of news articles on contentious issues. In Proceedings of the 49th annual meeting of the association for computational linguistics: Human language technologies (Vol. 1, pp. 340–349). https://www.aclweb.org/anthology-new/P/P11/P11-1035.pdf.
  52. Pentina, I., & Tarafdar, M. (2014). From ‘information’ to ‘knowing’: Exploring the role of social media in contemporary news consumption. Computers in Human Behavior, 35, 211–223. doi:10.1016/j.chb.2014.02.045.
  53. Pettit, P. (1999). Republicanism: A theory of freedom and government. Oxford: Oxford University Press.
  54. Rader, E., & Gray, R. (2015). Understanding user beliefs about algorithmic curation in the Facebook news feed. In CHI’15 proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 173–182). New York: ACM. http://dl.acm.org/citation.cfm?id=2702174.
  55. Rawls, J. (1971). A theory of justice. Harvard: Harvard University Press.
  56. Rawls, J. (1997). The idea of public reason. In J. Bohman & W. Rehg (Eds.), Deliberative democracy: Essays on reason and politics (p. 447). Cambridge: MIT Press.
  57. Rosoff, M. (2015). Here’s how dominant Google is in Europe. Business Insider.
  58. Sterling, G. (2015). Bing reaches 20 percent search market share milestone in US. Search Engine Land.
  59. Sunstein, C. R. (2007). Republic.com 2.0 (1st ed.). Princeton: Princeton University Press.
  60. Talisse, R. B. (2005). Deliberativist responses to activist challenges: A continuation of Young’s dialectic. Philosophy & Social Criticism. doi:10.1177/0191453705052978.
  61. Van Cuilenburg, J. (1999). On competition, access and diversity in media, old and new: Some remarks for communications policy in the information age. New Media & Society, 1(2), 183–207.
  62. White, A. (2015). Google accused of abusing power on search as Android probed. Bloomberg. http://www.bloomberg.com/news/articles/2015-04-15/eu-accuses-google-of-antitrust-violations-starts-android-probe.
  63. Whittaker, Z. (2015). Facebook Q1: Mixed earnings; 1.44 billion monthly active users. ZDNet. Retrieved from http://www.zdnet.com/article/facebook-q1-2015-earnings.
  64. Witschge, T. (2008). Examining online public discourse in context: A mixed method approach. Javnost—The Public, 15(2), 75–91.
  65. Xing, X., Meng, W., & Doozan, D. (2014). Exposing inconsistent web search results with bobble. In PAM 2014.Google Scholar
  66. Yom-Tov, E., Dumais, S., & Guo, Q. (2013). Promoting civil discourse through search engine diversity. Social Science Computer Review. doi:10.1177/0894439313506838.
  67. Young, I. M. (2002). Inclusion and democracy (Oxford political theory). Oxford: Oxford University Press.

Copyright information

© The Author(s) 2015

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. Faculty Values, Technology and Innovation, Delft University of Technology, Delft, The Netherlands