Abstract
The most important resource to improve technologies in the field of artificial intelligence is data. Two types of policies are crucial in this respect: privacy and data-sharing regulations, and the use of surveillance technologies for policing. Both types of policies vary substantially across countries and political regimes. This chapter examines how authoritarian and democratic political institutions can influence the quality of research in artificial intelligence, and the availability of large-scale datasets to improve and train deep learning algorithms. We focus mainly on the case of China, and find that—ceteris paribus—authoritarian political institutions continue to have a negative effect on innovation. They can, however, have a positive effect on research in deep learning, via the availability of large-scale datasets that have been obtained through government surveillance. We propose a research agenda to study which of the two effects might dominate in a race for leadership in artificial intelligence between countries with different political institutions, such as the USA and China.
Notes
- 1.
While Deep Blue beat Garry Kasparov in 1997 by using brute-force computing, this approach does not work for Go, a game with roughly 250 possible moves each turn and a typical game depth of 150 moves, resulting in about 250^150, or 10^360, possible games—more than the number of atoms in the observable universe.
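The arithmetic in this note can be sanity-checked in a few lines (a minimal sketch; the branching factor of 250 and the game depth of 150 are the figures cited above):

```python
import math

# Figures cited in the note: ~250 legal moves per turn (branching factor),
# ~150 moves in a typical game (depth).
branching, depth = 250, 150

# The order of magnitude of 250**150 is depth * log10(branching).
order_of_magnitude = depth * math.log10(branching)
print(round(order_of_magnitude, 1))  # ~359.7, i.e. 250^150 is about 10^360

# For comparison, the observable universe contains roughly 10^80 atoms.
```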
- 2.
Lee (2018, page 3) compares the effect of DeepMind’s win against Lee Sedol to America’s Sputnik moment: “Overnight, China plunged into an artificial intelligence fever. The buzz did not quite rival America’s reaction to Sputnik, but it lit a fire under the Chinese technology community that has been burning ever since”.
- 3.
As a proxy for research quality in artificial intelligence, we use the Nature Index for the year 2020. The index counts publications in applied artificial intelligence that appeared in a selection of high-quality science journals, including Nature, Science, the Proceedings of the National Academy of Sciences (PNAS), and others. The list of journals can be accessed at https://www.natureindex.com/faq, and the source of our data can be found here: https://www.natureindex.com/supplements/nature-index-2020-ai/tables/countries. As a proxy for the overall quantity of publications in artificial intelligence, we use the Nature Index Dimensions database, which counts all publications in AI from a specific country between 2015 and 2019: https://www.natureindex.com/supplements/nature-index-2020-ai/tables/dimensions-countries.
- 4.
A certain amount of censorship, surveillance and government control can of course also be found in democratic political systems. We therefore assume a continuum of repressiveness, from complete freedom to complete state control, with the negative effects on creativity increasing with growing government control.
- 5.
See also “The Panopticon is Already Here”, The Atlantic, September 2020, https://www.theatlantic.com/magazine/archive/2020/09/china-ai-surveillance/614197/.
- 6.
As China does not hold elections and still uses repression, it is sometimes not seen as a classic “informational autocracy” as invoked by Guriev and Treisman (2019, 2020). However, China has—more than any other country—perfected the strategic censorship of the internet (King et al. 2013, 2014; Roberts 2018), as well as modern authoritarian surveillance technologies (Kostka 2019; Kostka and Antoine 2020; Strittmatter 2020). Following Ringen (2016) and Minzner (2018), we therefore argue that China has taken the idea of an “informational autocracy” to the next level, by combining sophisticated surveillance and censorship techniques with targeted repression.
- 7.
Whereas repression has a directly negative effect on the ability of the public to meet and protest, censorship limits the ability of the public to communicate, to access independent information, and to subsequently coordinate collective action.
- 10.
A detailed discussion of this trade-off for the case of Russia and China can be found in Libman and Rochlitz (2019), chapter 4.
- 11.
The Streisand effect occurs when an effort to conceal information increases its value or makes it more attractive (Hobbs and Roberts 2018). The effect is named after Barbra Streisand, whose attempt to have pictures of her home removed from the internet attracted even more attention to them.
- 12.
In a series of interviews with foreign investors, carried out in Shanghai in May 2016 by one of the authors of this study, nearly all investors complained that internet censorship and slow connections made their activities significantly more difficult. See also “China Internet Restrictions Hurting Business, Western Companies Say”, Wall Street Journal, 12.02.2015, https://www.wsj.com/articles/BL-CJB-25952.
- 13.
According to an article published in The Atlantic, in 2013 Peking University charged $1.50 a month for unlimited domestic internet use, but $14.50 for unlimited access to the World Wide Web (“How Internet Censorship Is Curbing Innovation in China”, The Atlantic, 22.04.2013, https://www.theatlantic.com/china/archive/2013/04/how-internet-censorship-is-curbing-innovation-in-china/275188/).
- 14.
“Russian academics decry law change that threatens scientific outreach”, Nature, 12.02.2021, https://www.nature.com/articles/d41586-021-00385-5.
- 15.
In Western democracies, digital surveillance technologies are used, for example, to deliver targeted ads during election campaigns, in what Tufekci (2014) calls “computational politics”, or when personal data is exploited for commercial purposes, in what has become known as “surveillance capitalism” (Zuboff 2019).
- 16.
See also “China uses cover of Covid to expand Big Brother surveillance and coercion”, The Times, 25.04.2021, https://www.thetimes.co.uk/article/china-uses-cover-of-covid-to-expand-big-brother-surveillance-and-coercion-ndpz3klmw.
- 17.
For an overview of other theoretical perspectives on creativity and innovation, see Anderson et al. (2014).
- 18.
A related focus is proposed by Cerasoli et al. (2014), who review 40 years of research on external incentives and intrinsic motivation. While we focus on external constraints and intrinsic motivation, Cerasoli et al. (2014, page 983) are in line with our argument when they show that intrinsic motivation is a strong predictor of performance, but suffers from “crowding-out” when external incentives are too directly tied to performance outcomes.
- 20.
Other studies relying on different theoretical frameworks identify a similar importance of autonomy as a catalyst for creative performance; see, e.g., Li et al. (2018).
- 21.
An internal locus of control means that a person is autonomously able to control her or his actions, while an external locus of control signifies that the person’s actions are determined by an outside actor or institution.
- 22.
Here the difference between controlling and informational limits lies in how an external constraint of behavior is framed. When the activities of subjects were met with “shoulds” and “musts,” their intrinsic motivation and creative performance were reduced. Conveying the same behavioral constraint with compassion and without external pressure had no negative effect on intrinsic motivation.
- 24.
“China declared world’s largest producer of scientific articles”, Nature, 18.01.2018, https://www.nature.com/articles/d41586-018-00927-4.
- 25.
During the last 30 years, the Chinese bureaucratic system has proven that it can very efficiently incentivize state officials to exert effort to reach pre-determined performance criteria, such as economic growth (Jia et al. 2015; Li and Zhou 2005; Libman and Rochlitz 2019; Rochlitz et al. 2015; Yao and Zhang 2015). Schedlinsky et al. (2020) however show experimentally that even for relatively simple tasks, surveillance can reduce the motivation and hence the effort exerted. Hence, more research is needed to better understand how surveillance can affect performance with respect to complex and less complex tasks in authoritarian environments, and if—for example—a difference exists between performance in academic and scientific environments, and performance in bureaucratic settings.
- 26.
According to a recent report by China File, central and local Chinese governments spent US$ 2.1 billion between 2016 and 2020 to buy surveillance cameras, “State of Surveillance: Government Documents Reveal New Evidence on China’s Efforts to Monitor Its People”, China File, 30.10.2020, https://www.chinafile.com/state-surveillance-china.
- 27.
“What China Expects from Businesses: Total Surrender”, The New York Times, 19.07.2021, https://www.nytimes.com/2021/07/19/technology/what-china-expects-from-businesses-total-surrender.html.
References
Acar, O. A., Tarakci, M., & van Knippenberg, D. (2019). Creativity and innovation under constraints: A cross-disciplinary integrative review. Journal of Management, 45(1):96–121.
Acemoglu, D. & Robinson, J. A. (2006). Economic origins of dictatorship and democracy. New York: Cambridge University Press.
Aho, B. & Duffield, R. (2020). Beyond surveillance capitalism: Privacy, regulation and big data in Europe and China. Economy and Society, 49(2):187–212.
Amabile, T. M. (1983). The social psychology of creativity: A componential conceptualization. Journal of Personality and Social Psychology, 45(2), 357–376.
Amabile, T. M. (1988). A model of creativity and innovation in organisations. Research in Organisational Behaviour, 10, 123–167.
Amabile, T. M. (1996). Creativity in context: Update to the social psychology of creativity. Boulder: Westview Press.
Amabile, T. M. & Pratt, M. G. (2016). The dynamic componential model of creativity and innovation in organizations: Making progress, making meaning. Research in Organizational Behavior, 36, 157–183.
Amabile, T., Goldfarb, P., & Brackfield, S. C. (1990). Social influences on creativity: Evaluation, coaction, and surveillance. Creativity Research Journal, 3, 6–21.
Anderson, N., Potočnik, K., & Zhou, J. (2014). Innovation and creativity in organizations. Journal of Management, 40(5), 1297–1333.
Arenal, A., Armuña, C., Feijoo, C., Ramos, S., Xu, Z., & Moreno, A. (2020). Innovation ecosystems theory revisited: The case of artificial intelligence in China. Telecommunications Policy, 44(6), 101960.
Attanasi, G., Chessa, M., Gallen, S. G., & Llerena, P. (2020). A survey on experimental elicitation of creativity in economics. GREDEG Working Papers, 20.
Aydin, B. (2021). Politically motivated precarization of academic and journalistic lives under authoritarian neoliberalism: The case of Turkey. Globalizations, 19(5), 677–695.
Beraja, M., Yang, D., & Yuchtman, N. (2021). Data-intensive innovation and the state: Evidence from AI firms in China. NBER Working Paper No. 27723.
Blaydes, L. (2018). State of repression: Iraq under Saddam Hussein. Princeton: Princeton University Press.
Bory, P. (2019). Deep new: The shifting narratives of artificial intelligence from Deep Blue to AlphaGo. Convergence: The International Journal of Research into New Media Technologies, 25(4), 627–642.
Buckley, N., Reuter, J., Rochlitz, M., & Aisin, A. (2022). Staying out of trouble: Criminal cases against Russian mayors. Comparative Political Studies. https://doi.org/10.1177/00104140211047399.
Bueno de Mesquita, B., Smith, A., Siverson, R. M., & Morrow, J. D. (2003). The logic of political survival. Cambridge, MA: MIT Press.
Cao, C., Simon, D. F., & Suttmeier, R. P. (2009). China’s innovation challenge. Innovation, 11(2), 253–259.
Cerasoli, C. P., Nicklin, J. M., & Ford, M. T. (2014). Intrinsic motivation and extrinsic incentives jointly predict performance: a 40-year meta-analysis. Psychological Bulletin, 140(4), 980–1008.
Chen, Y. & Cheung, A. S. Y. (2017). The transparent self under big data profiling: Privacy and Chinese legislation on the social credit system. The Journal of Comparative Law, 12(2), 356–378.
Deci, E. L. (1975). Intrinsic motivation. Boston: Springer.
Deci, E. L. & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. Boston: Springer.
Deci, E. L. & Ryan, R. M. (1987). The support of autonomy and the control of behavior. Journal of Personality and Social Psychology, 53(6), 1024–1037.
Deci, E. L. & Ryan, R. M. (1990). A motivational approach to self: Integration in personality. Nebraska Symposium on Motivation. Nebraska Symposium on Motivation, 38, 237–288.
Diamond, L. (2010). Liberation technology. Journal of Democracy, 21(3), 69–83.
Ding, J. (2018). Deciphering China’s AI dream. Oxford: Centre for the Governance of AI, Future of Humanity Institute, University of Oxford.
Dixon, L., Ristenpart, T., & Shrimpton, T. (2016). Network traffic obfuscation and automated internet censorship. IEEE Security & Privacy, 14(6), 43–53.
Domingos, P. (2015). The master algorithm: How the quest for the ultimate learning machine will remake our world. London: Penguin Books.
Edmond, C. (2013). Information manipulation, coordination, and regime change. The Review of Economic Studies, 80(4), 1422–1458.
Egorov, G. & Sonin, K. (2020). The political economics of non-democracy. CEPR Discussion Paper, 2020, 1–55.
Enyedi, Z. (2018). Democratic backsliding and academic freedom in Hungary. Perspectives on Politics, 16(4), 1067–1074.
Enzle, M. E. & Anderson, S. C. (1993). Surveillant intentions and intrinsic motivation. Journal of Personality and Social Psychology, 64(2), 257–266.
Feldstein, S. (2019). The road to digital unfreedom: How artificial intelligence is reshaping repression. Journal of Democracy, 30(1), 40–52.
Fieberg, C., Hesse, M., Loy, T., & Metko, D. (2022). Machine learning in accounting research. In Hornuf, L. (Ed.). Diginomics research perspectives: The role of digitalization in business and society. Cham: Springer International Publishing.
Foucault, M. (1995). Discipline and punish: The birth of the prison. New York: Vintage Books.
Froming, W. J., Walker, G., & Lopyan, K. J. (1982). Public and private self-awareness: When personal attitudes conflict with societal expectations. Journal of Experimental Social Psychology, 18(5), 476–487.
Gallagher, M. & Hanson, J. K. (2015). Power tool or dull blade? Selectorate theory for autocracies. Annual Review of Political Science, 18, 367–385.
Gao, X., Qiu, M., & Liu, M. (2021). Network traffic obfuscation and automated internet censorship. 2021 8th IEEE International Conference on Cyber Security and Cloud Computing (CSCloud), pp. 149–154.
Graham, L. R. (1987). Science, philosophy, and human behavior in the Soviet Union. New York: Columbia University Press.
Graham, L. R. (1993). Science in Russia and the Soviet Union: A short history. New York: Cambridge University Press.
Gregory, P., Schröder, P., & Sonin, K. (2011). Rational dictators and the killing of innocents: Data from Stalin’s archives. Journal of Comparative Economics, 39(1), 34–42.
Griffiths, J. (2019). The great firewall of China: How to build and control an alternative version of the internet. London: Zed Books.
Gunitsky, S. (2015). Corrupting the cyber-commons: Social media as a tool of autocratic stability. Perspectives on Politics, 13(1), 42–54.
Guriev, S. & Treisman, D. (2019). Informational autocrats. Journal of Economic Perspectives, 33(4), 100–127.
Guriev, S. & Treisman, D. (2020). A theory of informational autocracy. Journal of Public Economics, 186, 104158.
Hagemann, V. & Klug, K. (2022). Human resource management in a digital environment. In Hornuf, L. (Ed.). Diginomics research perspectives: The role of digitalization in business and society. Cham: Springer International Publishing.
Halevy, A., Norvig, P., & Pereira, F. (2009). The unreasonable effectiveness of data. Intelligent Systems, IEEE, 24, 8–12.
Hao, Z. & Guo, Z. (2016). Professors as intellectuals in China: Political identities and roles in a provincial university. The China Quarterly, 228, 1039–1060.
Hennessey, B. A. (2003). The social psychology of creativity. Scandinavian Journal of Educational Research, 47(3), 253–271.
Hennessey, B. A. & Amabile, T. M. (2010). Creativity. Annual Review of Psychology, 61(1), 569–598.
Hey, T. (2009). The fourth paradigm: Data-intensive scientific discovery. Redmond: Microsoft Research.
Hinton, G. E. & Salakhutdinov, R. R. (2006). Reducing the dimensionality of data with neural networks. Science, 313(5786), 504–507.
Hinton, G., Osindero, S., & Teh, Y. (2006). A fast learning algorithm for deep belief nets. Neural Computation, 18(7), 1527–1554.
Hobbs, W. R. & Roberts, M. E. (2018). How sudden censorship can increase access to information. American Political Science Review, 112(3), 621–636.
Howlett, Z. M. (2021). Meritocracy and its discontents: Anxiety and the national college entrance exam in China. Ithaca: Cornell University Press.
Huang, H. & Xu, C. (1999). Institutions, innovations, and growth. American Economic Review, 89(2), 438–443.
Jia, K. & Kenney, M. (2016). Mobile internet platform business models in China: Vertical hierarchies, horizontal conglomerates, or business groups? Berkeley Roundtable on the International Economy, BRIE Working Paper No. 2016-6.
Jia, R., Kudamatsu, M., & Seim, D. (2015). Political selection in China: The complementary roles of connections and performance. Journal of the European Economic Association, 13(4), 631–668.
Jia, K., Kenney, M., Mattila, J., & Seppälä, T. (2018). The application of artificial intelligence at Chinese digital platform giants: Baidu, Alibaba and Tencent. Berkeley Roundtable on the International Economy, BRIE Working Paper No. 2018-2, p. 81.
Jiang, J. (2020). The eyes and ears of the authoritarian regime: Mass reporting in China. Journal of Contemporary Asia, 51(5), 828–847.
Josephson, P. R. (2005). Totalitarian science and technology. Amherst: Humanity Books.
Kalgin, A. (2016). Implementation of performance management in regional government in Russia: Evidence of data manipulation. Public Management Review, 18(1), 110–138.
Karpan, A. (2019). Troll factories: Russia’s web brigades. New York: Greenhaven Publishing.
King, G., Pan, J., & Roberts, M. E. (2013). How censorship in China allows government criticism but silences collective expression. American Political Science Review, 107(2), 326–343.
King, G., Pan, J., & Roberts, M. E. (2014). Reverse-engineering censorship in China: Randomized experimentation and participant observation. Science, 345(6199), 1251722.
King, G., Pan, J., & Roberts, M. E. (2017). How the Chinese government fabricates social media posts for strategic distraction, not engaged argument. American Political Science Review, 111(3), 484–501.
Knight, A. & Creemers, R. (2021). Going viral: The social credit system and COVID-19. Working Paper, available at SSRN. https://ssrn.com/abstract=3770208.
Knutsen, C. H. (2015). Why democracies outgrow autocracies in the long run: Civil liberties, information flows and technological change. Kyklos, 68(3), 357–384.
Koestner, R., Ryan, R. M., Bernieri, F., & Holt, K. (1984). Setting limits on children’s behavior: The differential effects of controlling vs. informational styles on intrinsic motivation and creativity. Journal of Personality, 52(3), 233–248.
Kostka, G. (2019). China’s social credit systems and public opinion: Explaining high levels of approval. New Media & Society, 21(7), 1565–1593.
Kostka, G. & Antoine, L. (2020). Fostering model citizenship: Behavioral responses to China’s emerging social credit systems. Policy & Internet, 12(3), 256–289.
Krizhevsky, A. (2009). Learning multiple layers of features from tiny images. Working Paper, University of Toronto.
Krizhevsky, A., Sutskever, I., & Hinton, G. (2012). ImageNet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems, 25, 1097–1105.
Kshetri, N. (2014). China’s data privacy regulations: A tricky tradeoff between ICT’s productive utilization and cybercontrol. IEEE Security & Privacy, 12(4), 38–45.
Kshetri, N. (2017). The evolution of the internet of things industry and market in China: An interplay of institutions, demands and supply. Telecommunications Policy, 41(1), 49–67.
Kshetri, N. (2020). China’s emergence as the global FinTech capital and implications for Southeast Asia. Asia Policy, 27(1), 61–81.
LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521, 436–444.
Lee, K.-F. (2018). AI superpowers: China, Silicon Valley, and the new world order. Boston: Houghton Mifflin Harcourt.
Lepper, M. R. & Greene, D. (1975). Turning play into work: Effects of adult surveillance and extrinsic rewards on children’s intrinsic motivation. Journal of Personality and Social Psychology, 31(3), 479–486.
Li, H. & Zhou, L.-A. (2005). Political turnover and economic performance: The incentive role of personnel control in China. Journal of Public Economics, 89(9–10), 1743–1762.
Li, H., Li, F., & Chen, T. (2018). A motivational–cognitive model of creativity and the role of autonomy. Journal of Business Research, 92, 179–188.
Liang, F., Das, V., Kostyuk, N., & Hussain, M. M. (2018). Constructing a data-driven society: China’s social credit system as a state surveillance infrastructure. Policy & Internet, 10(4), 415–453.
Libman, A. & Rochlitz, M. (2019). Federalism in China and Russia: Story of success and story of failure? Cheltenham: Edward Elgar.
Linvill, D. L. & Warren, P. L. (2020). Troll factories: Manufacturing specialized disinformation on Twitter. Political Communication, 37(4), 447–467.
Litchfield, R. C., Ford, C. M., & Gentry, R. J. (2015). Linking individual creativity to organizational innovation. The Journal of Creative Behavior, 49(4), 279–294.
Liu, X. (2017). The governance in the development of public universities in China. Journal of Higher Education Policy and Management, 39(3), 266–281.
Medawar, J. & Pyke, D. (2012). Hitler’s gift: The true story of the scientists expelled by the Nazi regime. New York: Arcade Publishing.
Minzner, C. (2018). End of an era: How China’s authoritarian revival is undermining its rise. New York: Oxford University Press.
Mokyr, J. (1992). The lever of riches: Technological creativity and economic progress. Oxford: Oxford University Press.
Montagnes, P. & Wolton, S. (2019). Mass purges: Top-down accountability in autocracy. American Political Science Review, 113(4), 1045–1059.
Moore, W. H. (1998). Repression and dissent: Substitution, context, and timing. American Journal of Political Science, 42(3), 851–873.
Moreno, L. & Pedreno, A. (2021). Europe vs USA and China: How to reverse decline in the age of artificial intelligence. Seattle: Kindle Direct Publishing.
Moskowitz, M. L. (2013). Go nation: Chinese masculinities and the game of Weiqi in China. Berkeley: University of California Press.
Murray, F., Aghion, P., Dewatripont, M., Kolev, J., & Stern, S. (2016). Of mice and academics: Examining the effect of openness on innovation. American Economic Journal: Economic Policy, 8(1), 212–252.
Pan, J. (2020). Welfare for autocrats: How social assistance in China cares for its rulers. Oxford: Oxford University Press.
Pan, J. & Chen, K. (2018). Concealing corruption: How Chinese officials distort upward reporting of online grievances. American Political Science Review, 112(3), 602–620.
Pan, J. & Siegel, A. A. (2020). How Saudi crackdowns fail to silence online dissent. American Political Science Review, 114(1), 109–125.
Perry, E. J. (2020). Educated acquiescence: How academia sustains authoritarianism in China. Theory and Society, 49(1), 1–22.
Peterson, D. (2020). Designing alternatives to China’s repressive surveillance state. Georgetown: Center for Security and Emerging Technology, Georgetown University.
Petrov, N. & Rochlitz, M. (2019). Control over the security services in periods of political uncertainty: A comparative study of China and Russia. Russian Politics, 4(4), 546–573.
Pijetlovic, D. & Mueller-Christ, G. (2022). HumanRoboLab: Experiments with chatbots in management education at universities. In Hornuf, L. (Ed.). Diginomics research perspectives: The role of digitalization in business and society. Cham: Springer International Publishing.
Pittman, T. S., Davey, M. E., Alafat, K. A., Wetherill, K. V., & Kramer, N. A. (1980). Informational versus controlling verbal rewards. Personality and Social Psychology Bulletin, 6(2), 228–233.
Plant, R. W. & Ryan, R. M. (1985). Intrinsic motivation and the effects of self-consciousness, self-awareness, and ego-involvement: An investigation of internally controlling styles. Journal of Personality, 53(3), 435–449.
Qiang, X. (2019). The road to digital unfreedom: President Xi’s surveillance state. Journal of Democracy, 30(1), 53–67.
Qiu, J., Wu, Q., Ding, G., Xu, Y., & Feng, S. (2016). A survey of machine learning for big data processing. EURASIP Journal on Advances in Signal Processing, 67, 1–16.
Ramadan, Z. (2018). The gamification of trust: The case of China’s “social credit”. Marketing Intelligence & Planning, 36(1), 93–107.
Reiss, S. & Sushinsky, L. W. (1975). Overjustification, competing responses, and the acquisition of intrinsic interest. Journal of Personality and Social Psychology, 31(6), 1116–1125.
Righi, R., Samoili, S., López Cobo, M., Vázquez-Prada Baillet, M., Cardona, M., & De Prato, G. (2020). The AI techno-economic complex system: Worldwide landscape, thematic subdomains and technological collaborations. Telecommunications Policy, 44(6), 101943.
Ringen, S. (2016). The perfect dictatorship: China in the 21st century. Hong Kong: Hong Kong University Press.
Roberts, M. E. (2018). Censored: Distraction and diversion inside China’s great firewall. Princeton: Princeton University Press.
Roberts, M. E. (2020). Resilience to online censorship. Annual Review of Political Science, 23(1), 401–419.
Rochlitz, M., Kulpina, V., Remington, T., & Yakovlev, A. (2015). Performance incentives and economic growth: Regional officials in Russia and China. Eurasian Geography and Economics, 56(4), 421–445.
Rosenberg, D. & Tarasenko, G. (2020). Innovation for despots? How dictators and democratic leaders differ in stifling innovation and misusing natural resources across 114 countries. Energy Research & Social Science, 68, 101543.
Ruan, L., Knockel, J., & Crete-Nishihate, M. (2017). We (can’t) chat: “709 Crackdown” discussions blocked on Weibo and WeChat. Citizen Lab Research Report No. 91. Toronto: University of Toronto.
Ryan, R. M. & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55(1), 68–78.
Ryan, R. M. & Deci, E. L. (2019). Brick by brick: The origins, development, and future of self-determination theory. Advances in Motivation Science, 6, 111–156.
Savage, N. (2020). The race to the top among the world’s leaders in artificial intelligence. Nature, 588(7837), 102–104.
Schedlinsky, I., Schmidt, M., & Wöhrmann, A. (2020). Interaction of information and control systems: How the perception of behavior control affects the motivational effect of relative performance information. Accounting, Organizations and Society, 86, 101171.
Schulte, B. (2019). Innovation and control: Universities, the knowledge economy and the authoritarian state in China. Nordic Journal of Studies in Educational Policy, 5(1), 30–42.
Shao, C. (2020). The surveillance experience of Chinese university students and the value of privacy in the surveillance society. PhD Dissertation, Chapel Hill: University of North Carolina.
Silve, F. & Plekhanov, A. (2018). Institutions, innovation and growth: Evidence from industry data. Economics of Transition and Institutional Change, 26(3), 335–362.
Silver, D., Huang, A., Maddison, C. J., Guez, A., Sifre, L., Van Den Driessche, G., Schrittwieser, J., Antonoglou, I., Panneershelvam, V., Lanctot, M., Dieleman, S., Grewe, D., Nham, J., Kalchbrenner, N., Sutskever, I., Lillicrap, T., Leach, M., Kavukcuoglu, K., Graepel, T., & Hassabis, D. (2016). Mastering the game of Go with deep neural networks and tree search. Nature, 529, 484–489.
Silver, D., Schrittwieser, J., Simonyan, K., Antonoglou, I., Huang, A., Guez, A., Hubert, T., Baker, L., Lai, M., Bolton, A., Chen, Y., Lillicrap, T., Hui, F., Sifre, L., van den Driessche, G., Graepel, T., & Hassabis, D. (2017). Mastering the game of Go without human knowledge. Nature, 550, 354–359.
Smuha, N. A. (2021). From a ‘race to AI’ to a ‘race to AI regulation’: Regulatory competition for artificial intelligence. Law, Innovation and Technology, 13(1), 57–84.
Soldatov, A. & Rochlitz, M. (2018). The siloviki in Russian politics. In Treisman, D. (Ed.). The new autocracy: Information, politics, and policy in Putin’s Russia, pp. 83–108. Washington, DC: Brookings Institution Press.
Spitz, M. (2017). Daten - das Öl des 21. Jahrhunderts? Nachhaltigkeit im digitalen Zeitalter. Hamburg: Hoffman und Campe.
Stern, R. E., Liebman, B. L., Roberts, M. E., & Wang, A. Z. (2021). Automating fairness? Artificial intelligence in the Chinese courts. Columbia Journal of Transnational Law, 59(515), 515–553.
Stokes, R. G. (2000). Constructing socialism: Technology and change in East Germany, 1945–1990. Baltimore: Johns Hopkins University Press.
Strittmatter, K. (2020). We have been harmonized: Life in China’s surveillance state. New York: Custom House.
Su, Z., Xu, X., & Cao, X. (2021). What explains popular support for government surveillance in China? Journal of Information Technology & Politics, forthcoming.
Sun, C., Shrivastava, A., Singh, S., & Gupta, A. (2017). Revisiting unreasonable effectiveness of data in deep learning era. CoRR, abs/1707.02968.
Svolik, M. (2009). Power sharing and leadership dynamics in authoritarian regimes. American Journal of Political Science, 53(2), 477–494.
Tai, Y. & Fu, K.-w. (2020). Specificity, conflict, and focal point: A systematic investigation into social media censorship in China. Journal of Communication, 70(6), 842–867.
Tang, R. & Tang, S. (2018). Democracy’s unique advantage in promoting economic growth: Quantitative evidence for a new institutional theory. Kyklos, 71(4), 642–666.
Tebaldi, E. & Elmslie, B. (2008). Institutions, innovation and economic growth. MPRA Working Paper, No. 9683.
Tiffert, G. D. (2019). Peering down the memory hole: Censorship, digitization, and the fragility of our knowledge base. The American Historical Review, 124(2), 550–568.
Tufekci, Z. (2014). Engineering the public: Big data, surveillance and computational politics. First Monday, 19(7).
Waldinger, F. (2010). Quality matters: The expulsion of professors and the consequences for PhD student outcomes in Nazi Germany. Journal of Political Economy, 118(4), 787–831.
Waldinger, F. (2012). Peer effects in science: Evidence from the dismissal of scientists in Nazi Germany. Review of Economic Studies, 79(2), 838–861.
Wallace, J. L. (2016). Juking the stats? Authoritarian information problems in China. British Journal of Political Science, 46(1), 11–29.
Werbach, K. (2022). Panopticon reborn: Social credit as regulation for the algorithmic age. University of Illinois Law Review.
Wong, M. Y. & Kwong, Y.-h. (2019). Academic censorship in China: The case of The China Quarterly. PS: Political Science & Politics, 52(2), 287–292.
Wu, Y., Lau, T., Atkin, D., & Lin, C. (2011). A comparative study of online privacy regulations in the U.S. and China. Telecommunications Policy, 35(7), 603–616.
Xiang, N. (2019). Red AI: Victories and warnings from China’s rise in artificial intelligence. Amazon Digital Services.
Xu, X. (2021). To repress or to co-opt? Authoritarian control in the age of digital surveillance. American Journal of Political Science, 65(2), 309–325.
Xu, X., Kostka, G., & Cao, X. (2021). Information control and public support for social credit systems in China. Working Paper.
Yan, X. (2014). Engineering stability: Authoritarian political control over university students in post-Deng China. The China Quarterly, 218, 493–513.
Yao, Y. & Zhang, M. (2015). Subnational leaders and economic growth: Evidence from Chinese cities. Journal of Economic Growth, 20, 405–436.
Zeng, J. (2020). Artificial intelligence and China’s authoritarian governance. International Affairs, 96(6), 1441–1459.
Zhang, H., Ni, W., Li, J., & Zhang, J. (2020). Artificial intelligence-based traditional Chinese medicine assistive diagnostic system: validation study. JMIR Med Inform, 8(6), e17608.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York: PublicAffairs.
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Karpa, D., Klarl, T., Rochlitz, M. (2022). Artificial Intelligence, Surveillance, and Big Data. In: Hornuf, L. (eds) Diginomics Research Perspectives. Advanced Studies in Diginomics and Digitalization. Springer, Cham. https://doi.org/10.1007/978-3-031-04063-4_8
DOI: https://doi.org/10.1007/978-3-031-04063-4_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-04062-7
Online ISBN: 978-3-031-04063-4
eBook Packages: Business and Management (R0)