
Critical Human Security and Cyberspace: Enablement Besides Constraint

Chapter in Digitalisation and Human Security

Part of the book series: New Security Challenges (NSECH)


Abstract

This chapter applies a critical human security perspective to cyberspace. It contends that emancipation provides human security with its core purpose, as it frees people both from the constraints imposed by despotic regimes (freedom from fear) and from an exploitative economic system (freedom from want). The idea that human emancipation can challenge the pre-existing order is reflected in the literature that foresees the digital revolution ushering in an age of transparency and empowerment. Conceiving security as emancipation, this chapter utilises the conceptual tool of the “three faces” of power to examine how human agency is simultaneously enabled by the digital revolution (freedom from) and constrained by it (oppressed by).


Notes

  1.

    Booth (2007) has subsequently added to this original definition of emancipation by including the caveat that one’s freedom cannot come at the expense of others and, further, that emancipation serves as a means of understanding the world we inhabit and as a framework for actualising change. I quote Booth’s original definition here to show that a critical approach to human security, even though it was not self-identified as such, predates the UN Human Development Report.

  2.

    Once algorithms are conceived of as restricting access and constraining human agency, challenging those elites who use algorithms for this purpose becomes an example of the open face of power.

  3.

    On 25 May 2018, the General Data Protection Regulation (GDPR), a new EU law, came into effect, changing how companies can collect personal data and how that data can be used. It applies to all companies that offer their services to EU citizens, even if they are based outside the EU. Failure to abide by the requirements of the GDPR can result in a fine of up to 20 million euros or 4% of a company’s global annual turnover, whichever is higher. The severity of this punishment explains the deluge of emails users received from companies holding their data, requesting that users review their data privacy policies. In terms of the secretive power of framing how humans can use the internet, the GDPR makes the ways in which companies intend to use personal data more explicit (consent must be given by ticking a box, for example), and this consent must be clearly distinguishable from other matters in the terms of service. It is also not legal to require additional personal data in return for extra or premium services, as this does not constitute freely given consent. Users also have the right to access the data held about them and to require that errors be rectified or erased. The erasure of accurate data can also be requested.

    Within hours of the GDPR coming into effect, a privacy protection group called none-of-your-business (noyb) filed four complaints against Facebook, Google, Instagram and WhatsApp. It accused these companies of forcing users to consent to targeted advertising in order to use their services; where that consent was withheld, users were prevented from using the service. Max Schrems, chairman of noyb, claimed that this did not amount to people being given a free choice (noyb, 2018). Whether the complaint is upheld is likely to rest upon whether social networking sites or instant messaging services can argue that processing users’ data for targeted advertising is a necessary requirement of the service they provide; that is, that processing data for targeted advertising is a core feature of what it means to share in cyberspace. If this argument is not persuasive, then processing data for targeted advertising will require separate consent that the user can decline without being denied use of the service (Hern, 2018). This could have significant consequences for the economic model that underpins social networking sites but, in terms of the secretive power, because consent is more explicitly required, it is likely to make users more aware that their data are as much shared as protected.


Author information


Corresponding author

Correspondence to Alan Collins.



Copyright information

© 2020 The Author(s)


Cite this chapter

Collins, A. (2020). Critical Human Security and Cyberspace: Enablement Besides Constraint. In: Salminen, M., Zojer, G., Hossain, K. (eds) Digitalisation and Human Security. New Security Challenges. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-48070-7_4
