
Evolution, Implementation and Practice of Internet Self-regulation, Co-regulation and Public–Private Collaboration

Self- and Co-regulation in Cybercrime, Cybersecurity and National Security

Part of the book series: SpringerBriefs in Cybersecurity ((BRIEFSCYBER))


Abstract

This chapter examines the practical evolution of Internet self- and co-regulation and reflects on current approaches in the fields of cybercrime, cybersecurity and national security. First, it provides insights into the development of self- and co-regulatory approaches to cybercrime and cybersecurity in the multi-stakeholder environment from the beginning of the Internet service provider industry. Second, it highlights the differences in the ecosystem of stakeholders involved in each area. Detailed analysis focuses on existing forms of collaboration, highlighting the differences and difficulties in transferring forms of cooperation and successful approaches from one area to another. The last part analyses the problems of multi-stakeholder approaches. It also examines the drawbacks of existing forms of public–private collaboration that can be attributed to a specific area (cybercrime, cybersecurity or national security). Finally, this chapter offers suggestions on the way forward for self- and co-regulation in securing cyberspace. It provides unique insights into the role of the Internet industry in combating cybercrime, improving cybersecurity and supporting national security. It identifies the current practical and economic limits on cooperation between industry and state agencies, and describes the balance between human rights and data privacy on the one hand and crime investigation on the other. It concludes with the challenge of states over-reaching their mandate: when national security becomes political interference in human rights.


Notes

  1.

    Additional technical details are available at http://en.wikipedia.org/wiki/Usenet (last accessed 11 Dec 2014).

  2.

    http://textfiles.com/magazines/CUD/cud0862.txt.

  3.

    http://www.computerworld.co.nz/article/519610/british_police_list_133_obscene_newsgroups/ (last accessed 8 Dec 2014).

  4.

    http://www.independent.co.uk/life-style/the-list-and-the-hysteria-1361446.html (last accessed 8 Dec 2014).

  5.

    http://news.cnet.com/U.K.-banning-133-newsgroups/2100-1023_3-221558.html (last accessed 8 Dec 2014).

  6.

    Ibid.

  7.

    http://www.cyber-rights.org/reports/governan.htm (last accessed 7 Dec 2014). Akdeniz [1].

  8.

    Idem. (“The UK police believed that the creators or possessors of pseudo-photographs would end up abusing children, so the purpose of the new legislation may be seen as to criminalise acts preparatory to abuse, and also to close possible future loopholes in the prosecution of such cases, as it may be very difficult to separate a pseudo-photograph from a real photograph.

    Although pseudo-photographs can be created without the involvement of real children, there is a justifiable fear that harm to children is associated with all child pornography. The Williams Committee stated:

    Few people would be prepared to take the risk where children are concerned and just as the law recognised that children should be protected against sexual behaviour which they are too young to properly consent to, it is almost universally agreed that this should apply to participation in pornography.’

    On the other hand, there are arguments that pseudo-photographs are not harmful. The children involved in child pornography may suffer physical or mental injury, but with pseudo-photographs, the situation is quite different. These photographs are created only by the use of computers. There is no involvement of children in production and there is no direct harm to children in their use. However there is substantial evidence that photographs of children engaged in sexual activity are used as tools for the further molestation of other children, and photographs or pseudo-photographs will be used interchangeably for this purpose.”).

  9.

    http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/1997_1/akdeniz2/ (last accessed 7 Dec 2014).

  10.

    Ibid.

  11.

    http://www.theguardian.com/technology/2000/apr/20/freespeech2 (last accessed 5 Dec 2014).

  12.

    The Global Information Infrastructure (GII)—The Framework. http://clinton4.nara.gov/WH/New/Commerce/read.html (last accessed 8 Dec 2014).

  13.

    http://web.mclink.it/MC8216/netmark/attach/bonn_en.htm#Heading01 (last accessed 7 Dec 2014).

  14.

    http://www.theguardian.com/society/2006/mar/31/childrensservices.northernireland (last accessed 5 Dec 2014).

  15.

    Microsoft originally released Internet Explorer 1.0 in August 1995.

  16.

    Netscape Navigator 1.0 initial release was on 15 December 1994.

  17.

    http://www.w3.org/PICS/ (last accessed 8 Dec 2014). The PICS specification enables labels (metadata) to be associated with Internet content. It was originally designed to help parents and teachers control what children access on the Internet, but it also facilitates other uses for labels, including code signing and privacy. The PICS platform is one on which other rating services and filtering software have been built. PICS has been superseded by the Protocol for Web Description Resources (POWDER).

  18.

    http://www.justice.ie/en/JELR/Pages/Illegal_use_of_the_Internet_report Page 15 (last accessed Dec 14).

  19.

    www.euroispa.org (last accessed 8 December 2014).

  20.

    https://international.eco.de/about.html (last accessed 8 Dec 2014).

  21.

    http://www.ispa.org.uk/about-us/ispas-industry-role/ (last accessed on 8 Dec 2014).

  22.

    http://www.ispai.ie/about/ (last accessed 8 Dec 2014).

  23.

    http://ec.europa.eu/justice_home/daphnetoolkit/html/projects/dpt_1998_045_c_en.html (last accessed 8 Dec 2014).

  24.

    http://www.unesco.org/bpi/eng/unescopress/1999/99-219e.shtml (last accessed 9 Dec 2014).

  25.

    http://www.cyber-rights.org/reports/interdev.htm (last accessed 9 Dec 2014).

  26.

    http://europa.eu/legislation_summaries/information_society/Internet/l24190_en.htm (last accessed 8 Dec 2014).

  27.

    Key priorities of the EU e-Skills strategy “e-Skills for the twenty-first century” COM(2007)496.

  28.

    http://ec.europa.eu/justice/data-protection/article-29/documentation/other-document/files/2013/20131205_wp29_letter_to_cybercrime_committee.pdf (last accessed 10 Jan 2015).

  29.

    Article 29 Directive 95/46/EC of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (OJ L 281, 23.11.1995, p. 31) sets up a Working Party on the Protection of Individuals with regard to the Processing of Personal Data. The “Article 29 Working Party” has advisory status and acts independently.

  30.

    http://www.un.org/en/documents/udhr/.

  31.

    http://www.echr.coe.int/Documents/Convention_ENG.pdf.

  32.

    http://www.coe.int/t/dghl/cooperation/economiccrime/cybercrime/Documents/LEA_ISP/default_en.asp (last accessed 4 Jan 2015).

  33.

    Chapter 1, Sect. 1.5.3.

  34.

    http://www.interpol.int/Crime-areas/Crimes-against-children/Crimes-against-children (last accessed 4 Jan 2015).

  35.

    http://www.missingkids.com/Exploitation/Industry (last accessed 4 Jan 2015).

  36.

    http://www.microsoft.com/global/en-us/news/publishingimages/ImageGallery/Images/Infographics/PhotoDNA/flowchart_photodna_Web.jpg (last accessed 4 Jan 2015).

  37.

    http://www.microsoft.com/en-us/news/presskits/photodna/docs/PhotoDNAFS.doc (last accessed 4 Jan 2015).

  38.

    http://www.videntifier.com/news/article/advanced-video-and-image-analysis-in-interpol-child-abuse-database (last accessed 4 Jan 2015).

  39.

    http://ec.europa.eu/information_society/apps/projects/factsheet/index.cfm?project_ref=SIP-2008-TP-131801 (last accessed 4 Jan 2015).

  40.

    http://scc-sentinel.lancs.ac.uk/icop/ (last accessed 4 Jan 2015).

  41.

    https://edri.org/human-rights-and-privatised-law-enforcement/ (last accessed 4 Jan 2015).

References

  1. Akdeniz Y (1997) Governance of pornography and child pornography on the global internet: a multi-layered approach. In: Edwards L, Waelde C (eds) Law and the internet: regulating cyberspace. Hart Publishing, Oxford, pp 223–241


  2. European Commission (1996) Illegal and harmful content on the internet. Communication from the Commission to the Council, the European Parliament, the Economic and Social Committee and the Committee of the Regions. COM (96) 487 final, 16 October 1996


  3. Akdeniz Y (1997) ‘Policing the Internet’, Conference Report, 1997 (1) The Journal of Information, Law and Technology (JILT). http://elj.warwick.ac.uk/jilt/bookrev/97_1pol/. New citation as at 1/1/04: http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/1997_1/akdeniz2/

  4. Council of the European Union (1997) Resolution of the Council and of the Representatives of the Governments of the Member States, meeting within the Council of 17 February 1997 on illegal and harmful content on the Internet Official Journal C 070, 06/03/1997 P. 0001–0002


  5. Clinton WJ, Gore Jr, A (1997) The global information infrastructure (GII)—The Framework. http://clinton4.nara.gov/WH/New/Commerce/read.html. Last accessed 8 Dec 2014

  6. European Union Ministers (1997) European Ministerial Conference entitled “Global Information Networks: Realising the Potential”, held in Bonn from 6–8 July 1997. Declaration available on http://web.mclink.it/MC8216/netmark/attach/bonn_en.htm#Heading01. Last accessed 7 Dec 2014

  7. Department of Justice, Equality and Law Reform, Ireland (1998) Working Group on Illegal and Harmful Use of the Internet, First Report of the Working Group. http://www.justice.ie/en/JELR/Pages/Illegal_use_of_the_Internet_report. Last accessed Dec 14

  8. European Parliament (2013) COM (2013) 48: Proposal for a Directive of the European Parliament and of the Council concerning measures to ensure a high common level of network and information security across the Union. Available on http://eur-lex.europa.eu/procedure/EN/202368. Last accessed Dec 14

  9. UNESCO (1999) Experts Meeting on Sexual Abuse of Children, Child Pornography and Paedophilia on the Internet: an international challenge. UNESCO, Paris, Room II, Background Document (CII-98/CONF. 6051 (E)). http://unesdoc.unesco.org/images/0011/001147/114751Eo.pdf. Last accessed 9 Dec 2014

  10. European Parliament (1999) Decision No 276/1999/EC of the European Parliament and of the Council of 25 January 1999 adopting a multiannual Community action plan on promoting safer use of the Internet by combating illegal and harmful content on global networks. http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:31999D0276&from=EN. Last accessed 8 Dec 2014

  11. European Parliament (2003) DECISION No 1151/2003/EC of the European Parliament and of the Council of 16 June 2003 amending Decision No 276/1999/EC adopting a multiannual Community action plan on promoting safer use of the Internet by combating illegal and harmful content on global networks http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32003D1151&from=EN. Last accessed 8 Dec 2014

  12. European Commission (2003) Communication from the Commission to the Council, the European Parliament, the European Economic and Social Committee and the Committee of the Regions concerning the evaluation of the Multiannual Community Action Plan on promoting safer use of the Internet and new online technologies by combating illegal and harmful content primarily in the area of the protection of children and minors 03.11.2003 COM(2003) 653 final. Available on http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52003DC0653&from=EN. Last accessed 8 Dec 2014

  13. European Parliament (2005) DECISION No 854/2005/EC of the European Parliament and of the Council of 11 May 2005 establishing a multiannual Community Programme on promoting safer use of the Internet and new online technologies available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32005D0854&from=EN. Last accessed 8 Dec 2014

  14. European Commission (2007) Communication from the Commission to the Council, the European Parliament, the European Economic and Social Committee and the Committee of the Regions COM/2007/0496 final—Available on http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52007DC0496. Last accessed 14 Dec 2014

  15. European Parliament (2000) Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce. Official Journal L 178, 17/07/2000 P. 0001–0016. Available on http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:en:HTML. Last accessed 14 Dec 2014

  16. Council of Europe (2001) Convention on Cybercrime signed in Budapest on 23.XI.2001 (ETS No. 185) Available at http://conventions.coe.int/Treaty/EN/Treaties/Html/185.htm. Last accessed 14 Dec 2014

  17. Bertelsmann Foundation (1999) Self-regulation of Internet content, by Dr. Marcel Machill and Jens Waltermann



Author information


Correspondence to Cormac Callanan.

Appendices

Annex—Technology Options: Internet Monitoring and Blocking

Technical solutions for monitoring and blocking have evolved significantly since the beginning of the Internet. There are two main areas of interest: the technical methods of monitoring and blocking themselves, and the means of specifying which content or user is to be monitored or identified.

A comprehensive approach to monitoring requires both the technology itself (hardware, software and interconnections) and ways to specify and identify the specific content to be blocked or monitored. The nature of the content (image, text, video, voice, etc.) first needs to be determined. It must then be decided whether to log records of traffic data (the identities of the communicating parties), to record copies of communications (message content), or to disrupt the communications flow, either by preventing the successful completion of the selected communications or by changing the nature of the exchange.

Although there are complex methods of protecting traffic from eavesdropping or interception using modern encryption tools, few Internet companies had adopted encryption for traffic exchange until recently, following the alleged widespread monitoring of Internet traffic directly from fibre-optic cables and major Internet hubs. It is rare that email is encrypted in transit or that data are stored in an encrypted format the hosting provider cannot access. Even when content encryption is implemented, the design of the Internet makes it difficult to obfuscate Internet traffic records in order to bypass blocking or monitoring, except through the use of Tor (The Onion Router). Tor routes traffic over complex paths which obfuscate the unique patterns of any single traffic flow of interest. Even so, unfettered access to primary international communication cables can make it possible to decode such obfuscated patterns.

2.1.1 Child Abuse Material

Let us review the challenges of detecting child abuse material with specific focus on images rather than text, voice or video. Such material must first be created, then distributed on the Internet, downloaded by interested parties and then stored and viewed. Each of these steps creates both online and offline digital fingerprints and records.

The camera which records an illegal image also records additional metadata describing the technical photographic context at the moment of capture (light levels, indoor/outdoor, lens aperture and duration of exposure), and modern cameras can record GPS coordinates, the camera model, the camera serial number and the camera configuration at the time the image was taken. Recent advances in camera sensors, which offer very high resolutions, permit the identification of a specific camera: minor differences and variations in each camera's image-processing behaviour form an almost unique "DNA", so a camera can be identified by comparing publicly displayed photographs with non-public illegal images. The image from a modern digital camera is stored on a memory device which can also be forensically analysed for digital fingerprints. The image storage structure differs between camera brands, and direct access to the storage media can yield many useful clues for a criminal investigation.
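This camera metadata is carried in the image file itself, inside a JPEG EXIF (APP1) segment. As a minimal illustration (a sketch, not any tool used in the investigations described here), the following Python function scans a JPEG byte stream for that segment using only the standard library; the sample bytes in the usage are synthetic.

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Return True if a JPEG byte stream contains an EXIF (APP1) segment."""
    # Every JPEG starts with the SOI marker 0xFFD8. Metadata such as
    # camera model, serial number and GPS lives in an APP1 segment
    # (marker 0xFFE1) whose payload begins with b"Exif\x00\x00".
    if not jpeg_bytes.startswith(b"\xff\xd8"):
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            break  # not a marker byte; stop scanning
        marker = jpeg_bytes[i + 1]
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length  # skip marker plus segment (length includes itself)
    return False

# Synthetic usage: a hand-built two-segment stream.
exif_payload = b"Exif\x00\x00" + b"\x00" * 8
app1 = b"\xff\xe1" + (len(exif_payload) + 2).to_bytes(2, "big") + exif_payload
print(has_exif(b"\xff\xd8" + app1))
```

A forensic tool would go on to parse the TIFF structure inside the segment to recover the individual tags; this sketch only detects the segment's presence.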

Once created, the image will be distributed on the Internet. It might first be edited, changed or obfuscated using software or hardware to destroy, corrupt or delete digital fingerprints, and in extreme cases this can be successful. Even in these difficult cases, however, the computer where this work was conducted will contain forensic records of the activity: software installed and used, image files accessed, files edited, use of Internet programmes and often a detailed timeline of what occurred on the computer. Forensic analysis can detect image edits and updates (morphed images). Uploading the content onto the Internet creates a complex web of digital artefacts on the device used for the upload and a complex range of log records at the local and intermediate Internet service providers.

The image can then be accessed, viewed and downloaded by persons located across the world. There will be digital records and fingerprints at the intermediary service providers, at the end-user's Internet service provider, and on the local network and local device used to initiate the Internet activity.

The image can be stored in many different formats, supplemented by compression, encryption or hiding techniques. It can be stored on many different types of media, with direct, indirect or no network access: in cameras, TV recorders, game players, children's toys and health monitoring devices, in addition to a wide variety of computing and mobile devices. Stored data need not be static: they can be transferred across international and jurisdictional boundaries on random or regular schedules, or divided across many different international storage locations.

Of course, if it is possible to access computing devices remotely, or to have unrestricted access to network cables at a national and international level, then there are serious concerns about the level of trust which can be placed in digital evidence. With enough knowledge of computing systems and application structures, a party with such access could surreptitiously place believable, difficult-to-refute evidence on any system and ensure that the supporting traffic logs are recorded on intermediary systems.

Each such image is, in effect, the image of a crime scene where the abuse of a child took place, and several critical pieces of information must be determined from image analysis. We need to identify the location where the image was taken. We need to identify the victim or victims in the photograph, so that they can be rescued from a damaging situation and provided with treatment and support. We need to identify the perpetrators who committed the crime and, if different, the identity of the photographer. The image can be analysed for any recognisable product brands, which can identify the continent, country and region, or even the shop where the product was sold and bought. In one image, for example, the boxes of previously purchased computers were visible, almost offering a detailed view of a barcode carrying the purchaser's account details. Another image, of abuse in a children's playground, was identifiable from the type, range, colour and layout of the playground equipment. An image can contain artefacts such as telephones, power sockets, newspapers and magazines, which can narrow the search to a location for more traditional methods of investigation. The clothes worn by the victims and the shadow of the perpetrator can also offer clues to the investigation.

The techniques described in the previous paragraphs apply to static images, but evolving techniques offer similar levels of forensic analysis for other types of content such as video, voice or text material.

2.1.2 Specifying Content

To determine whether content is illegal, a trained and skilled expert should evaluate the allegedly illegal content against a specific legislative provision. In the area of child pornography (the usual legal term for child abuse material), the primary definitions are found in the Council of Europe Conventions on Cybercrime and on the Protection of Children against Sexual Exploitation and Sexual Abuse, in the European Commission Directive on certain legal aspects of information society services, in particular electronic commerce, and in Directive 2011/93/EU of the European Parliament and of the Council on combating sexual abuse and sexual exploitation of children and child pornography. These documents are supported by national law, which defines the exact nature of child pornography and child abuse. Images, text, videos, voice and illegal activities are clearly defined, and any Internet content suspected of being child abuse material needs to be evaluated against these standards. In an ideal world, content would be evaluated in a clearly specified judicial process, but given the speed of movement on the Internet, such procedures, although still used for the prosecution of individuals, were found to be inadequate for online content. Often the material is initially assessed by the Internet hotlines created across the world, which pass suspect material to law enforcement, who determine whether it is likely to be illegal.

Today InterpolFootnote 33 and US NCMECFootnote 34 (National Centre for Missing and Exploited Children) host a list of the worst images of child abuse which have been discovered and share this list of images with law enforcement agencies across the world. The national law enforcement agencies then share this list with trusted Internet service providers to enable them to detect the upload or download of such images from their networks.

When material is deemed to be illegal, attempts have been made to ensure that this material is not further re-distributed on the Internet using blocking technologies. The list of known illegal images is also used to make legal search and seizure and forensic analysis of computer networks, hosts and storage media faster by using the list to compare against currently hosted and stored files to determine if any known illegal images are among the seized or monitored material.

The creation of the list is an important role that requires oversight and accountability, since it would be possible to include any type of material on the list, and there have been accusations in the past that list maintainers have included material which is not illegal but might be considered inappropriate by conservatively minded individuals.

The list itself poses a special technical challenge. In fact, several lists exist for different Internet services.

There is a list of known keywords used by paedophiles to search for child abuse material on search engines. This list is distributed to search engine providers to prevent users from using their services to access illegal content.

There is a list of known websites which have regularly hosted illegal material. This list can be used by search engine providers, by hosting providers, and by host-based anti-malware software to blacklist and block inappropriate sites. It can contain IP addresses or domain names.
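At the provider, a list of this kind reduces to a hostname lookup. The sketch below is a hypothetical Python illustration (the domain and IP entries are invented, not drawn from any real blocklist); it matches a URL's host against blocked domains, including their subdomains, and blocked IP addresses.

```python
from urllib.parse import urlparse

# Hypothetical blocklist entries; in practice these would be supplied
# by law enforcement or a hotline, not hard-coded.
BLOCKED_DOMAINS = {"bad.example.org"}
BLOCKED_IPS = {"192.0.2.7"}

def is_blocked(url: str) -> bool:
    """Check a URL's host against the domain and IP blocklists."""
    host = urlparse(url).hostname or ""
    if host in BLOCKED_IPS:
        return True
    # A blocked domain also covers its subdomains (www.bad.example.org).
    return any(host == d or host.endswith("." + d) for d in BLOCKED_DOMAINS)

print(is_blocked("http://bad.example.org/page"))   # blocked domain
print(is_blocked("http://good.example.com/"))      # not listed
```

Real deployments typically enforce such lists in DNS resolvers or proxies rather than per-URL code, but the matching logic is the same.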

The third list is a list of known illegal images detected during investigations by law enforcement across the world. It would be neither legal nor morally appropriate to distribute the actual images of child abuse, so alternative methods of specifying content are required. Such methods need to be precise and accurate, so that they uniquely specify an image of known child abuse and cannot accidentally refer to other, legal content. The method is known as digital image fingerprinting. A fingerprint is a one-way numerical code, calculated by a computer algorithm, which uniquely identifies a specific image. The fingerprint code on the list is compared against the code calculated for a suspect image; if the codes match, we can be sure the images are identical.
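The exact-match form of fingerprinting can be sketched with an ordinary cryptographic hash. The text does not name a particular algorithm, so SHA-256 is assumed here as one common choice, and the list entry is a synthetic placeholder rather than real data.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # One-way hash: identical bytes always give the same digest, and
    # the image cannot be reconstructed from the digest, so the list
    # can be shared without sharing the images themselves.
    return hashlib.sha256(image_bytes).hexdigest()

# Synthetic stand-in for a digest list supplied by an authority.
KNOWN_ILLEGAL = {fingerprint(b"example-image-bytes")}

def matches_list(image_bytes: bytes) -> bool:
    """True only if the suspect bytes are identical to a listed image."""
    return fingerprint(image_bytes) in KNOWN_ILLEGAL

print(matches_list(b"example-image-bytes"))  # identical bytes match
```

Changing even one byte of the input yields a completely different digest, which is exactly the limitation of exact matching discussed in the text.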

One problem with seeking identical matches between images is that minor changes to the suspect image (cropping, resizing, re-colouring, etc.) mean that its fingerprint will differ from those on the list of known illegal content. New approaches have been adopted to deal with such situations. One such technique is PhotoDNA,Footnote 35 developed by Microsoft Research in collaboration with Dartmouth College. According to Microsoft,Footnote 36 “PhotoDNA enables analysts to calculate the unique digital signature of an image and then match that signature against those of other photos. The PhotoDNA ‘robust hashing’ technique differs from other common hashing technologies because it does not require the image’s characteristics to be completely identical to reliably find matches, thereby enabling matches to be identified even when photos are resized or similarly altered”. When images are not matched exactly, any matches suggested by PhotoDNA software need to be verified by a human reviewer before further criminal investigation.
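PhotoDNA itself is proprietary, but the family of techniques it belongs to, perceptual (robust) hashing, can be illustrated with a much simpler "average hash": each bit of the hash records whether a pixel of a small grayscale thumbnail is brighter than the mean, so minor edits flip only a few bits and near-matches show up as a small Hamming distance. This is a simplified stand-in for illustration, not Microsoft's algorithm.

```python
def average_hash(pixels):
    """Perceptual hash of a small grayscale thumbnail (list of 0-255 values).

    Real systems first downscale the image to a fixed size; here the
    caller supplies the already-downscaled pixels. Each bit of the
    result records whether the corresponding pixel exceeds the mean.
    """
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits; a small distance means a likely match."""
    return bin(a ^ b).count("1")

# A dark/light thumbnail and a slightly edited copy hash almost alike.
original = [10] * 32 + [200] * 32
edited = [15] + [10] * 31 + [200] * 32
print(hamming(average_hash(original), average_hash(edited)))
```

Small alterations to the image change only a few bits of the hash, so matching becomes a search for small Hamming distances rather than an exact set lookup; as with PhotoDNA, any candidate match still needs human verification.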

Microsoft, Facebook and other international service providers now use PhotoDNA with lists provided by Interpol or NCMEC to compare all images uploaded to their services against the known list of illegal content. Any content that matches is forwarded to law enforcement for investigation.

As regards video content, there are separate techniques developed by Videntifier from IcelandFootnote 37 in collaboration with Interpol and others to create methods of uniquely identifying videos.

The major limitation of all these approaches is that they can only report on known content; they cannot easily detect new material that has not previously been seen or investigated by law enforcement. However, there are initiatives to detect previously unseen content through complex image analysis, for example attempts to detect images based on features such as skin tone. Further details are available in the FIVES: Forensic Image and Video Examination SupportFootnote 38 project description and in iCop: Identifying and Catching Originators in P2P Networks.Footnote 39

Whereas all these examples have used the area of child abuse as an example, many states are concerned about a diverse range of online content and activity. Many states are also concerned about extreme adult content, many are concerned about online incitement to terrorism, and many are concerned about illegal online drugs or gambling.

There are concerns that service providers are being drawn into areas of behaviour which are not sufficiently transparent or accountable and which have a direct impact on human rights, including the rights to family life and to freedom of expression. In some countries, criticising the state is a criminal offence. The EDRi document Human Rights and Privatised Law Enforcement “looks at the extent to which ‘voluntary’ law enforcement measures by online companies are serving to undermine long-established fundamental rights principles and much of the democratic value of the Internet”.Footnote 40

2.1.3 Different Services

Not all Internet services are the same; the complex range of services has been outlined earlier. Each service has different capabilities for oversight, monitoring and detection of criminal activity. Email is different from peer-to-peer, and social network websites are very different from search engine providers. While many of these services play a role in combating illegal online activity and content, there are also many methods to evade detection and circumvent blocking initiatives.

Appendix I

List of newsgroups which the Metropolitan Police requested to be blocked, 26 August 1996.

  • alt.binaries.pictures.boys

  • alt.binaries.pictures.child.erotica.female

  • alt.binaries.pictures.child.erotica.male

  • alt.binaries.pictures.children

  • alt.binaries.pictures.erotic.children

  • alt.binaries.pictures.erotica child

  • alt.binaries.pictures.erotica.child.female

  • alt.binaries.pictures.erotica.child.male

  • alt.binaries.pictures.erotica.children

  • alt.binaries.pictures.erotica.lolita

  • alt.binaries.pictures.erotica.pre-teen

  • alt.binaries.pictures.erotica.teen.fuck

  • alt.binaries.pictures.erotica.young

  • alt.binaries.pictures.lolita.fucking

  • alt.binaries.pictures.lolita.misc

  • alt.sex.boys

  • alt.sex.children

  • alt.sex.fetish.tinygirls

  • alt.sex.girls

  • alt.sex.incest

  • alt.sex.intergen

  • alt.sex.pedophile.mike-labbe

  • alt.sex.pedophilia.

  • alt.sex.pedophilia.boys

  • alt.sex.pedophilia.girls

  • alt.sex.pedophilia.swaps

  • alt.sex.pedophilia.pictures

  • alt.sex.pre-teens

  • alt.sex.teens

  • alt.sex.weight-gain

  • alt.fan.cock-sucking

  • alt.binaries.pictures.voyeurism

  • alt.binaries.pictures.lolita.fucking

  • alt.binaries.pictures.erotica.voyeurism

  • alt.binaries.pictures.erotica.young

  • alt.binaries.pictures.erotica.uniform

  • alt.binaries.pictures.erotica.urine

  • alt.binaries.pictures.erotica.teen.fuck

  • alt.binaries.pictures.erotica.uncut

  • alt.binaries.pictures.erotica.spanking

  • alt.binaries.pictures.erotica.teen.female.masturbation

  • alt.binaries.pictures.erotica.pornstars

  • alt.binaries.pictures.erotica.pre-teen

  • alt.binaries.pictures.erotica.oral

  • alt.binaries.fetish.scat

  • alt.binaries.pictures.erotic.anime

  • alt.binaries.pictures.erotic.centerfolds

  • alt.binaries.pictures.erotic.senior-citizens

  • alt.binaries.pictures.erotica.animals

  • alt.binaries.pictures.erotica.art.pin-up

  • alt.binaries.pictures.erotica.breasts.small

  • alt.binaries.pictures.erotica.butts

  • alt.binaries.pictures.erotica.cheerleaders

  • alt.binaries.pictures.erotica.disney

  • alt.binaries.pictures.erotica.fetish.feet

  • alt.binaries.pictures.erotica.fetish.hair

  • alt.binaries.pictures.erotic.senior-citizens

  • alt.binaries.pictures.erotica.teen

  • alt.binaries.pictures.erotica.male.anal

  • alt.sex.pedophile.mike-labbe

  • alt.sex.masturbation

  • alt.sex.fetish.tickling

  • alt.sex.fetish.waifs

  • alt.sex.fetish.watersports

  • alt.sex.fetish.wrestling

  • alt.sex.first-time

  • alt.sex.fetish.girl.watchers

  • alt.sex.homosexual

  • alt.sex.incest

  • alt.sex.intergen

  • alt.sex.jp

  • alt.sex.magazines

  • alt.sex.masturbation

  • alt.sex.movies

  • alt.sex.necrophilia

  • alt.sex.pedophilia

  • alt.sex.pictures

  • alt.sex.pictures.female

  • alt.sex.pictures.male

  • alt.sex.services

  • alt.sex.spam

  • alt.sex.spanking

  • alt.sex.stories

  • alt.sex.strip-clubs

  • alt.magazines.pornographic

  • alt.magick.sex

  • alt.personals.spanking.punishment

  • alt.sex.

  • alt.sex.anal

  • alt.sex.bestiality

  • alt.sex.bondage

  • alt.sex.breast

  • alt.sex.enemas

  • alt.sex.exhibitionism

  • alt.sex.fat

  • alt.sex.fetish.diapers

  • alt.sex.fetish.fa

  • alt.sex.fetish.feet

  • alt.sex.fetish.hair

  • alt.sex.fetish.orientals

  • alt.binaries.multimedia.erotica

  • alt.binaries.pictures.boys

  • alt.binaries.pictures.children

  • alt.binaries.pictures.erotica

  • alt.binaries.pictures.erotica.amateur.d

  • alt.binaries.pictures.amateur.female

  • alt.binaries.pictures.amateur.male

  • alt.binaries.pictures.erotica.anime

  • alt.binaries.pictures.erotica.bestiality

  • alt.binaries.pictures.erotica.blondes

  • alt.binaries.pictures.erotica.bondage

  • alt.binaries.pictures.erotica.cartoons

  • alt.binaries.pictures.erotica.female

  • alt.binaries.pictures.erotica.furry

  • alt.binaries.pictures.erotica.gaymen

  • alt.binaries.pictures.erotica.male

  • alt.binaries.pictures.erotica.orientals

  • alt.binaries.pictures.erotica.pregnant

  • alt.binaries.pictures.erotica.teen

  • alt.binaries.pictures.erotica.teen.d

  • alt.binaries.pictures.girlfriend

  • alt.binaries.pictures.girlfriends

  • alt.binaries.pictures.girl

  • alt.binaries.pictures.horny.nurses

  • alt.binaries.pictures.pictures.nudism

  • alt.binaries.pictures.tasteless

  • alt.homosexual

  • alt.sex.swingers

  • alt.sex.telephone

  • alt.sex.trans

  • alt.sex.wanted

  • alt.sex.watersports

  • alt.sex.bestiality.pictures

  • alt.sex.children

  • alt.sex.cu-seeme

  • alt.sex.fetish.scat

  • alt.sex.fetish.tinygirls

  • alt.sex.fetish.wet-and-messy

  • alt.sex.oral

  • alt.sex.orgy

  • alt.sex.pedophilia.girls

  • alt.sex.pedophilia.pictures

  • alt.sex.pictures.d

  • alt.sex.stories.gay

  • alt.sex.stories.tg

  • alt.sex.super-size

  • alt.sex.tasteless

  • alt.sex.teens

  • alt.sex.video-swap

  • alt.binaries.pictures.erotica.black.male

  • alt.binaries.pictures.erotica.children

  • alt.sex.sm.fig


Copyright information

© 2015 The Author(s)

About this chapter

Cite this chapter

Callanan, C. (2015). Evolution, Implementation and Practice of Internet Self-regulation, Co-regulation and Public–Private Collaboration. In: Self- and Co-regulation in Cybercrime, Cybersecurity and National Security. SpringerBriefs in Cybersecurity. Springer, Cham. https://doi.org/10.1007/978-3-319-16447-2_2


  • DOI: https://doi.org/10.1007/978-3-319-16447-2_2


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-16446-5

  • Online ISBN: 978-3-319-16447-2

  • eBook Packages: Computer Science (R0)
