Reliable Evaluations of URL Normalization
URL normalization is the process of transforming URL strings into a canonical form. Through this process, the number of duplicate URL representations for web pages can be significantly reduced. A number of normalization methods exist. In this paper, we describe four metrics for evaluating normalization methods; the reliability and consistency of a URL are also considered in our evaluation. Using the proposed metrics, we evaluate seven normalization methods and report evaluation results on over 25 million URLs extracted from the web.
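To make the idea concrete, the following is a minimal sketch of a few widely used normalization steps from RFC 3986 (lowercasing the scheme and host, dropping a default port, resolving "." and ".." path segments, and removing the fragment). This is an illustrative assumption on our part, not one of the seven methods evaluated in the paper; the function name and step selection are our own.

```python
from urllib.parse import urlsplit, urlunsplit
import posixpath

def normalize_url(url: str) -> str:
    """Sketch of common RFC 3986-style normalization steps."""
    parts = urlsplit(url)
    scheme = parts.scheme.lower()
    host = parts.hostname.lower() if parts.hostname else ""

    # Drop the port when it is the scheme's default (assumed table).
    default_ports = {"http": 80, "https": 443}
    port = parts.port
    if port is None or default_ports.get(scheme) == port:
        netloc = host
    else:
        netloc = f"{host}:{port}"

    # Resolve "." and ".." segments; an empty path becomes "/".
    path = posixpath.normpath(parts.path) if parts.path else "/"
    # normpath strips a trailing slash, which is often significant; restore it.
    if parts.path.endswith("/") and path != "/":
        path += "/"

    # Drop the fragment: it never reaches the server.
    return urlunsplit((scheme, netloc, path, parts.query, ""))
```

Under such a scheme, syntactically different strings such as `HTTP://Example.COM:80/a/./b/../c` and `http://example.com/a/c` collapse to the same canonical form, which is what reduces duplicates during crawling.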
Keywords: Normalization Method, True Positive Rate, Path Segment, Uniform Resource Locator, Path Component