Abstract
Evaluating search results to improve the precision of search engine ranking is a challenging problem. This paper proposes a new method for evaluating search results based on user behavior. The method combines information extraction, weight calculation, and result evaluation, and it improves the accuracy of answer annotation. Experimental results show that the method achieves a more precise ranking of search results than approaches that use click-through data alone.
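The paper itself does not include code; as a rough illustration of the core idea of weighting multiple user-behavior signals rather than relying on clicks alone, the following Python sketch combines hypothetical signals (click count, dwell time, last click in session) with placeholder weights. The signal set and weights are assumptions for illustration, not the paper's actual scheme.

```python
# Minimal sketch: score results from several behavior signals instead of
# raw click counts alone. Signals and weights are illustrative placeholders.

from dataclasses import dataclass


@dataclass
class BehaviorLog:
    url: str
    clicks: int            # times the result was clicked for this query
    dwell_seconds: float   # average time spent on the landing page
    last_click: bool       # whether it was the final click in the session


def behavior_score(log: BehaviorLog,
                   w_click: float = 0.4,
                   w_dwell: float = 0.4,
                   w_last: float = 0.2) -> float:
    """Combine user-behavior signals into a single relevance score.

    The weights here are placeholders, not values from the paper.
    """
    click_norm = min(log.clicks / 10.0, 1.0)         # cap at 10 clicks
    dwell_norm = min(log.dwell_seconds / 60.0, 1.0)  # cap at one minute
    return (w_click * click_norm
            + w_dwell * dwell_norm
            + w_last * (1.0 if log.last_click else 0.0))


# Rank results by the combined score rather than clicks alone: a result with
# few clicks but long dwell time and a satisfied last click can outrank a
# heavily clicked but quickly abandoned one.
logs = [
    BehaviorLog("a.example", clicks=9, dwell_seconds=5.0, last_click=False),
    BehaviorLog("b.example", clicks=3, dwell_seconds=80.0, last_click=True),
]
for log in sorted(logs, key=behavior_score, reverse=True):
    print(log.url, round(behavior_score(log), 3))
```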
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Yu, J., Lu, Y., Sun, S., Zhang, F. (2013). Search Results Evaluation Based on User Behavior. In: Yuan, Y., Wu, X., Lu, Y. (eds) Trustworthy Computing and Services. ISCTCS 2012. Communications in Computer and Information Science, vol 320. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35795-4_50
DOI: https://doi.org/10.1007/978-3-642-35795-4_50
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-35794-7
Online ISBN: 978-3-642-35795-4
eBook Packages: Computer Science (R0)