Variational Deep Semantic Text Hashing with Pairwise Labels

  • Richeng Xuan
  • Junho Shim
  • Sang-goo Lee
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 935)

Abstract

With the rapid growth of the Web, the amount of textual data has increased explosively over the past few years, and fast similarity search over text has become an essential requirement in many applications. Semantic hashing is one of the most powerful solutions for this task and has been widely deployed for approximate large-scale similarity search: original text data are represented as compact binary codes through hashing. Recent advances in neural network architectures have shown that such models can learn better hash functions, but most of them encode only explicit features such as categorical labels. Owing to the special nature of textual data, previous semantic text hashing approaches do not utilize pairwise label information, even though pairwise labels reflect similarity more directly than categorical labels. In this paper, we propose a supervised semantic text hashing method that exploits pairwise label information. Experimental results on three public datasets show that our method exploits pairwise label information well enough to outperform previous state-of-the-art hashing approaches.
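To make the idea concrete, below is a minimal, illustrative sketch (not the authors' implementation) of a variational text hasher trained with a pairwise term, written in PyTorch. The network sizes, the bag-of-words input, the 32-bit code width, and the contrastive-style pairwise loss are all assumptions made for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalTextHasher(nn.Module):
    """Variational encoder that maps a bag-of-words vector to a compact binary code."""
    def __init__(self, vocab_size=10000, code_bits=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, 512), nn.ReLU())
        self.mu = nn.Linear(512, code_bits)       # mean of the latent code
        self.logvar = nn.Linear(512, code_bits)   # log-variance of the latent code
        self.decoder = nn.Linear(code_bits, vocab_size)

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        logits = self.decoder(z)                                  # reconstruct word distribution
        return logits, mu, logvar

    def hash(self, x):
        # Threshold the latent mean at zero to obtain the binary hash code.
        return (self.mu(self.encoder(x)) > 0).float()

def pairwise_vae_loss(model, x1, x2, sim, alpha=1.0, margin=1.0):
    """Variational (reconstruction + KL) loss for both documents plus a pairwise term.
    `sim` is a float tensor: 1 for similar pairs, 0 for dissimilar pairs (an assumed encoding)."""
    total, codes = 0.0, []
    for x in (x1, x2):
        logits, mu, logvar = model(x)
        # Multinomial reconstruction of word counts, as in neural variational document models.
        rec = -(x * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
        kld = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        total = total + rec + kld
        codes.append(torch.sigmoid(mu))           # relaxed (continuous) binary codes
    # Pairwise term: pull codes of similar documents together, push dissimilar ones apart.
    dist = (codes[0] - codes[1]).pow(2).mean(dim=1)
    pair = sim * dist + (1 - sim) * F.relu(margin - dist)
    return total + alpha * pair.mean()
```

In a sketch like this, one would typically add a quantization or annealing step so that the relaxed codes converge to binary values during training, and then retrieve neighbors by Hamming distance over the codes produced by `hash`.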

Keywords

Natural language processing · Semantic hashing · Machine learning · Similarity search

Notes

Acknowledgments

This research was supported in part by the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning through the Basic Science Research Program under Grant 2017R1E1A1A03070004. It was also supported in part by the Ministry of Science, ICT and Future Planning through the ITRC (Information Technology Research Center) support program (IITP-2017-2015-0-00378) supervised by the IITP (Institute for Information and communications Technology Promotion).

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Computer Science and Engineering, Seoul National University, Seoul, Republic of Korea
  2. Department of Computer Science, Sookmyung Women's University, Seoul, Republic of Korea
