MRNet-Product2Vec: A Multi-task Recurrent Neural Network for Product Embeddings

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10536)

Abstract

E-commerce websites such as Amazon, Alibaba, Flipkart, and Walmart sell billions of products. Machine learning (ML) algorithms involving products are often used to improve the customer experience and increase revenue, e.g., product similarity, recommendation, and price estimation. The products must be represented as features before an ML algorithm can be trained. In this paper, we propose an approach called MRNet-Product2Vec for creating generic embeddings of products within an e-commerce ecosystem. We learn a dense, low-dimensional embedding in which a diverse set of signals related to a product is explicitly injected into its representation. We train a Discriminative Multi-task Bidirectional Recurrent Neural Network (RNN), where the input is a product title fed through a Bidirectional RNN and, at the output, product labels corresponding to fifteen different tasks are predicted. The task set includes several intrinsic characteristics of a product, such as price, weight, size, color, popularity, and material. We evaluate the proposed embeddings quantitatively and qualitatively. We demonstrate that they are almost as good as the sparse, extremely high-dimensional TF-IDF representation despite having less than 3% of its dimensionality. We also use a multimodal autoencoder to compare products from different language-regions and show preliminary yet promising qualitative results.
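The architecture described above, a bidirectional RNN over the title tokens whose concatenated final states are projected to a fixed-size product embedding and then fed to several task-specific output heads, can be sketched roughly as follows. All sizes, parameter names, the plain-tanh recurrence, and the binary task heads are illustrative assumptions for this sketch; the actual model uses trained weights and gated recurrent units, and the fifteen tasks are the product attributes listed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, EMB, HID, TASKS, EMB_DIM = 1000, 32, 64, 15, 128  # illustrative sizes

# Parameters (randomly initialised here; the real model trains these jointly
# across all fifteen tasks with backpropagation through time).
E  = rng.normal(0, 0.1, (VOCAB, EMB))          # token embedding table
Wf = rng.normal(0, 0.1, (HID, EMB + HID))      # forward-direction RNN weights
Wb = rng.normal(0, 0.1, (HID, EMB + HID))      # backward-direction RNN weights
Wp = rng.normal(0, 0.1, (EMB_DIM, 2 * HID))    # projection to product embedding
heads = [rng.normal(0, 0.1, (2, EMB_DIM)) for _ in range(TASKS)]  # one head/task

def birnn_embed(token_ids):
    """Encode a tokenised product title into a fixed-size product embedding."""
    x = E[token_ids]                                   # (T, EMB)
    hf = hb = np.zeros(HID)
    for t in range(len(x)):                            # left-to-right pass
        hf = np.tanh(Wf @ np.concatenate([x[t], hf]))
    for t in reversed(range(len(x))):                  # right-to-left pass
        hb = np.tanh(Wb @ np.concatenate([x[t], hb]))
    return np.tanh(Wp @ np.concatenate([hf, hb]))      # (EMB_DIM,)

def task_logits(embedding):
    """Shared embedding feeds every task-specific output head."""
    return [W @ embedding for W in heads]

title = rng.integers(0, VOCAB, size=7)   # toy tokenised title
emb = birnn_embed(title)
logits = task_logits(emb)
print(emb.shape, len(logits))            # (128,) 15
```

Because all fifteen heads share the same 128-dimensional bottleneck, training forces the product embedding to encode every task's signal at once, which is what makes it reusable for downstream applications.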

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Core Machine Learning, Amazon, Bangalore, India