Abstract
Determinantal point processes (DPPs) are well-known models for diverse subset selection problems, including recommendation tasks, document summarization, and image search. In this paper, we discuss a greedy deterministic adaptation of DPPs. Deterministic algorithms are appealing in many applications: they have no failure probability and always return the same result, which makes them easier for users to interpret. First, we evaluate the ability of the method to yield low-rank approximations of kernel matrices by comparing the accuracy of the resulting Nyström approximation on multiple datasets. We then demonstrate the usefulness of the model on an image search task.
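The greedy deterministic selection the abstract alludes to can be sketched as follows. This is a generic greedy determinant-maximization implemented as an incremental pivoted Cholesky decomposition, not necessarily the paper's exact algorithm (the function name and details are illustrative): each step picks the item with the largest residual diagonal, i.e. the largest gain in log det K[S, S], and the selected landmarks also yield a Nyström approximation K ≈ L Lᵀ.

```python
import numpy as np

def greedy_dpp_select(K, k):
    """Greedily pick k landmarks from a PSD kernel matrix K.

    Each step selects the item whose Schur complement (residual
    diagonal) is largest, i.e. the item with the largest gain in
    log det K[S, S]; this is an incremental pivoted Cholesky.
    Returns the selected indices and the factor L, so that
    L @ L.T is the Nystrom approximation based on those landmarks.
    """
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()   # residual diagonal
    L = np.zeros((n, k))
    selected = []
    for j in range(k):
        i = int(np.argmax(d))             # greedy deterministic pivot
        selected.append(i)
        # new Cholesky column = normalized residual of column i
        L[:, j] = (K[:, i] - L @ L[i, :]) / np.sqrt(d[i])
        d -= L[:, j] ** 2                 # update Schur complement
        d[i] = -np.inf                    # never reselect a pivot
    return selected, L
```

Because there is no randomness, rerunning the selection on the same kernel always returns the same landmarks, which is the interpretability property mentioned above.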
Notes
- We used the Matlab code available at https://www.alexkulesza.com/.
Acknowledgements
- EU: The research leading to these results has received funding from the European Research Council under the European Union’s Horizon 2020 research and innovation program / ERC Advanced Grant E-DUALITY (787960). This paper reflects only the authors’ views and the Union is not liable for any use that may be made of the contained information.
- Research Council KUL: Optimization frameworks for deep kernel machines (C14/18/068).
- Flemish Government, FWO projects: GOA4917N (Deep Restricted Kernel Machines: Methods and Foundations); PhD/Postdoc grant.
- Impulsfonds AI: VR 2019 2203 DOC.0318/1QUATER, Kenniscentrum Data en Maatschappij.
- Ford-KU Leuven Research Alliance Project KUL0076 (Stability analysis and performance improvement of deep reinforcement learning algorithms).
Appendices
A Additional Algorithms
B Additional Figures
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Schreurs, J., Fanuel, M., Suykens, J.A.K. (2020). Towards Deterministic Diverse Subset Sampling. In: Bogaerts, B., et al. Artificial Intelligence and Machine Learning. BNAIC BENELEARN 2019 2019. Communications in Computer and Information Science, vol 1196. Springer, Cham. https://doi.org/10.1007/978-3-030-65154-1_8
DOI: https://doi.org/10.1007/978-3-030-65154-1_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-65153-4
Online ISBN: 978-3-030-65154-1