Abstract
In this paper, we study the problem of maximizing a nonmonotone one-sided-\(\eta \) smooth (OSS for short) function \(\psi (x)\) subject to a downward-closed convex polytope constraint. The concept of OSS was first proposed by Ghadiri et al. [1, 2] to capture properties of the multilinear extension of certain set functions; it generalizes the continuous DR-submodular function. The OSS property guarantees an alternative bound based on Taylor expansion. When the objective is nonmonotone diminishing-return (DR) submodular, Bian et al. [3] gave a 1/e-approximation algorithm with regret bound \(O(\frac{LD^{2}}{2K})\), and over general convex sets, Dürr et al. [4] gave a \(\frac{1}{3\sqrt{3}}\)-approximation with regret \(O(\frac{LD^{2}}{(\ln K)^{2}})\). In this paper, we consider maximizing the more general OSS function: by adjusting the iterative step of the Jump-Start Frank-Wolfe algorithm, a 1/e approximation can still be obtained, at the cost of a larger regret bound \(O(\frac{L(\mu D)^{2}}{2K})\), where \(L, \mu , D\) are parameters (see Table 1). The larger the parameter \(\eta \) we choose, the larger the regret, since \(\mu =\left( \frac{\beta }{\beta +1}\right) ^{-2\eta }\) with \(\beta \in (0,1]\).
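To make the algorithmic setting concrete, the following is a minimal, generic Frank-Wolfe sketch over a downward-closed feasible region: at each of \(K\) iterations it calls a linear maximization oracle on the current gradient and moves toward the maximizer with step size \(1/K\). This is a toy illustration only (with a stand-in concave quadratic objective over the box \([0,1]^n\)), not the paper's Jump-Start variant or its OSS analysis; the objective `psi`, its gradient, and the box oracle are all illustrative assumptions.

```python
import numpy as np

def psi(x):
    # Toy smooth objective, an illustrative stand-in for an OSS function:
    # a concave quadratic maximized at x = 0.7 in every coordinate.
    return -0.5 * np.sum((x - 0.7) ** 2)

def grad_psi(x):
    # Gradient of the toy objective above.
    return -(x - 0.7)

def frank_wolfe(grad, n, K=200):
    """Generic Frank-Wolfe sketch over the downward-closed box [0,1]^n.

    Each iteration solves the linear maximization oracle (LMO)
    v = argmax_{v in [0,1]^n} <grad(x), v>, then moves x toward v
    with step size 1/K, so x stays inside [0,1]^n after K steps.
    """
    x = np.zeros(n)
    for _ in range(K):
        g = grad(x)
        v = (g > 0).astype(float)   # LMO over the box: pick coordinates
                                    # with positive partial derivative
        x = x + (1.0 / K) * v       # small step toward the LMO solution
    return x

x = frank_wolfe(grad_psi, n=5)
```

On this toy instance the iterate climbs toward the unconstrained maximizer at 0.7 and stalls once each partial derivative turns nonpositive; the paper's Jump-Start variant differs in its starting point and step rule, which is what yields the 1/e guarantee for OSS objectives.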
References
Ghadiri, M., Santiago, R., Shepherd, B.: Beyond submodular maximization, pp. 1–60. arXiv preprint arXiv:1904.09216 (2019)
Ghadiri, M., Santiago, R., Shepherd, B.: Beyond submodular maximization via one-sided smoothness. In: Proceedings of SODA, pp. 1006–1025 (2021)
Bian, A., Levy, K., Krause, A., Buhmann, J.: Non-monotone continuous DR-submodular maximization: structure and algorithms. In: Proceedings of NeurIPS, pp. 487–497 (2018)
Dürr, C., Thăng, N., Srivastav, A., Tible, L.: Non-monotone DR-submodular maximization over general convex sets. In: Proceedings of IJCAI, pp. 2148–2154 (2021)
Chekuri, C., Quanrud, K.: Submodular function maximization in parallel via the multilinear relaxation. In: Proceedings of SODA, pp. 303–322 (2019)
Radlinski, F., Dumais, S.: Improving personalized web search using result diversification. In: Proceedings of ACM SIGIR, pp. 691–692 (2006)
Ghadiri, M., Schmidt, M.: Distributed maximization of submodular plus diversity functions for multilabel feature selection on huge datasets. In: Proceedings of AISTATS, pp. 2077–2086 (2019)
Zadeh, S., Ghadiri, M., Mirrokni, V., Zadimoghaddam, M.: Scalable feature selection via distributed diversity maximization. In: Proceedings of AAAI, pp. 2876–2883 (2017)
Abbassi, Z., Mirrokni, V., Thakur, M.: Diversity maximization under matroid constraints. In: Proceedings of ACM SIGKDD, pp. 32–40 (2013)
Xin, D., Cheng, H., Yan, X., Han, J.: Extracting redundancy-aware top-\(k\) patterns. In: Proceedings of ACM SIGKDD, pp. 444–453 (2006)
Carbonell, J., Goldstein, J.: The use of MMR, diversity-based reranking for reordering documents and producing summaries. In: Proceedings of ACM SIGIR, pp. 335–336 (1998)
Bian, A., Levy, K., Krause, A., Buhmann, J.: Continuous DR-submodular maximization: structure and algorithms. In: Proceedings of NeurIPS, pp. 486–496 (2017)
Bian, Y., Buhmann, J., Krause, A.: Optimal continuous DR-submodular maximization and applications to provable mean field inference. In: Proceedings of ICML, pp. 644–653 (2019)
Niazadeh, R., Roughgarden, T., Wang, J.: Optimal algorithms for continuous non-monotone submodular and DR-submodular maximization. J. Mach. Learn. Res. 21, 1–31 (2020)
Buchbinder, N., Feldman, M.: Deterministic algorithms for submodular maximization problems. ACM Trans. Algorithms 14, 1–20 (2018)
Buchbinder, N., Feldman, M., Seffi, J., Schwartz, R.: A tight linear time (1/2)-approximation for unconstrained submodular maximization. SIAM J. Comput. 44, 1384–1402 (2015)
Jaggi, M.: Revisiting Frank-Wolfe: projection-free sparse convex optimization. In: Proceedings of ICML, pp. 427–435 (2013)
Freund, R., Grigas, P.: New analysis and results for the Frank-Wolfe method. Math. Program. 155, 199–230 (2016). https://doi.org/10.1007/s10107-014-0841-6
Zhang, M., Shen, Z., Mokhtari, A., Hassani, H., Karbasi, A.: One sample stochastic Frank-Wolfe. In: Proceedings of AISTATS, pp. 4012–4023 (2020)
Feldman, M., Naor, J., Schwartz, R.: A unified continuous greedy algorithm for submodular maximization. In: Proceedings of FOCS, pp. 570–579 (2011)
Nesterov, Y.: Introductory Lectures on Convex Optimization: A Basic Course, vol. 87. Springer, New York (2003). https://doi.org/10.1007/978-1-4419-8853-9
Bian, Y., Buhmann, J., Krause, A.: Continuous submodular function maximization, pp. 1–64. arXiv preprint arXiv:2006.13474 (2020)
Acknowledgements
The first author is supported by the National Natural Science Foundation of China (No. 12131003) and the General Research Projects of the Beijing Education Committee in China under Grant No. KM201910005013.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Zhang, H., Hao, C., Guo, W., Zhang, Y. (2023). Frank Wolfe Algorithm for Nonmonotone One-Sided Smooth Function Maximization Problem. In: Dinh, T.N., Li, M. (eds) Computational Data and Social Networks. CSoNet 2022. Lecture Notes in Computer Science, vol 13831. Springer, Cham. https://doi.org/10.1007/978-3-031-26303-3_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-26302-6
Online ISBN: 978-3-031-26303-3
eBook Packages: Computer Science, Computer Science (R0)