Sparse pixel sampling for appearance edit propagation
Edit propagation is an appearance-editing technique that propagates sparse, user-provided edit strokes across an image. Although it has a wide range of applications, it is computationally expensive because it requires solving large linear systems. To reduce this cost, interpolation-based approaches have been studied intensively. This study is inspired by an interpolation-based edit-propagation method that uses a clustering algorithm to determine samples, together with an interpolant that approximates edit parameters as convex combinations of those samples. However, because the clustering algorithm generates samples that lie inside the set of pixels in a feature space, an interpolant based on convex combinations cannot exactly reconstruct pixels outside the convex hull of the samples. To address this issue, this paper proposes a novel approximation model that interpolates image colors as well as edit parameters using affine combinations. In addition, this paper introduces sparse pixel sampling, which simultaneously determines the number and positions of the samples and the weight coefficients of the affine combinations. Sparse pixel sampling proceeds by iteratively updating a set of candidate pixels: unnecessary pixels are discarded via compressive sensing, and new candidates are greedily resampled according to their approximation errors. This paper demonstrates that the proposed model achieves better approximation of both image colors and edit parameters, and discusses the properties of the model through various experiments.
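The key distinction between convex and affine combinations can be illustrated with a small sketch. This is a minimal illustration under assumed details, not the paper's implementation: the function name `affine_weights` and the least-squares formulation (minimizing reconstruction error subject to the weights summing to one, solved via the KKT system) are assumptions. Because affine weights may be negative, a pixel feature outside the samples' convex hull, but inside their affine hull, can still be reconstructed exactly.

```python
import numpy as np

def affine_weights(samples, x):
    """Weights w minimizing ||samples.T @ w - x||^2 subject to sum(w) == 1.

    samples: (k, d) array, one sample feature per row.
    x:       (d,) target feature vector.
    Solved via the KKT system of the equality-constrained least squares.
    """
    k = samples.shape[0]
    A = np.zeros((k + 1, k + 1))
    A[:k, :k] = 2.0 * samples @ samples.T  # Gram matrix term of the KKT system
    A[:k, k] = 1.0                          # constraint gradient (Lagrange multiplier column)
    A[k, :k] = 1.0                          # affine constraint row: sum of weights = 1
    b = np.zeros(k + 1)
    b[:k] = 2.0 * samples @ x
    b[k] = 1.0
    sol = np.linalg.solve(A, b)
    return sol[:k]                          # drop the Lagrange multiplier

# Samples form a triangle; the target lies outside its convex hull.
S = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
x = np.array([2.0, -1.0])
w = affine_weights(S, x)
# Convex weights (all nonnegative) could not reach x, but affine weights can:
# the reconstruction S.T @ w equals x exactly, with some weights negative.
```

A convex-combination interpolant would clamp such a pixel to the hull boundary, which is precisely the reconstruction error the affine model avoids.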
Keywords: Image and video editing · Interactive editing · Edit propagation · Compressive sensing
We would like to thank all anonymous reviewers for their constructive comments and suggestions. We also thank Flickr users “Martin Abegglen (twicepix),” “Bahman Farzad (Bahman Farzad),” “John VanderWaagen (jvh33)” and “fluffster (Jeanie)” for the pictures used in the paper. This research was supported in part by a Research Fellowship for Young Scientists of the Japan Society for the Promotion of Science and by Core Research for Evolutional Science and Technology of the Japan Science and Technology Agency.