
Letters in Mathematical Physics, Volume 105, Issue 5, pp 675–692

On the Joint Convexity of the Bregman Divergence of Matrices

  • József Pitrik
  • Dániel Virosztek

Abstract

We characterize the functions for which the corresponding Bregman divergence is jointly convex on matrices. As an application of this characterization, we derive a sharp inequality for the quantum Tsallis entropy of a tripartite state, which can be considered a generalization of the strong subadditivity of the von Neumann entropy. (In general, the strong subadditivity of the Tsallis entropy fails for quantum states, although it holds for classical states.) Furthermore, we show that joint convexity of the Bregman divergence does not imply monotonicity under stochastic maps, but every monotone Bregman divergence is jointly convex.
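For orientation, a minimal sketch of the standard definitions behind these statements follows; the abstract does not spell them out, and the normalization conventions used by the authors may differ. For a differentiable convex function f and positive definite matrices A and B, the trace form of the Bregman divergence is

\[
  H_f(A,B) \;=\; \operatorname{Tr}\bigl[f(A) - f(B)\bigr]
  \;-\; \left.\frac{\mathrm{d}}{\mathrm{d}t}\,\operatorname{Tr} f\bigl(B + t(A-B)\bigr)\right|_{t=0}.
\]

Joint convexity means that \(H_f\) is convex as a function of the pair \((A,B)\), i.e.
\(H_f(\lambda A_1 + (1-\lambda)A_2,\, \lambda B_1 + (1-\lambda)B_2) \le \lambda H_f(A_1,B_1) + (1-\lambda) H_f(A_2,B_2)\) for \(\lambda \in [0,1]\). The quantum Tsallis entropy of a state \(\rho\) with parameter \(q \neq 1\) is

\[
  S_q(\rho) \;=\; \frac{1 - \operatorname{Tr}\rho^{\,q}}{q-1},
\]

which recovers the von Neumann entropy \(S(\rho) = -\operatorname{Tr}\rho\log\rho\) as \(q \to 1\), and strong subadditivity for a tripartite state \(\rho_{ABC}\) reads

\[
  S(\rho_{ABC}) + S(\rho_{B}) \;\le\; S(\rho_{AB}) + S(\rho_{BC}).
\]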

Mathematics Subject Classification

46N50 · 46L30 · 81Q10

Keywords

joint convexity · Bregman divergence · Tsallis entropy · monotonicity



Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  1. Department of Mathematical Analysis, Budapest University of Technology and Economics, Budapest, Hungary
