
Abstract

It is natural and convenient to use matrix notation and terminology in defining and discussing certain functions of one or more variables. Earlier (in Chapter 14), matrix notation and terminology were used in defining and discussing linear, bilinear, and quadratic forms. In some cases, the use of matrix notation and terminology is essentially unavoidable—consider, for example, a case where the determinant of an m × m matrix is regarded as a function of its m² elements.
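As a small illustrative sketch (not from the chapter; the data and variable names are hypothetical), the following NumPy snippet treats the determinant of an m × m matrix as a function of its m² entries and checks the well-known Jacobi formula, ∂ det(A)/∂a_ij = det(A) (A⁻¹)_ji, against a finite-difference approximation.

import numpy as np

# Hypothetical check: det(A) viewed as a function of the m^2 entries of A.
# Jacobi's formula gives the partial derivatives:
#   d det(A) / d a_ij = det(A) * (A^{-1})_ji.
rng = np.random.default_rng(0)
m = 4
A = rng.standard_normal((m, m))

grad = np.linalg.det(A) * np.linalg.inv(A).T   # (i, j) entry = d det(A) / d a_ij

# Verify one entry with a central finite difference.
i, j, h = 1, 2, 1e-6
E = np.zeros((m, m))
E[i, j] = 1.0
fd = (np.linalg.det(A + h * E) - np.linalg.det(A - h * E)) / (2 * h)
print(np.isclose(fd, grad[i, j]))   # expected: True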

Matrix differentiation is the derivation of the first-, second-, or higher-order partial derivatives of a function or functions that have been expressed in terms of matrices. In deriving, presenting, and discussing the partial derivatives of such functions, it is natural, convenient, and in some cases necessary to employ matrix notation and terminology. Not only may the functions be expressed in terms of matrices, but the functions themselves may be the elements of a vector or matrix, as would be the case if the elements of the inverse of an m × m (nonsingular) matrix A were regarded as functions of the elements of A.
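The following sketch (an illustration assuming an unstructured nonsingular A; it is not taken from the chapter) treats each entry of A⁻¹ as a function of the entries of A. Differentiating A A⁻¹ = I gives ∂(A⁻¹)/∂a_ij = −A⁻¹ E_ij A⁻¹, where E_ij has a one in position (i, j) and zeros elsewhere; the snippet verifies this numerically.

import numpy as np

# Hypothetical sketch: the entries of A^{-1} as functions of the entries of A.
# Differentiating A A^{-1} = I gives
#   d(A^{-1}) / d a_ij = -A^{-1} E_ij A^{-1},
# where E_ij has a one in position (i, j) and zeros elsewhere.
rng = np.random.default_rng(1)
m = 4
A = rng.standard_normal((m, m)) + m * np.eye(m)   # comfortably nonsingular
Ainv = np.linalg.inv(A)

i, j = 0, 3
E = np.zeros((m, m))
E[i, j] = 1.0
deriv = -Ainv @ E @ Ainv   # m x m matrix of partials of the entries of A^{-1}

# Compare with a central finite difference.
h = 1e-6
fd = (np.linalg.inv(A + h * E) - np.linalg.inv(A - h * E)) / (2 * h)
print(np.allclose(deriv, fd))   # expected: True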

Matrix differentiation is of considerable importance in statistics. It is especially useful in connection with the maximum likelihood estimation of the parameters in a statistical model. The maximum likelihood estimates of the model’s parameters satisfy the equations (known as the likelihood equations) obtained by equating to zero the first-order partial derivatives (with respect to the model’s parameters) of the logarithm of the so-called likelihood function—in many important cases, the likelihood function involves the determinant and/or inverse of a matrix. Further, an approximation (suitable for large samples) to the variance-covariance matrix of the maximum likelihood estimators can be obtained by inverting the matrix (known as the information matrix) whose ijth element is −1 times the second-order partial derivative (with respect to the ith and jth parameters) of the logarithm of the likelihood function.
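To make the role of the determinant, the inverse, the likelihood equations, and the information matrix concrete, here is a minimal sketch assuming a multivariate normal model N(μ, Σ) with Σ known (the model, data, and variable names are illustrative, not from the chapter). The log-likelihood involves log det Σ and Σ⁻¹, the likelihood equations are solved by the sample mean, and the inverse of the information matrix n Σ⁻¹ gives the variance-covariance matrix of the estimator.

import numpy as np

# Illustrative example: maximum likelihood for the mean mu of N(mu, Sigma)
# with Sigma known.  Up to an additive constant, the log-likelihood is
#   l(mu) = -(n/2) log det(Sigma) - (1/2) sum_i (x_i - mu)' Sigma^{-1} (x_i - mu),
# so it involves both the determinant and the inverse of a matrix.
rng = np.random.default_rng(2)
n = 500
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
mu_true = np.array([1.0, -1.0, 0.5])
X = rng.multivariate_normal(mu_true, Sigma, size=n)

Sigma_inv = np.linalg.inv(Sigma)
mu_hat = X.mean(axis=0)                       # solves the likelihood equations

# First-order partial derivatives of l (the score) vanish at the MLE.
score = Sigma_inv @ (X - mu_hat).sum(axis=0)

# -1 times the matrix of second-order partials is the information matrix,
# n * Sigma^{-1}; its inverse approximates Var(mu_hat) in large samples.
info = n * Sigma_inv
cov_approx = np.linalg.inv(info)

print(np.allclose(score, 0.0, atol=1e-8))     # likelihood equations satisfied
print(np.allclose(cov_approx, Sigma / n))     # expected: True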

Copyright information

© 1997 Springer-Verlag New York, Inc.

About this chapter

Cite this chapter

Harville, D.A. (1997). Matrix Differentiation. In: Matrix Algebra From a Statistician’s Perspective. Springer, New York, NY. https://doi.org/10.1007/0-387-22677-X_15

  • DOI: https://doi.org/10.1007/0-387-22677-X_15

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-0-387-94978-9

  • Online ISBN: 978-0-387-22677-4
