
computational complexity, Volume 17, Issue 2, pp 149–178

Halfspace Matrices

  • Alexander A. Sherstov

Abstract.

We introduce the notion of a halfspace matrix, which is a sign matrix A with rows indexed by linear threshold functions f, columns indexed by inputs \(x \in \{-1, 1\}^n\), and entries given by \(A_{f,x} = f(x)\). We use halfspace matrices to solve the following problems.
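
As a minimal illustration (not taken from the paper itself), consider n = 1: the functions \(+1\), \(-1\), \(x\), and \(-x\) are all linear threshold functions (the constants in the degenerate sense), and the resulting halfspace matrix, with columns indexed by \(x = -1\) and \(x = +1\), is
\[
A \;=\; \begin{pmatrix} +1 & +1 \\ -1 & -1 \\ -1 & +1 \\ +1 & -1 \end{pmatrix},
\]
each entry being \(A_{f,x} = f(x)\).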

In communication complexity, we exhibit a Boolean function f with discrepancy \(\Omega(1/n^4)\) under every product distribution but \(O(\sqrt{n}/2^{n/4})\) under a certain non-product distribution. This partially solves an open problem of Kushilevitz and Nisan (1997).
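
Here discrepancy refers to the standard measure: for \(f : X \times Y \to \{-1, 1\}\) and a distribution \(\mu\) on \(X \times Y\),
\[
\mathrm{disc}_{\mu}(f) \;=\; \max_{S \subseteq X,\; T \subseteq Y} \Bigl| \sum_{x \in S,\, y \in T} \mu(x, y)\, f(x, y) \Bigr|,
\]
so a function with small discrepancy under some distribution has large distributional (and hence randomized) communication complexity.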

In learning theory, we give a short and simple proof that the statistical-query (SQ) dimension of halfspaces in n dimensions is less than \(2(n + 1)^2\) under all distributions. This improves on the \(n^{O(1)}\) estimate from the fundamental paper of Blum et al. (1998). We show that estimating the SQ dimension of natural classes of Boolean functions can resolve major open problems in complexity theory, such as separating PSPACE\(^{cc}\) and PH\(^{cc}\).
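
For reference, the SQ dimension of a concept class \(\mathcal{C}\) under a distribution \(\mu\) is commonly defined as the largest \(d\) for which there exist \(f_1, \ldots, f_d \in \mathcal{C}\) with
\[
\bigl| \mathbf{E}_{x \sim \mu} [\, f_i(x) f_j(x) \,] \bigr| \;\leq\; \frac{1}{d} \qquad \text{for all } i \neq j.
\]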

Finally, we construct a matrix \(A \in \{-1, 1\}^{N \times N^{\log N}}\) with dimension complexity \(\log N\) but margin complexity \(\Omega(N^{1/4}/\sqrt{\log N})\). This gap is an exponential improvement over previous work. We prove several other relationships among the complexity measures of sign matrices, complementing work by Linial et al. (2006, 2007).
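
In this statement, the dimension complexity of a sign matrix \(A\) is the least rank of a real matrix \(M\) with \(A_{ij} M_{ij} > 0\) for all \(i, j\), and the margin complexity, in the sense of Linial et al., is
\[
\mathrm{mc}(A) \;=\; \min_{\{u_i\},\, \{v_j\}} \; \max_{i, j} \frac{1}{|\langle u_i, v_j \rangle|},
\]
the minimum taken over all systems of unit vectors with \(A_{ij} \langle u_i, v_j \rangle > 0\) for all \(i, j\).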

Keywords.

Linear threshold functions, communication complexity, complexity measures of sign matrices, complexity of learning

Subject classification.

03D15, 68Q15, 68Q17


Copyright information

© Birkhäuser 2008

Authors and Affiliations

  1. Department of Computer Sciences, The University of Texas at Austin, Austin, USA
