6. Query Complexity II
We have seen in the last lecture that there are separations between deterministic and zero-error randomized query complexity for both partial and total Boolean functions. In this lecture, we continue our investigations into the power of randomness by examining whether there are any such gaps when we compare zero-error and bounded-error randomized query complexity.
Separations between $R_0$ and $R$
The question of whether allowing bounded error probability reduces query complexity is made precise by asking: is there any asymptotic separation between $R_0(f)$ and $R(f)$, or does $R_0(f) = O(R(f))$ hold whenever $f$ is a Boolean function?
For partial functions, the situation is similar to the one we observed when comparing deterministic and zero-error randomized query complexity: we can construct partial functions for which the maximum possible separation between our two randomized query complexity measures is achieved.
Theorem 1. There exists a partial function $f : \{0,1\}^n \to \{0,1\}$ for which $R(f) = O(1)$ and $R_0(f) = \Omega(n)$.
Proof. Consider the Gap Majority function $\mathrm{GapMaj}_n : \{0,1\}^n \to \{0,1\}$, the partial function whose domain is the set of inputs $x \in \{0,1\}^n$ with Hamming weight $|x| \le n/3$ or $|x| \ge 2n/3$ and which is defined by $\mathrm{GapMaj}_n(x) = \mathrm{Maj}_n(x)$. (The Gap Majority function is identical to the standard Majority function, but has the additional promise that the inputs are always heavily biased towards the output value.) Any zero-error randomized algorithm for the Gap Majority function must have query complexity at least $n/3 = \Omega(n)$, because with fewer queries it is always possible that it was extremely unlucky and queried only the minority bits. With bounded error, however, it suffices to query a single bit of the input uniformly at random and to output its value: under the promise, this bit equals the correct output with probability at least $2/3$.
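To make the bounded-error algorithm from the proof concrete, here is a minimal sketch in Python; the promise threshold of a $2/3$ fraction of majority bits and the name `gap_majority_one_query` are choices made for this illustration.

```python
import random

def gap_majority_one_query(x):
    """One-query bounded-error algorithm for Gap Majority: output a random bit of x."""
    return x[random.randrange(len(x))]

# Empirical check on a promise input in which exactly a 2/3 fraction of the bits are 1,
# so the correct answer is 1 and the algorithm errs with probability exactly 1/3.
n = 3000
x = [1] * (2 * n // 3) + [0] * (n // 3)
trials = 100_000
errors = sum(gap_majority_one_query(x) != 1 for _ in range(trials))
print("empirical error:", errors / trials)   # should be close to 1/3
```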
Once again, the situation is quite different when we restrict our attention to total functions. In this setting, there is again essentially only a quadratic gap between zero-error and bounded-error randomized query complexity.
Theorem 2. For every total Boolean function $f : \{0,1\}^n \to \{0,1\}$, $R_0(f) = O\!\left(R(f)^2 \log R(f)\right)$.
The upper bound was obtained by Kulkarni and Tal (2016), who provided a tight analysis of an earlier argument of Midrijānis (2005). One of the key aspects of this argument is success amplification: through a connection to (variants of) block sensitivity, the core of the argument lies in showing that the existence of a randomized algorithm with polynomially small error on each input implies the existence of a zero-error randomized algorithm.
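The amplification step itself is a short calculation: repeating a bounded-error algorithm independently and taking a majority vote of the answers drives the per-input error down exponentially in the number of repetitions. The sketch below (the repetition counts are arbitrary choices for illustration, not the parameters used in the Kulkarni-Tal argument) computes the exact error of the majority vote when each individual run errs with probability $1/3$.

```python
from math import comb

def majority_vote_error(p, t):
    """Probability that the majority vote over t independent runs is wrong,
    when each individual run is wrong with probability p (ties count as errors)."""
    return sum(comb(t, k) * p**k * (1 - p)**(t - k) for k in range((t + 1) // 2, t + 1))

p = 1 / 3
for t in [1, 11, 41, 101]:
    print(f"t = {t:3d} repetitions: majority-vote error = {majority_vote_error(p, t):.2e}")
```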
Shortcut functions, revisited
Unlike in the situation with deterministic vs. zero-error randomized query complexity, there was no natural total Boolean function (like the recursive NAND function in the former setting) known to show an asymptotic separation between the two randomized measures of query complexity.
So in this case, until recently it was open whether any polynomial or even just asymptotic separation held between $R_0(f)$ and $R(f)$ for total functions.
Ambainis, Balodis, Belovs, Lee, Santha, and Smotrovs (2017), in the same paper in which they established the near-optimal separation between deterministic and zero-error randomized query complexity that we saw in the last lecture, also established a near-optimal separation between zero-error and bounded-error randomized query complexity.
Theorem 3. There is a total function $f : \{0,1\}^n \to \{0,1\}$ that satisfies $R_0(f) = \widetilde{\Omega}\!\left(R(f)^2\right)$.
The key to the proof of this theorem is again to design an appropriate “shortcut” in the function that can only be exploited efficiently in one of the two models of computation. This is done by modifying the original pointer functions of Göös, Pitassi, and Watson (2015), in this case so that the shortcuts are useful only when the randomized algorithm is allowed to err with bounded probability.
Separations between $D$ and $R$
If we combine the upper bounds on the maximum separation between the various query complexity measures, we obtain the conclusion that for every total function $f$, $D(f) = O\!\left(R(f)^4 \log^2 R(f)\right)$.
Is that bound tight? No!
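In more detail, the combination chains the quadratic relationship between deterministic and zero-error randomized query complexity for total functions from the last lecture with Theorem 2:

$$
D(f) \;=\; O\!\left(R_0(f)^2\right) \;=\; O\!\left(\left(R(f)^2 \log R(f)\right)^2\right) \;=\; O\!\left(R(f)^4 \log^2 R(f)\right),
$$

so the naive combination gives only a (roughly) quartic upper bound.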
Theorem 4. For every Boolean function $f : \{0,1\}^n \to \{0,1\}$, $D(f) = O\!\left(R(f)^3\right)$.
This bound was obtained by Nisan (1991). We can prove it by combining bounds involving certificate complexity and block sensitivity. Let us first introduce the latter measure.
The sensitivity of an input $x \in \{0,1\}^n$ with respect to a function $f : \{0,1\}^n \to \{0,1\}$ is $s(f, x) = |\{ i \in [n] : f(x^{\oplus i}) \neq f(x) \}|$, the number of coordinates $i$ for which flipping the $i$th bit of $x$ (an operation denoted by $x \mapsto x^{\oplus i}$) changes the value of $f$. The sensitivity of $f$ is $s(f) = \max_{x \in \{0,1\}^n} s(f, x)$.
(Note that the average sensitivity complexity measure obtained by replacing the maximum with an expectation over the uniform distribution on $\{0,1\}^n$, also known as the total influence of $f$, is another well-studied measure, but it is the maximum version that we will use here.)
The block sensitivity of an input $x \in \{0,1\}^n$ with respect to $f$ is $\mathrm{bs}(f, x)$, the maximum number $k$ of pairwise disjoint blocks $B_1, \ldots, B_k \subseteq [n]$ such that $f(x^{B_i}) \neq f(x)$ for every $i \in [k]$, where $x^{B_i}$ denotes the input obtained from $x$ by flipping all of the bits indexed by $B_i$. The block sensitivity of $f$ is $\mathrm{bs}(f) = \max_{x \in \{0,1\}^n} \mathrm{bs}(f, x)$.
The sensitivity and block sensitivity complexity measures satisfy the obvious bound $s(f) \le \mathrm{bs}(f)$, since the sensitive coordinates of an input form disjoint sensitive blocks of size one.
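These definitions are easy to evaluate by brute force for small functions. The following sketch (the example function, OR on four bits, and the exhaustive search over disjoint sensitive blocks are illustrative choices that only make sense for tiny $n$) computes $s(f)$ and $\mathrm{bs}(f)$ directly from the definitions.

```python
from itertools import combinations, product

n = 4
def f(x):                                   # example function: OR of 4 bits
    return 1 if any(x) else 0

def flip(x, block):
    """Flip the bits of x indexed by `block`."""
    return tuple(1 - b if i in block else b for i, b in enumerate(x))

def sensitivity(x):
    """s(f, x): number of single coordinates whose flip changes f."""
    return sum(f(flip(x, {i})) != f(x) for i in range(n))

def block_sensitivity(x):
    """bs(f, x): maximum number of pairwise disjoint sensitive blocks."""
    blocks = [set(B) for r in range(1, n + 1) for B in combinations(range(n), r)
              if f(flip(x, set(B))) != f(x)]
    best = 0
    def search(chosen, rest):
        nonlocal best
        best = max(best, len(chosen))
        for i, B in enumerate(rest):
            if all(B.isdisjoint(C) for C in chosen):
                search(chosen + [B], rest[i + 1:])
    search([], blocks)
    return best

inputs = list(product([0, 1], repeat=n))
print("s(f)  =", max(sensitivity(x) for x in inputs))
print("bs(f) =", max(block_sensitivity(x) for x in inputs))
```

For the OR function the two measures coincide; functions with sensitivity asymptotically smaller than block sensitivity do exist, but they require larger input lengths.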
As a first step, we have the following upper bound originally due to Beals, Buhrman, Cleve, Mosca, and de Wolf (2001).
Lemma 5. For every Boolean function $f : \{0,1\}^n \to \{0,1\}$, $D(f) \le C(f) \cdot \mathrm{bs}(f)$.
The proof of this lemma can be completed with an argument that is very similar to the one for the certificate complexity upper bounds on $D(f)$: the deterministic algorithm repeatedly queries all of the variables of a certificate that is consistent with the answers obtained so far, and a counting argument with disjoint sensitive blocks shows that the value of $f$ is determined after at most about $\mathrm{bs}(f)$ such rounds.
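To make this strategy concrete, here is a small brute-force sketch. The helper names (`consistent`, `certificate`, `queries_used`) and the example function are invented for this illustration, and the sketch simplifies the argument by picking an arbitrary consistent input rather than specifically a $1$-input; it simply runs the “query the variables of a consistent certificate” strategy on a toy function and reports how many queries it uses.

```python
from itertools import combinations, product

n = 4
def f(x):                          # example function: majority of 4 bits (ties -> 0)
    return 1 if sum(x) > n // 2 else 0

inputs = list(product([0, 1], repeat=n))

def consistent(known):
    """All inputs that agree with the already-queried coordinates in `known`."""
    return [x for x in inputs if all(x[i] == v for i, v in known.items())]

def certificate(x):
    """A smallest set S of coordinates such that fixing x on S forces the value of f."""
    for size in range(n + 1):
        for S in combinations(range(n), size):
            if len({f(y) for y in inputs if all(y[i] == x[i] for i in S)}) == 1:
                return S

def queries_used(z):
    """Run the strategy on the hidden input z and count the queried coordinates."""
    known = {}
    while len({f(y) for y in consistent(known)}) > 1:   # value of f not yet determined
        x = consistent(known)[0]                        # some input consistent so far
        for i in certificate(x):                        # query all bits of its certificate
            known[i] = z[i]
    return len(known)

C = max(len(certificate(x)) for x in inputs)
print("C(f) =", C, "  worst-case queries used:", max(queries_used(z) for z in inputs))
```

On this toy function the strategy never needs more than $n = 4$ queries, which is well within the $C(f) \cdot \mathrm{bs}(f)$ bound of Lemma 5.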
We can also bound the certificate complexity of a function by its block sensitivity in the following way.
Lemma 6. For every Boolean function $f : \{0,1\}^n \to \{0,1\}$, $C(f) \le \mathrm{bs}(f)^2$.
And the final ingredient of the proof is the observation that block sensitivity provides a lower bound on bounded-error randomized query complexity.
Lemma 7. For every Boolean function $f : \{0,1\}^n \to \{0,1\}$, $R(f) = \Omega(\mathrm{bs}(f))$.
The proof of Lemma 7 is obtained via Yao’s Minimax Principle: let $x \in \{0,1\}^n$ be an input with $\mathrm{bs}(f, x) = \mathrm{bs}(f)$, let $B_1, \ldots, B_{\mathrm{bs}(f)}$ be pairwise disjoint sensitive blocks for $x$, and consider the distribution that places probability $\tfrac{1}{2}$ on $x$ and probability $\tfrac{1}{2\,\mathrm{bs}(f)}$ on each of the inputs $x^{B_1}, \ldots, x^{B_{\mathrm{bs}(f)}}$. A deterministic algorithm that makes too few queries cannot distinguish $x$ from most of the inputs $x^{B_i}$, and so it must err with probability greater than $\tfrac{1}{3}$ against this distribution.
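For completeness, here is the calculation behind that last claim, under the convention (an assumption of this sketch) that a bounded-error algorithm may err with probability at most $1/3$. A deterministic algorithm that makes $q$ queries on input $x$ touches at most $q$ of the blocks, so at least $\mathrm{bs}(f) - q$ of them are left untouched, and on the inputs $x^{B_i}$ corresponding to untouched blocks it answers exactly as it does on $x$. Its error probability against the distribution above is therefore at least

$$
\min\left\{ \frac{1}{2},\ \frac{\mathrm{bs}(f) - q}{2\,\mathrm{bs}(f)} \right\},
$$

which is greater than $1/3$ whenever $q < \mathrm{bs}(f)/3$. By Yao’s Minimax Principle, $R(f) \ge \mathrm{bs}(f)/3 = \Omega(\mathrm{bs}(f))$.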
Combining Lemmas 5-7 completes the proof of Theorem 4: $D(f) \le C(f) \cdot \mathrm{bs}(f) \le \mathrm{bs}(f)^3 = O\!\left(R(f)^3\right)$. Note, however, that it is still open whether the cubic gap in that theorem can be achieved; the best separation we currently have between deterministic and bounded-error randomized query complexity is the nearly quadratic one that we have already observed when comparing deterministic vs. zero-error and zero-error vs. bounded-error randomized query complexity.
There are also a number of other complexity measures for which we still do not know the best possible separation achievable for total functions. For a good overview of these separation questions, as well as a summary of the current state of the art and pointers to the relevant literature, see Table 1 in the recent article of Aaronson, Ben-David, Kothari, Rao, and Tal (2021).