Private vs. common random bits in communication complexity

Ilan Newman
Computer Science Department, Hebrew University, Jerusalem

November 8, 1995

Abstract. We investigate the relative power of the common random string model vs. the private random string model in communication complexity. We show that the two models are essentially equal.

Keywords: communication complexity, randomness, Theory of Computation.

Communication complexity is a model of computation where two parties, each with an input, want to mutually compute a Boolean function that is defined on pairs of inputs. Formally, let $f : X \times Y \to \{0,1\}$ be a Boolean function. The communication problem for $f$ is the following two-player game. Player A gets $x \in X$ and player B gets $y \in Y$. Their goal is to compute $f(x,y)$. They have unlimited computational power and a full description of $f$, but they don't know each other's input. They determine the output value by exchanging messages. Let $n$, the length of the input, be $\log(|X|\,|Y|)$. A protocol for computing $f$ is a pair of algorithms (one for each player) according to which the players send binary messages. A protocol proceeds in rounds. In every round the protocol specifies which player's turn it is to send a message. Each player, in her turn, sends a one-bit message that may depend on her input and the previous messages she has received. A correct protocol for $f$ should terminate for every input pair $(x,y) \in X \times Y$, with both players knowing $f(x,y)$ and that the protocol has terminated.
The communication complexity of a protocol $P$ is the number of bits exchanged for the worst case input pair. The communication complexity of a Boolean function $f : X \times Y \to \{0,1\}$ is that of the best possible protocol for $f$.
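As an illustrative sketch (with hypothetical function names, not taken from the paper), the trivial deterministic protocol in which player A sends her whole input and player B answers shows that every such $f$ can be computed with at most $n+1$ bits of communication; this crude upper bound is the source of the value $z = n+1$ used later in the proof of Proposition 1.1.

```python
# Illustrative sketch (hypothetical names): the trivial deterministic protocol
# in which player A sends her entire input and player B replies with the answer.
# It shows that every Boolean function on X x Y can be computed with at most
# log|X| + 1, i.e. at most n + 1, bits of communication.

def trivial_protocol(f, x, y, x_bits):
    """A sends x as x_bits bits; B computes f(x, y) and sends back one bit."""
    transcript = []
    # Player A -> B: the input x, encoded with x_bits bits.
    for i in range(x_bits):
        transcript.append((x >> i) & 1)
    # Player B -> A: the single answer bit.
    answer = f(x, y)
    transcript.append(answer)
    return answer, len(transcript)          # cost = x_bits + 1

if __name__ == "__main__":
    # Hypothetical example: equality on 4-bit inputs.
    eq = lambda x, y: int(x == y)
    out, cost = trivial_protocol(eq, x=0b1010, y=0b1010, x_bits=4)
    print(out, cost)                         # -> 1 5
```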
The model was introduced by Yao [13], and has been studied thoroughly, as has its probabilistic counterpart defined hereafter. For a survey and exact definitions see [2], [6]. The communication complexity model has gained increased attention recently due to the generalization of Karchmer and Wigderson [8], which demonstrates its relation to formula complexity. See also [9].
We focus our attention on probabilistic protocols. In this setting, the algorithms of the players may be probabilistic; i.e., they may depend on tosses of random coins and may err. The complexity of a probabilistic protocol $P$ on input $(x,y)$, denoted by $C_P(x,y)$, is the expected number of bits exchanged. The complexity of a protocol $P$ is $C(P) = \max_{(x,y)} C_P(x,y)$. We refer here to [2], [6] and [10] for definitions and discussion on the probabilistic communication complexity model.
Two models of access to the random bits are considered in the literature: the PRIVATE model, in which each player tosses her own private coins, and the COMMON model, in which both players share a common random bit string. The PRIVATE model is clearly weaker than the COMMON model (since the players may need to communicate their random bits); however, it is more realistic.
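As a standard illustration of the gap between the two models (the EQUALITY function is not discussed in the paper itself), a common random string lets the players compare one-bit fingerprints of their inputs, while private-coin protocols for EQUALITY are known to require $\Theta(\log n)$ bits. A minimal sketch, with hypothetical names:

```python
# A standard illustration (not from the paper) of the advantage of common
# randomness: for the EQUALITY function the players can compare random inner
# products (fingerprints) of their inputs with a shared random string, using
# O(1) bits of communication per trial, whereas any private-coin protocol for
# EQUALITY needs Theta(log n) bits.

import random

def equality_common_coin(x_bits, y_bits, trials=20, rng=random):
    """x_bits, y_bits: lists of 0/1 of equal length n (the players' inputs).
    Both players see the same random strings r (the common random bits)."""
    n = len(x_bits)
    for _ in range(trials):
        r = [rng.randrange(2) for _ in range(n)]             # shared random string
        a = sum(xi * ri for xi, ri in zip(x_bits, r)) % 2     # A's 1-bit message
        b = sum(yi * ri for yi, ri in zip(y_bits, r)) % 2     # B's 1-bit message
        if a != b:
            return 0           # inputs are certainly unequal
    return 1                   # "equal", wrong with probability <= 2**(-trials)

if __name__ == "__main__":
    print(equality_common_coin([1, 0, 1, 1], [1, 0, 1, 1]))   # -> 1
    print(equality_common_coin([1, 0, 1, 1], [1, 0, 0, 1]))   # -> 0 with high prob.
```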
Our main objective is to compare the relative power of the two models.
Note that not only the cost but also the output of a probabilistic protocol $P$ on input $(x,y)$, denoted by $P(x,y)$, becomes a random variable. For $0 \le \varepsilon < 1/2$ let
$$P^{com}_\varepsilon(f) = \{P \in COMMON \mid \forall (x,y) \in X \times Y,\ \mathrm{Prob}(P(x,y) \ne f(x,y)) \le \varepsilon\}$$
$$P^{pri}_\varepsilon(f) = \{P \in PRIVATE \mid \forall (x,y) \in X \times Y,\ \mathrm{Prob}(P(x,y) \ne f(x,y)) \le \varepsilon\}$$
In words, $P^{com}_\varepsilon(f)$ and $P^{pri}_\varepsilon(f)$ are the sets of all probabilistic protocols, of the two types, that have error probability of at most $\varepsilon$. Define
$$C^{com}_\varepsilon(f) = \min\{C(P) : P \in P^{com}_\varepsilon(f)\}, \qquad C^{pri}_\varepsilon(f) = \min\{C(P) : P \in P^{pri}_\varepsilon(f)\}$$
That is, $C^{com}_\varepsilon(f)$ (resp. $C^{pri}_\varepsilon(f)$) is the best complexity that a COMMON (resp. PRIVATE) protocol can achieve if its error probability is bounded by $\varepsilon$. Since $P^{pri}_\varepsilon(f) \subseteq P^{com}_\varepsilon(f)$ we have $C^{com}_\varepsilon(f) \le C^{pri}_\varepsilon(f)$. Our main observation is that the relative power of the two models is nearly the same. This is due to the fact that the model is non-uniform. A similar phenomenon was first observed by Adleman [1] for the model of Boolean circuits.
Theorem 1.1  Let $0 < \varepsilon < 1/2$ and $0 < \delta \le 1$. Then
$$C^{pri}_{(1+\delta)\varepsilon}(f) = O\!\left(C^{com}_\varepsilon(f) + \log\frac{n}{\varepsilon\delta}\right)$$

Proof: It is enough to show that there is a protocol $P \in P^{com}_{(1+\delta)\varepsilon}(f)$ that uses only $O(\log\frac{n}{\varepsilon\delta})$ random bits. Such a $P$ can be directly made to work with private random bits: Player A tosses the necessary random bits, communicates them to B at a cost of $O(\log\frac{n}{\varepsilon\delta})$, and then they follow $P$. The existence of such a $P$ is asserted by Proposition 1.1. □
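The simulation described in this proof is mechanical; the following minimal sketch (hypothetical names, assuming the small collection of deterministic protocols guaranteed by Proposition 1.1 is given as a list) shows player A choosing the shared protocol index with her private coins and charging the $O(\log l)$ bits needed to announce it:

```python
# A minimal sketch of the simulation in the proof of Theorem 1.1, with
# hypothetical protocol objects. `deterministic_protocols` stands for the small
# collection L of Proposition 1.1: a common-coin protocol that only needs
# log(l) shared random bits is run with private coins by letting player A toss
# those bits and send them to B before the real communication starts.

import math, random

def run_with_private_coins(deterministic_protocols, x, y, rng=random):
    """deterministic_protocols: list of functions (x, y) -> (answer, cost)."""
    l = len(deterministic_protocols)
    index_bits = max(1, math.ceil(math.log2(l)))   # O(log l) = O(log(n/(eps*delta)))
    r = rng.randrange(l)                           # A's private coin tosses
    # Player A sends the index r to B, costing index_bits bits of communication.
    answer, cost = deterministic_protocols[r](x, y)
    return answer, cost + index_bits
```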
Proposition 1.1  Let $P \in P^{com}_\varepsilon(f)$, $0 < \varepsilon < 1/2$. Then for any $0 < \delta \le 1$ there is a protocol $P' \in P^{com}_{(1+\delta)\varepsilon}(f)$ such that $C(P') = O(C(P))$ and $P'$ uses only $O(\log\frac{n}{\varepsilon\delta})$ random bits.
Proof: Let $c = C(P)$. $P$ may be thought of as a probability distribution on a finite collection of deterministic protocols $\mathcal{P} = \{P_r\}_{r=1}^{t}$, each $P_i$ defined by some fixed random string. Furthermore, since there is no cost for random bits in this model, we may assume that the probability distribution over $\{P_r\}$ is uniform (by allowing identical copies of protocols in this collection). (We avoid here questions of irrational values of probability, which are handled in standard ways in many other discussions on the general theory of probabilistic algorithms.)
Claim 1.1  There is a collection $L = \{P_{i_1},\dots,P_{i_l}\}$ of $l = \max\!\left(\frac{n}{(\varepsilon\delta)^2},\ \frac{(n+1)^2}{c}\right)$ protocols with the properties:
$$\forall (x,y) \in X \times Y \qquad \frac{1}{l}\,\bigl|\{P \in L : P(x,y) \ne f(x,y)\}\bigr| \le (1+\delta)\varepsilon \qquad (1)$$
$$\forall (x,y) \in X \times Y \qquad \frac{1}{l}\sum_{P \in L} C_P(x,y) = O(c) \qquad (2)$$
Proof: Pick a collection $L$ of $l = \max\!\left(\frac{n}{(\varepsilon\delta)^2}, \frac{(n+1)^2}{c}\right)$ protocols out of the original collection $\mathcal{P}$, by picking $l$ protocols uniformly at random and independently (with repetitions). We will show that $L$ has the properties (1) and (2) with non-zero probability. For a fixed $(x,y) \in X \times Y$ let
$$A(x,y) = \{P_i \in L \mid P_i(x,y) \ne f(x,y)\}.$$
Note that by the assumption on $P$, $\mathrm{Prob}(P_r \in A(x,y)) \le \varepsilon$ for every input $(x,y)$. By a Chernoff-like inequality ([11], [4])
$$\mathrm{Prob}\bigl(|A(x,y)| \ge (1+\delta)\varepsilon l\bigr) \le \exp(-2\delta^2\varepsilon^2 l) \le \exp(-2n).$$
Thus (for $n \ge 2$),
$$\mathrm{Prob}\bigl(\exists (x,y) :\ |A(x,y)| \ge (1+\delta)\varepsilon l\bigr) \le 2^n \exp(-2n) < 0.25 \qquad (3)$$
Observe that this corresponds to the probability that $L$ does not have property (1). We need a similar statement for property (2), for which we use Hoeffding's inequality.
Lemma 1.1 (Hoeffding (1963), [4], [5])  Let $Y_1,\dots,Y_l$ be independent random variables with values in the interval $[0,z]$. Let $\mu = E\bigl(\frac{1}{l}\sum_{j=1}^{l} Y_j\bigr)$. Then
$$\mathrm{Prob}\Bigl(\frac{1}{l}\sum_{j=1}^{l} Y_j \ge d\mu\Bigr) \le \left( d^{-d\mu} \left(\frac{z-\mu}{z-d\mu}\right)^{z-d\mu} \right)^{l/z}$$
For $d \in [1, z/\mu]$ this simplifies to
$$\mathrm{Prob}\Bigl(\frac{1}{l}\sum_{j=1}^{l} Y_j \ge d\mu\Bigr) \le \left(\frac{e^{(d-1)\mu}}{d^{d\mu}}\right)^{l/z}$$
Write $L = \{P'_1,\dots,P'_l\}$. For a fixed $(x,y) \in X \times Y$ let $Y_i(x,y) = C_{P'_i}(x,y)$. Observe that $Y_1(x,y),\dots,Y_l(x,y)$ meet the assumption of Lemma 1.1 with $z = n+1$ (since $n$ is an upper bound on the complexity of any protocol for $f$). Note that since $E(C_{P'_i}(x,y)) = C_P(x,y)$ we have
$$\mu = E\Bigl(\frac{1}{l}\sum_{j=1}^{l} C_{P'_j}(x,y)\Bigr) = \frac{1}{l}\sum_{j=1}^{l} E\bigl(C_{P'_j}(x,y)\bigr) = C_P(x,y) \le c$$
We may assume $c < n/3$ (otherwise, we are immediately done). We get (for $d = 3$)
$$\forall (x,y) \in X \times Y \qquad \mathrm{Prob}\Bigl(\frac{1}{l}\sum_{j=1}^{l} C_{P'_j}(x,y) \ge 3c\Bigr) \le \left(\frac{e^2}{27}\right)^{\frac{lc}{n+1}} \le \left(\frac{e^2}{27}\right)^{n+1} \le 3^{-n-1}$$
(using $l \ge (n+1)^2/c$ and $e^2/27 < 1/3$).
Thus
$$\mathrm{Prob}\Bigl(\exists (x,y) :\ \frac{1}{l}\sum_{j=1}^{l} C_{P'_j}(x,y) \ge 3c\Bigr) \le 2^n \, 3^{-n-1} < 0.25 \qquad (4)$$
By (3) and (4), the probability that $L$ fails to have property (1) or property (2) is less than $0.5$. Therefore, there is some choice of $L$ for which both properties (1) and (2) hold. This completes the proof of the claim. □
We let $P'$ be the probabilistic protocol that picks at random, with uniform distribution, a protocol from the set $L$. It is guaranteed by Claim 1.1 that $P'$ has error bound $(1+\delta)\varepsilon$ and that it uses only $\log l = O(\log n + \log\frac{1}{\varepsilon\delta})$ random bits. This completes the proof of Proposition 1.1. □
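To make the parameters of the proof concrete, the following small sketch (hypothetical names, not from the paper) computes $l$, the number of private random bits $\log_2 l$, and the two failure probabilities bounded in (3) and (4):

```python
# A small numeric sketch (not from the paper) of the parameters in Claim 1.1:
# for given n, eps, delta and c it computes l = max(n/(eps*delta)**2, (n+1)**2/c),
# the number of private random bits log2(l), and the two union-bound failure
# probabilities (3) and (4) used in the proof of the claim.

import math

def claim_parameters(n, eps, delta, c):
    l = max(n / (eps * delta) ** 2, (n + 1) ** 2 / c)
    random_bits = math.log2(l)                      # O(log n + log(1/(eps*delta)))
    fail_1 = 2 ** n * math.exp(-2 * n)              # bound (3): property (1) fails
    fail_2 = 2 ** n * 3.0 ** (-(n + 1))             # bound (4): property (2) fails
    return l, random_bits, fail_1, fail_2

if __name__ == "__main__":
    l, bits, p1, p2 = claim_parameters(n=100, eps=1/4, delta=1/2, c=10)
    print(l, bits)       # l = 6400.0, about 12.6 random bits
    print(p1 + p2 < 1)   # True: some choice of L has both properties
```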
Corollary 1.1  Let $n^{-c} < \varepsilon < 1/2 - n^{-c}$ for some constant $c$. Then $C^{pri}_\varepsilon(f) = O(C^{com}_\varepsilon(f) + \log n)$.
Proof: Pick $\delta$ so that $(1+\delta)\varepsilon < 1/2 - n^{-c}$ and $\delta > n^{-2c}$ (clearly such a $\delta$ exists). By Theorem 1.1, for any $P \in P^{com}_\varepsilon$ there is a protocol $P_1 \in P^{pri}_{(1+\delta)\varepsilon}$ with $C(P_1) = O(C(P) + \log n)$. The error of $P_1$ can be reduced (back) to $\varepsilon$, keeping the same order of magnitude of complexity, by standard amplification methods (we repeat $P_1$ a constant number of times with independent coin flips and take the majority of the outputs). □
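The amplification step invoked here is the standard repeat-and-take-majority construction: if the base protocol errs with probability $p < 1/2$, the majority of $k$ independent runs errs with probability at most $\exp(-2k(1/2 - p)^2)$, by the same Chernoff-type bound used above. A minimal sketch, with a hypothetical protocol interface:

```python
# A minimal sketch (not from the paper) of the standard amplification step used
# in the corollary: repeat a protocol with independent coin flips and output the
# majority of the answers, trading a constant factor in communication for a
# smaller error probability.

import random
from collections import Counter

def amplify(protocol, x, y, repetitions=5, rng=random):
    """protocol: function (x, y, rng) -> 0/1, correct with probability > 1/2."""
    votes = Counter(protocol(x, y, rng) for _ in range(repetitions))
    return votes.most_common(1)[0][0]    # majority of the outputs
```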
An important class of protocols is that of protocols that always answer correctly (Las Vegas); these are $P^{com}_0(f)$ and $P^{pri}_0(f)$. We have
Theorem 1.2
$$C^{pri}_0(f) = O(C^{com}_0(f) + \log n)$$
Proof: As in the proof of Theorem 1.1, except that property (1) holds with probability 1 for every $L$. So it suffices to pick $l = (n+1)^2/c$ in order to guarantee property (2). □

Remarks:
1. An important generalization of this classical communication problem is the Karchmer-Wigderson game [8]. Namely, let $g : \{0,1\}^n \to \{0,1\}$ be a Boolean function, let $X = g^{-1}(1)$ and $Y = g^{-1}(0)$. Define the relation $R_g \subseteq X \times Y \times \{1,\dots,n\}$ by $(x,y,i) \in R_g$ iff $x_i \ne y_i$, where $x_i$ ($y_i$) is the $i$-th bit of $x$ ($y$). The communication problem for $R_g$ is as follows. One player gets input $x \in X$ and the other gets $y \in Y$. Their task is to agree on an $i$ such that $(x,y,i) \in R_g$. The importance of this game is its relation to the formula size of $g$ [8]. Probabilistic protocols are defined in a manner similar to the classical case. Our result carries over to this framework too, by the same proof. (A toy illustration of the relation $R_g$ appears after these remarks.)
2. In the proofs of Theorem 1.1 and Theorem 1.2 we only used the inherent power of the non-uniformity of the model, and the fact that the complexity is naturally bounded by $n+1$. For any non-uniform model with a similar property, our proof asserts that one can use a small amount of randomness (e.g., the decision tree model, non-uniform routing, etc.). Indeed the same idea was used in [1], [7].
3. It is well known that for every pair of constants $0 < \varepsilon, \varepsilon' < 1/2$, if $P \in P^{com}_\varepsilon(f)$ with $C(P) = c$, there exists a $P' \in P^{com}_{\varepsilon'}(f)$ whose complexity is $O(c)$ for every $(x,y)$ and all coin tosses. Using this, Theorem 1.1 can be proved just by asserting property (1) (property (2) will hold with probability 1). However, this does not work for non-constant error.
4. Our result was recently used in the work of Canetti and Goldreich, in a study on communication-randomness trade-offs [3].
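As a toy illustration of the Karchmer-Wigderson relation from Remark 1 (the concrete choice $g = $ parity is a hypothetical example, not from the paper):

```python
# A toy illustration (not from the paper) of the Karchmer-Wigderson relation R_g
# from Remark 1, for a hypothetical example function g = parity on 3 bits:
# X = g^{-1}(1), Y = g^{-1}(0), and (x, y, i) is in R_g iff x_i != y_i.
# Since g(x) != g(y) implies x != y, a valid answer i always exists.

def parity(bits):
    return sum(bits) % 2

def in_R_g(x, y, i, g=parity):
    assert g(x) == 1 and g(y) == 0          # x in X = g^-1(1), y in Y = g^-1(0)
    return x[i] != y[i]

if __name__ == "__main__":
    x, y = [1, 0, 0], [0, 0, 0]             # parity(x) = 1, parity(y) = 0
    print([i for i in range(3) if in_R_g(x, y, i)])   # -> [0]
```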
References

[1] L. Adleman, Two theorems on random polynomial time, Proc. 19th Annual IEEE Symp. on Foundations of Computer Science, 1978, 75-83.

[2] L. Babai, P. Frankl, J. Simon, Complexity classes in communication complexity theory, Proc. 27th Annual IEEE Symp. on Foundations of Computer Science, 1986, 337-347.

[3] R. Canetti, O. Goldreich, Bounds on tradeoffs between randomness and communication complexity, Proc. 31st Annual IEEE Symp. on Foundations of Computer Science, 1990, 767-775.
[4] W. Hoeffding, Probability inequalities for sums of bounded random variables, Journal of the American Statistical Association 58 (1963), 13-30.

[5] M. Hofri, Probabilistic Analysis of Algorithms, Springer-Verlag, New York, 1987, p. 104.

[6] L. Lovász, Communication complexity: a survey, in Paths, Flows, and VLSI-Layout (eds. B. Korte, L. Lovász, A. Schrijver), Springer-Verlag, 1990.

[7] D. Krizanc, D. Peleg, E. Upfal, A time-randomness tradeoff for oblivious routing, Proc. 20th Annual ACM Symp. on Theory of Computing, 1988, 93-102.

[8] M. Karchmer, A. Wigderson, Monotone circuits for connectivity require super-logarithmic depth, Proc. 20th Annual ACM Symp. on Theory of Computing, 1988, 539-550.

[9] R. Raz, A. Wigderson, Monotone circuits for matching require linear depth, Proc. 22nd Annual ACM Symp. on Theory of Computing, 1990, 287-292.

[10] R. Raz, A. Wigderson, Probabilistic communication complexity of Boolean relations, Proc. 30th Annual IEEE Symp. on Foundations of Computer Science, 1989, 562-567.

[11] J. Spencer, Probabilistic methods in combinatorics, to appear in Handbook of Combinatorics.

[12] A. C. Yao, Probabilistic computations: toward a unified measure of complexity, Proc. 18th Annual IEEE Symp. on Foundations of Computer Science, 1977, 222-227.

[13] A. C. Yao, Some complexity questions related to distributive computing, Proc. 11th Annual ACM Symp. on Theory of Computing, 1979, 209-213.
