In theoretical computer science, multiparty communication complexity is the study of communication complexity in the setting where there are more than 2 players.

In the traditional two-party communication game, introduced by Yao (1979),[1] two players, P1 and P2, attempt to compute a Boolean function

f(x_1, x_2) : \{0,1\}^n \times \{0,1\}^n \to \{0,1\}.

Player P1 knows the value of x2, P2 knows the value of x1, but Pi does not know the value of xi, for i = 1, 2.

In other words, the players know the other's variables, but not their own. The minimum number of bits that must be communicated by the players to compute f is the communication complexity of f, denoted by κ(f).

The multiparty communication game, defined in 1983,[2] is a powerful generalization of the 2-party case: here each player knows all the other players' inputs, except their own. Because of this property, this model is sometimes called the "numbers on the forehead" model: if the players were seated around a round table, each wearing their own input on their forehead, then every player would see all the others' inputs except their own.

The formal definition is as follows: k players, P1, ..., Pk, intend to compute a Boolean function

f(x_1, \ldots, x_n) : \{0,1\}^n \to \{0,1\}.

On the set of variables x = (x1, ..., xn) there is a fixed partition A into k classes A1, ..., Ak, and player Pi knows every variable except those in Ai, for i = 1, ..., k. The players have unlimited computational power, and they communicate with the help of a blackboard, viewed by all players.

The aim is to compute f(x1, ..., xn), such that at the end of the computation, every player knows this value. The cost of the computation is the number of bits written onto the blackboard for the given input x = (x1, ..., xn) and partition A = (A1, ..., Ak). The cost of a multiparty protocol is the maximum number of bits communicated for any x from the set {0,1}^n and the given partition A. The k-party communication complexity C_k^A(f) of a function f, with respect to partition A, is the minimum of the costs of those k-party protocols which compute f. The k-party symmetric communication complexity of f is defined as

C_k(f) = \max_A C_k^A(f),

where the maximum is taken over all k-partitions A of the set x = (x1, ..., xn).
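
As a small illustration of the number-on-the-forehead views (the input and partition below are arbitrary choices, not from the source), the following Python sketch lists which bits each player sees:

    # Number-on-the-forehead views: player P_i sees every input bit except
    # those whose indices belong to its own class A_i.
    x = [1, 0, 1, 1, 0, 1]                 # an input from {0,1}^n, here n = 6
    partition = [{0, 1}, {2, 3}, {4, 5}]   # A_1, A_2, A_3: a 3-partition of the indices

    for i, A_i in enumerate(partition, start=1):
        view = {j: x[j] for j in range(len(x)) if j not in A_i}
        print(f"P{i} is missing indices {sorted(A_i)} and sees {view}")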

Upper and lower bounds

For a general upper bound both for two and more players, let us suppose that A1 is one of the smallest classes of the partition A1, A2, ..., Ak. Then P1 can compute any Boolean function f with |A1| + 1 bits of communication: P2 writes down the |A1| bits of A1 on the blackboard, P1 reads them, and computes and announces the value f(x). So, the following can be written:

C_k(f) \leq \left\lfloor \frac{n}{k} \right\rfloor + 1.
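
A minimal Python sketch of this trivial protocol (the function, input, and partition are illustrative choices, not from the source):

    # Trivial protocol: another player, who sees the smallest class A_1, writes its
    # |A_1| bits on the blackboard; the player missing exactly those bits then
    # knows the whole input and announces f(x).  Cost: |A_1| + 1 <= floor(n/k) + 1.
    def trivial_protocol(f, x, partition):
        smallest = min(partition, key=len)             # the class A_1
        blackboard = [x[j] for j in sorted(smallest)]  # its bits, written by another player
        announced = f(x)                               # the missing player now knows all of x
        blackboard.append(announced)                   # ... and announces the value, one more bit
        return announced, len(blackboard)

    parity = lambda x: sum(x) % 2
    x = [1, 0, 1, 1, 0, 1]
    partition = [{0, 1}, {2, 3}, {4, 5}]
    print(trivial_protocol(parity, x, partition))   # (0, 3): |A_1| + 1 = 3 bits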

The Generalized Inner Product function (GIP)[3] is defined as follows: Let x1, ..., xk be n-bit vectors, and let M be the n × k matrix whose columns are the vectors x1, ..., xk. Then GIP(x1, ..., xk) is the number of the all-1 rows of the matrix M, taken modulo 2. In other words, if the vectors x1, ..., xk correspond to the characteristic vectors of k subsets of an n-element base set, then GIP corresponds to the parity of the size of the intersection of these k subsets.
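
A direct Python sketch of this definition (the example vectors are arbitrary):

    # GIP(x_1, ..., x_k): the number of all-1 rows of the n x k matrix whose
    # columns are the vectors x_1, ..., x_k, taken modulo 2; equivalently, the
    # parity of the size of the intersection of the corresponding subsets.
    def gip(vectors):
        n = len(vectors[0])
        all_one_rows = sum(1 for row in range(n) if all(v[row] == 1 for v in vectors))
        return all_one_rows % 2

    x1 = [1, 1, 0, 1]
    x2 = [1, 0, 1, 1]
    x3 = [1, 1, 1, 1]
    print(gip([x1, x2, x3]))   # rows 0 and 3 are all-1, so GIP = 2 mod 2 = 0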

It was shown[3] that

C_k(\mathrm{GIP}) \geq c \, \frac{n}{4^k}

with a constant c > 0.

An upper bound on the multiparty communication complexity of GIP shows[4] that

C_k(\mathrm{GIP}) \leq c \, \frac{kn}{2^k}

with a constant c > 0.

For a general Boolean function f, one can bound the multiparty communication complexity of f by using its L1 norm[5] (the sum of the absolute values of its Fourier coefficients) as follows:[6]

C_k(f) = O\!\left(\frac{k \, n \, L_1(f)}{2^k}\right).
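
Here L_1(f) is the spectral norm of f: the sum of the absolute values of the Fourier coefficients of the ±1 representation F(x) = (-1)^{f(x)}. A brute-force Python sketch of this quantity (exponential in n, for illustration only; the example functions are arbitrary):

    # L1 (spectral) norm: sum, over all characters chi_S(x) = (-1)^{S.x}, of the
    # absolute value of the Fourier coefficient of F(x) = (-1)^{f(x)}.
    from itertools import product

    def l1_norm(f, n):
        points = list(product([0, 1], repeat=n))
        total = 0.0
        for S in points:   # S ranges over all subsets of [n], encoded as 0/1 vectors
            coeff = sum((-1) ** (f(x) + sum(s * xi for s, xi in zip(S, x)))
                        for x in points) / 2 ** n
            total += abs(coeff)
        return total

    parity = lambda x: sum(x) % 2
    majority3 = lambda x: int(sum(x) >= 2)
    print(l1_norm(parity, 4))      # 1.0: parity is a single Fourier character
    print(l1_norm(majority3, 3))   # 2.0 for majority on 3 bits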

Multiparty communication complexity and pseudorandom generators

A construction of a pseudorandom number generator for logspace-bounded computation was based on the BNS lower bound for the GIP function.[3]

  1. Yao, Andrew Chi-Chih (1979), "Some complexity questions related to distributive computing", Proceedings of the 11th ACM Symposium on Theory of Computing (STOC '79), pp. 209–213, doi:10.1145/800135.804414, S2CID 999287.
  2. Chandra, Ashok K.; Furst, Merrick L.; Lipton, Richard J. (1983), "Multi-party protocols", Proceedings of the 15th ACM Symposium on Theory of Computing (STOC '83), pp. 94–99, doi:10.1145/800061.808737, ISBN 978-0897910996, S2CID 18180950.
  3. Babai, László; Nisan, Noam; Szegedy, Márió (1992), "Multiparty protocols, pseudorandom generators for logspace, and time-space trade-offs", Journal of Computer and System Sciences, 45 (2): 204–232, doi:10.1016/0022-0000(92)90047-M, MR 1186884.
  4. Grolmusz, Vince (1994), "The BNS lower bound for multi-party protocols is nearly optimal", Information and Computation, 112 (1): 51–54, doi:10.1006/inco.1994.1051, MR 1277711.
  5. Bruck, Jehoshua; Smolensky, Roman (1992), "Polynomial threshold functions, AC0 functions, and spectral norms" (PDF), SIAM Journal on Computing, 21 (1): 33–42, doi:10.1137/0221003, MR 1148813.
  6. Grolmusz, V. (1999), "Harmonic analysis, real approximation, and the communication complexity of Boolean functions", Algorithmica, 23 (4): 341–353, CiteSeerX 10.1.1.53.6729, doi:10.1007/PL00009265, MR 1673395, S2CID 26779824.