Wed, 01.10.2014
The running time of a distributed algorithm is commonly studied as a function of two parameters: n, the number of nodes in the network, and Δ, the maximum degree of the network. For many computational problems, the dependence of the running time on n is by now relatively well understood. However, the dependence on Δ remains poorly understood.
There are many classical graph problems for which the best algorithms have running times that are linear in Δ, but we do not know if any of these algorithms are optimal. So far, the best lower bounds have been only logarithmic in Δ.
This work gives the first linear-in-Δ lower bound for a natural graph problem in the LOCAL model, which is the standard model used in the study of distributed algorithms. Prior work has shown that a maximal fractional matching (maximal edge packing) can be found by a distributed algorithm in O(Δ) rounds. We show that this is optimal: there is no distributed algorithm that finds a maximal fractional matching in o(Δ) rounds.
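To make the problem concrete, here is a minimal centralized sketch (not from the paper) of the definition: a fractional matching assigns each edge a weight in [0, 1] so that the weights around each node sum to at most 1, and it is maximal if every edge has at least one saturated endpoint, so no weight can be increased. The graph representation and function names below are illustrative.

```python
def node_load(adj, y, v):
    """Total weight of the edges incident to node v."""
    return sum(y[frozenset((v, u))] for u in adj[v])

def is_maximal_fractional_matching(adj, y, eps=1e-9):
    """Check the definition on a centralized graph.

    adj: dict mapping each node to the set of its neighbours.
    y:   dict mapping each edge frozenset({u, v}) to a weight in [0, 1].
    """
    # Fractional matching: every edge weight lies in [0, 1] ...
    if any(not (-eps <= w <= 1 + eps) for w in y.values()):
        return False
    # ... and the weights around each node sum to at most 1.
    if any(node_load(adj, y, v) > 1 + eps for v in adj):
        return False
    # Maximality: each edge has an endpoint with load 1, so no
    # edge weight can be increased without violating a node constraint.
    for e in y:
        u, v = tuple(e)
        if node_load(adj, y, u) < 1 - eps and node_load(adj, y, v) < 1 - eps:
            return False
    return True

# Example: on the path a-b-c, putting weight 1 on edge {a, b} saturates
# node b, so edge {b, c} cannot be increased and the matching is maximal.
adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
y = {frozenset(("a", "b")): 1.0, frozenset(("b", "c")): 0.0}
assert is_maximal_fractional_matching(adj, y)
```

In the distributed (LOCAL) setting, each node must decide the weights of its incident edges after a bounded number of communication rounds; the result above says that Θ(Δ) rounds are both sufficient and necessary for this problem.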
Mika Göös, Juho Hirvonen, and Jukka Suomela. Linear-in-Δ lower bounds in the LOCAL model. Symposium on Principles of Distributed Computing (PODC), July 2014.