In linear function computation, multiple source nodes communicate across a relay network to a single destination whose goal is to recover linear functions of the original source data. When the relay network is a linear deterministic network, a duality relation is established between function computation and broadcast with common messages. Using this relation, a compact sufficient condition is derived characterizing the cases in which the cut-set bound is tight. These insights are used to develop results for the case where the relay network contains Gaussian multiple-access channels. The proposed scheme decouples the physical and network layers. Using lattice codes for both source quantization and computation in the physical layer, the original Gaussian sources are converted into discrete sources and the Gaussian network into a linear deterministic network. Network codes for computing functions of discrete sources across the deterministic network are then found by applying the duality relation. The distortion for computing the sum of an arbitrary number of independent Gaussian sources over the Gaussian network is proven to be within a constant factor of the optimal performance. Furthermore, the constant-factor results are extended to include asymmetric functions for the case of two sources.
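As context for the deterministic-network setting described above, the standard linear deterministic (ADT) multiple-access channel can be sketched numerically: each transmitter's length-q bit vector is attenuated by a down-shift matrix raised to the power q - n (so only the top n bits reach the receiver), and the receiver observes the bitwise XOR of the shifted inputs. This is a minimal illustrative sketch of that well-known model, not code from the paper; the function names are hypothetical.

```python
import numpy as np

def shift_matrix(q):
    """q x q down-shift matrix S: (S @ x) moves every bit of x down one
    position, with the bottom bit falling off."""
    S = np.zeros((q, q), dtype=int)
    for i in range(1, q):
        S[i, i - 1] = 1
    return S

def adt_mac_output(inputs, gains):
    """Receiver observation in a linear deterministic MAC.

    inputs: list of length-q binary vectors (top bit at index 0).
    gains:  integer channel gains n_i; only the top n_i bits of input i
            survive.  All arithmetic is over GF(2).
    """
    q = max(gains)
    S = shift_matrix(q)
    y = np.zeros(q, dtype=int)
    for x, n in zip(inputs, gains):
        # S^(q - n) shifts x down by q - n positions before it adds in.
        y = (y + np.linalg.matrix_power(S, q - n) @ x) % 2
    return y
```

For example, with gains (3, 2), the weaker user's bits land one level below the stronger user's, and the receiver sees their XOR on the overlapping levels; this bit-level picture is what the lattice-code scheme in the paper emulates over the Gaussian channel.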