We address the problem of allocating a very large number of files in a computer network. Each file may be stored at one or more nodes, depending on query/update traffic, storage and transmission costs, and reliability constraints. Certain network constraints (storage capacities, transmission capacities) must also be satisfied. We consider the case of several thousand files, and show that, by appropriate formulation of the problem, a decentralized approach may be used to solve previously intractable problems. The question of duality gaps, as well as that of solution algorithms, is addressed. For the problem faced by a "Network Manager" whose task is to keep a given network operational in the face of constant arrivals of new files and changing characteristics of old files, it is shown that there should be no duality gap and that known algorithms can be used for efficient solution. Our aim is to provide motivation and a sound framework for further research in this area.
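To make the decentralized structure concrete, the following is a minimal sketch, not the paper's actual formulation: a toy file-allocation problem in which each file is placed on a nonempty subset of nodes to minimize storage plus transmission cost, subject to node storage capacities. Dualizing the capacity constraints with multipliers (one per node) makes the relaxed problem separate file by file, so each file can be placed independently given the current multipliers, which are then adjusted by a subgradient step. All data values, names, and the two-node/three-file instance are invented for illustration.

```python
from itertools import chain, combinations

# --- toy data (illustrative only, not from the paper) ---
nodes = ["A", "B"]
files = ["f1", "f2", "f3"]
size  = {"f1": 4, "f2": 4, "f3": 4}            # file sizes
cap   = {"A": 8, "B": 8}                        # node storage capacities
store = {("f1", "A"): 1, ("f1", "B"): 3,        # per-copy storage cost
         ("f2", "A"): 2, ("f2", "B"): 1,
         ("f3", "A"): 1, ("f3", "B"): 1}
query = {("f1", "A"): 5, ("f1", "B"): 1,        # query traffic originating at each node
         ("f2", "A"): 1, ("f2", "B"): 5,
         ("f3", "A"): 3, ("f3", "B"): 3}
dist  = {("A", "A"): 0, ("A", "B"): 2,          # unit transmission cost between nodes
         ("B", "A"): 2, ("B", "B"): 0}

def nonempty_subsets(seq):
    return chain.from_iterable(combinations(seq, r) for r in range(1, len(seq) + 1))

def file_cost(f, S, lam):
    # Storage cost plus the capacity "price" lam, plus traffic served
    # from the nearest copy of the file.
    c = sum(store[f, j] + lam[j] * size[f] for j in S)
    c += sum(query[f, k] * min(dist[k, j] for j in S) for k in nodes)
    return c

def best_placement(f, lam):
    # With capacities dualized, each file's placement decouples from the rest.
    return min(nonempty_subsets(nodes), key=lambda S: file_cost(f, S, lam))

# --- subgradient ascent on the multipliers ---
lam = {j: 0.0 for j in nodes}
for _ in range(50):
    S = {f: best_placement(f, lam) for f in files}
    load = {j: sum(size[f] for f in files if j in S[f]) for j in nodes}
    step = 0.1
    for j in nodes:
        # Raise the price of overloaded nodes, lower it (toward 0) otherwise.
        lam[j] = max(0.0, lam[j] + step * (load[j] - cap[j]))

print({f: S[f] for f in files})
```

On this instance the multipliers converge and the resulting placement is capacity-feasible; in general a Lagrangian relaxation of an integer program yields only a lower bound, and the duality-gap question the abstract raises is precisely whether such a decomposed solution can be made primal-feasible without loss.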