One of the most important issues in a wireless sensor network is energy efficiency, since it determines the lifetime of the network. An effective strategy is to turn off redundant sensor nodes to save energy. In this paper, we propose and analyze an adaptive regression algorithm for dynamic environments that continuously monitors two arbitrary sensors in a sensor field and decides whether their readings can be mutually described by a non-isotonic linear relation within a user-specified error bound. This requires no offline pre-computation, dedicated phases, or base-station assistance, so the algorithm can be used in a fully distributed manner. It dynamically eliminates redundancy and estimates missing data from the learned relations in a way that keeps the sensors' energy consumption near-minimal and balanced. We compare our technique with deterministic clustering methods, provide a parameter sensitivity analysis, and discuss the simulation results.
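To make the idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of one ingredient it describes: an online check of whether one sensor's readings can be predicted from another's by a linear relation within a user-specified error bound. The class name, the bound `eps`, and the exact fitting scheme (incremental least squares over all samples seen so far) are assumptions for illustration.

```python
class PairwiseLinearMonitor:
    """Incrementally fits y ~ a*x + b from a stream of paired readings
    and reports whether every residual so far stays within a
    user-specified error bound eps (hypothetical parameter name)."""

    def __init__(self, eps):
        self.eps = eps
        self.n = 0
        # Sufficient statistics for least squares: sums of x, y, x*x, x*y.
        self.sx = self.sy = self.sxx = self.sxy = 0.0
        self.max_resid = 0.0

    def update(self, x, y):
        """Fold in one paired reading; return True while the pair is
        still mutually describable within eps (i.e., one is redundant)."""
        self.n += 1
        self.sx += x
        self.sy += y
        self.sxx += x * x
        self.sxy += x * y
        denom = self.n * self.sxx - self.sx ** 2
        if denom == 0.0:
            return True  # no spread in x yet; cannot reject linearity
        a = (self.n * self.sxy - self.sx * self.sy) / denom
        b = (self.sy - a * self.sx) / self.n
        self.max_resid = max(self.max_resid, abs(y - (a * x + b)))
        return self.max_resid <= self.eps


# Usage sketch: sensor B tracks sensor A linearly with small noise,
# so B could be turned off and its readings estimated from A.
monitor = PairwiseLinearMonitor(eps=0.5)
redundant = True
for x in range(10):
    y = 2.0 * x + 1.0 + 0.1 * (-1) ** x  # near-linear pair
    redundant = monitor.update(x, y)
print(redundant)
```

In a deployment, a node that finds a neighbor's readings redundant in this sense could let that neighbor sleep and answer queries from the learned relation, which is the energy-saving mechanism the abstract alludes to.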