Base station antenna downtilt is one of the most important parameters for optimizing a cellular network with tight frequency reuse. Downtilting reduces inter-site interference and thereby improves network performance. In this study we show that a simple geometry-based extension to standard empirical path loss prediction models can give reasonable accuracy in predicting the signal strength from tilted base station antennas in small urban macro-cells. Our evaluation is based on measurements from several sectors of a 2.6 GHz Long Term Evolution (LTE) cellular network, with electrical antenna downtilt in the range of 0 to 10 degrees, as well as on predictions from ray-tracing with 3D building databases covering the measurement area. Although the calibrated ray-tracing predictions are highly accurate compared with the measured data, the combined LOS/NLOS COST-WI model with downtilt correction performs similarly at distances beyond a few hundred meters. In general, predicting the effect of base station antenna tilt close to the base station is difficult because of the multiple vertical sidelobes of the antenna pattern.
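To make the idea of a geometry-based downtilt correction concrete, the following is a minimal sketch, not the paper's actual model: it combines the COST-231 Walfisch-Ikegami LOS street-canyon formula with a 3GPP-style parabolic vertical antenna pattern, where the depression angle from the antenna to each receiver location, offset by the electrical tilt, selects a gain from the vertical pattern. All parameters here (antenna heights, 3 dB beamwidth, sidelobe floor, EIRP) are illustrative assumptions, and a full NLOS COST-WI prediction would add the rooftop-to-street diffraction and multi-screen terms.

```python
import numpy as np

def vertical_gain_db(theta_deg, tilt_deg, theta_3db=6.5, sla_db=20.0):
    """3GPP-style parabolic vertical pattern (dB relative to boresight),
    clipped at the sidelobe attenuation floor sla_db. theta_deg is the
    depression angle from the antenna toward the receiver."""
    return -np.minimum(12.0 * ((theta_deg - tilt_deg) / theta_3db) ** 2, sla_db)

def cost_wi_los_db(d_m, f_mhz):
    """COST-231 Walfisch-Ikegami LOS (street canyon) path loss in dB,
    valid for d >= 20 m. The NLOS branch would add diffraction terms."""
    return 42.6 + 26.0 * np.log10(d_m / 1000.0) + 20.0 * np.log10(f_mhz)

def rx_power_dbm(eirp_dbm, d_m, h_bs=25.0, h_ms=1.5,
                 tilt_deg=6.0, f_mhz=2600.0):
    """Received power with a simple geometry-based tilt correction:
    EIRP (which already includes boresight gain) plus the relative
    vertical-pattern gain, minus the empirical path loss."""
    theta = np.degrees(np.arctan2(h_bs - h_ms, d_m))
    return eirp_dbm + vertical_gain_db(theta, tilt_deg) - cost_wi_los_db(d_m, f_mhz)

# Example: predicted signal strength at 100 m and 500 m, 0 vs. 6 degrees tilt
for tilt in (0.0, 6.0):
    print(tilt, rx_power_dbm(60.0, np.array([100.0, 500.0]), tilt_deg=tilt))
```

Note that this single-main-lobe pattern is exactly where such a sketch breaks down near the base station: real antennas have multiple vertical sidelobes, which is the difficulty highlighted in the abstract.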