LTE has been proposed as a possible wide-area communications network for the Smart Grid. We analyze the performance of the LTE TDD mode when uplink traffic (such as AMI meter readings and sensor data) is significantly higher than downlink traffic. We derive theoretical best-case mean uplink latency figures for the relevant subset of LTE TDD uplink/downlink configurations (0, 1, and 6) and validate the findings using an OPNET simulation model. This leads to the conclusion that configuration 1 provides optimum uplink latency performance in general, but that at high uplink traffic levels only configuration 0 can reliably be used.
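To illustrate why the choice of uplink/downlink configuration matters, the following sketch computes a toy latency metric: the mean wait from a random packet arrival until the start of the next uplink subframe in a 10 ms LTE TDD radio frame. The subframe patterns (D = downlink, S = special, U = uplink) are the standard ones from 3GPP TS 36.211; the metric itself is a simplified assumption of this note and counts only uplink subframe availability, ignoring the scheduling-request/grant and processing delays that the paper's full derivation accounts for.

```python
# Toy model: mean wait (ms) from a uniformly random packet arrival until
# the start of the next uplink ('U') subframe in a 10-subframe TDD frame.
# Each subframe is 1 ms long. Patterns per 3GPP TS 36.211 (D/S/U).
# NOTE: this is a simplified illustration, not the paper's derivation,
# which also models scheduling-request/grant and processing delays.

CONFIGS = {
    0: "DSUUUDSUUU",  # 6 uplink subframes per frame
    1: "DSUUDDSUUD",  # 4 uplink subframes per frame
    6: "DSUUUDSUUD",  # 5 uplink subframes per frame
}

def mean_uplink_wait_ms(pattern: str) -> float:
    n = len(pattern)  # subframes per frame, each 1 ms
    total = 0.0
    for i in range(n):
        # cyclic distance (in subframes) to the next 'U' strictly after i
        gap = next(d for d in range(1, n + 1) if pattern[(i + d) % n] == "U")
        # arrival is uniform within subframe i, so the mean residual is 0.5 ms
        total += gap - 0.5
    return total / n

for cfg, pat in CONFIGS.items():
    print(f"config {cfg}: {mean_uplink_wait_ms(pat):.1f} ms")
```

Under this availability-only metric, configuration 0 (with the most uplink subframes) gives the shortest mean wait; the paper's result that configuration 1 is optimal in general arises from the additional scheduling and grant-timing delays omitted here.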