This paper presents a preliminary investigation into the impact of TCP's advertised receive buffer size and timer granularity on TCP performance over erroneous links in a LEO satellite environment. The simulations cover over 200 combinations of TCP flavor, advertised receive buffer size, timer granularity, and bit error rate (BER). Results show that TCP can only approximate the variable propagation delay for unsaturated links, and that the minimum timer granularity that prevents premature expiration of the retransmission timeout (RTO) depends on the advertised receive buffer size. Unlike high BERs, low BERs do not impair TCP's ability to track the variable propagation delay. Final results indicate that the relative performance degradation of TCP over erroneous links does not depend on its ability to estimate the variable propagation delay.