In this study, we are concerned with a simple error control protocol, the "send and wait" protocol, which uses the classical technique of positive acknowledgments and time-out periods. We first analyze the influence of the time-out period on the packet transmission rate. We then use a queuing analysis to obtain the ergodicity condition and to compute the probability distribution of the buffer queue length. Finally, we compute the buffer overflow probability when only a finite number of packets is allowed to enter the node. This analysis allows us to obtain optimum values of the time-out that maximize the throughput, or that minimize the average transit delay through the node or the buffer overflow probability.
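The send-and-wait mechanism described above can be illustrated with a minimal simulation sketch (not the paper's analytical model): the sender transmits one packet at a time and, if no positive acknowledgment arrives before the time-out, retransmits it. The function name, the loss model, and the retry cap below are our own illustrative assumptions.

```python
import random

def send_and_wait(num_packets, loss_prob, rng, max_tries=100):
    """Illustrative send-and-wait sender (assumed model, not the paper's).

    Each packet is retransmitted until its positive acknowledgment is
    received; a lost packet or lost ACK is modeled as a single failure
    event with probability `loss_prob`, after which the sender times
    out and retries. Returns the total number of transmissions used.
    """
    total_transmissions = 0
    for _ in range(num_packets):
        for _attempt in range(max_tries):
            total_transmissions += 1
            # Transmission succeeds (packet delivered and ACK returned)
            # with probability 1 - loss_prob; otherwise the time-out
            # expires and the packet is sent again.
            if rng.random() >= loss_prob:
                break
    return total_transmissions

# On a lossless channel every packet needs exactly one transmission.
print(send_and_wait(10, 0.0, random.Random(0)))  # → 10
```

A longer time-out lowers the chance of a premature retransmission but leaves the channel idle longer after a loss, which is precisely the trade-off the study's optimization of the time-out addresses.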