Network intrusion detection and prevention systems are vulnerable to evasion by attackers who craft ambiguous traffic to breach the defense of such systems. A normalizer is an inline network element that thwarts evasion attempts by removing ambiguities in network traffic. A particularly challenging step in normalization is the sound detection of inconsistent TCP retransmissions, wherein an attacker sends TCP segments with different payloads for the same sequence number space to present a network monitor with ambiguous analysis. Normalizers that buffer all unacknowledged data to verify the consistency of subsequent retransmissions consume inordinate amounts of memory on high-speed links. On the other hand, normalizers that buffer only the hashes of unacknowledged segments cannot verify the consistency of 20-30% of retransmissions that, according to our traces, do not align with the original transmissions. This paper presents the design of RoboNorm, a normalizer that buffers only the hashes of unacknowledged segments, and yet can detect all inconsistent retransmissions in any TCP byte stream. RoboNorm consumes 1-2 orders of magnitude less memory than normalizers that buffer all unacknowledged data, and is amenable to a high-speed implementation. RoboNorm is also robust to attacks that attempt to compromise its operation or exhaust its resources.
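To make the limitation of hash-based normalizers concrete, the following is a minimal, hypothetical sketch (not the RoboNorm algorithm itself) of a normalizer that buffers only per-segment hashes. An aligned retransmission can be checked against the stored hash of the original segment, but a retransmission that overlaps an original segment at a different byte alignment cannot be verified from hashes alone, which is exactly the 20-30% case the abstract describes. All class and method names here are illustrative assumptions.

```python
import hashlib

class HashOnlyNormalizer:
    """Toy sketch: buffer only hashes of unacknowledged TCP segments.

    Illustrates the baseline that RoboNorm improves upon; this is NOT
    the RoboNorm algorithm. Segments are keyed by (seq, length).
    """

    def __init__(self):
        # (starting sequence number, payload length) -> payload digest
        self.hashes = {}

    def on_segment(self, seq, payload):
        key = (seq, len(payload))
        digest = hashlib.sha256(payload).digest()
        if key in self.hashes:
            # Aligned retransmission: consistent iff the hashes match.
            return "consistent" if self.hashes[key] == digest else "inconsistent"
        for (s, n) in self.hashes:
            if s < seq + len(payload) and seq < s + n:
                # Overlaps a buffered segment at a different alignment:
                # with only a hash of the original bytes, consistency
                # cannot be verified.
                return "unverifiable"
        self.hashes[key] = digest
        return "new"

    def on_ack(self, ack):
        # Discard hashes of segments that are now fully acknowledged.
        self.hashes = {k: v for k, v in self.hashes.items()
                       if k[0] + k[1] > ack}
```

A full-content normalizer would instead buffer the payload bytes themselves and could verify the misaligned case, at the memory cost the abstract quantifies as 1-2 orders of magnitude higher.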
Security and Privacy, 2008. SP 2008. IEEE Symposium on
Date of Conference: 18-22 May 2008