Abstract:
This article proposes the high-speed channel transformer (HSCT), a transformer-network-based signal integrity (SI) simulator for high-speed channels. Attention-based transformer networks are implemented to estimate the characteristic impedance and the frequency responses, including insertion loss, near-end crosstalk, and far-end crosstalk, given the input design parameters of differential channels. Unlike previous neural networks (NNs) for SI simulation, pretrained transformer networks are scalable and can therefore estimate the frequency responses regardless of the number of frequency points within the trained bandwidth. Thanks to this scalability, training time can be reduced dramatically, because HSCTs trained on smaller-scale problems can be applied to predict larger-scale ones. This scalability stems from the shared-weight property of the transformer, the long-term dependencies captured among the embedded nodes, and training the NNs on randomly sampled frequency points. The proposed HSCTs are validated in terms of both accuracy and scalability. Compared with previous sequence-to-sequence networks, the HSCTs achieved a 1% error rate for all the SI characteristics while scaling the number of frequency points by 25x, from 40 to 961. Moreover, the training time is reduced by up to 97.8%.
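To illustrate the scalability mechanism described above (per-frequency tokens processed by shared attention weights, trained on randomly sampled frequency points), the following is a minimal sketch, not the authors' HSCT implementation: the model name, dimensions, toy training loop, and placeholder response data are all illustrative assumptions.

import torch
import torch.nn as nn

class FreqResponseTransformer(nn.Module):
    """Illustrative stand-in for an HSCT-like model (hypothetical, not the paper's code)."""
    def __init__(self, n_design_params: int, d_model: int = 64, n_layers: int = 2):
        super().__init__()
        # Each token = (channel design parameters, one frequency point) -> d_model embedding.
        self.embed = nn.Linear(n_design_params + 1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # e.g., one response value (in dB) per frequency token

    def forward(self, design: torch.Tensor, freqs: torch.Tensor) -> torch.Tensor:
        # design: (batch, n_design_params); freqs: (batch, n_freq), normalized to the trained bandwidth.
        tokens = torch.cat(
            [design.unsqueeze(1).expand(-1, freqs.size(1), -1), freqs.unsqueeze(-1)],
            dim=-1,
        )
        h = self.encoder(self.embed(tokens))   # attention weights are shared across tokens,
        return self.head(h).squeeze(-1)        # so the number of frequency points can vary.

# Toy training step on a small set of randomly sampled frequency points
# (the analytic "target" below is a placeholder for actual SI simulation data).
model = FreqResponseTransformer(n_design_params=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
design = torch.rand(8, 5)                              # batch of channel design parameters
freqs = torch.sort(torch.rand(8, 40), dim=1).values    # 40 random frequency points per sample
target = torch.sin(6.28 * freqs) * design[:, :1]       # placeholder frequency response
loss = nn.functional.mse_loss(model(design, freqs), target)
opt.zero_grad(); loss.backward(); opt.step()

# At inference, the same pretrained weights can be queried on a much denser grid,
# e.g., 961 frequency points, without retraining.
dense_freqs = torch.linspace(0, 1, 961).unsqueeze(0)
pred = model(design[:1], dense_freqs)                  # shape: (1, 961)

Because the frequency value is part of each token rather than encoded by sequence position, the same network evaluates consistently whether it is given 40 or 961 points within the trained bandwidth, which mirrors the scalability argument in the abstract.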
Published in: IEEE Transactions on Electromagnetic Compatibility (Volume: 66, Issue: 6, December 2024)