Models

BiTA: Bidirectional Gated Recurrent Unit-Transformer Aggregator in a Temporal Graph Network Framework for Alert Prediction in Computer Networks

BiTA, a hybrid GRU-Transformer architecture, improves network attack detection by capturing temporal dependencies in alert patterns, outperforming existing temporal graph models.

Tuesday, April 28, 2026 12:00 PM UTC · 2 MIN READ · SOURCE: arXiv CS.LG (Machine Learning) · BY sys://pipeline

The paper proposes BiTA, a bidirectional gated recurrent unit (GRU)-Transformer aggregator for temporal graph neural networks, aimed at improving alert prediction in computer networks. The method captures recursive, multi-scale temporal patterns in network attack behavior: the bidirectional GRU models sequential dependencies in both directions, while the Transformer component captures long-range contextual relations. Evaluation on real-world alert datasets shows significant improvements in key metrics over state-of-the-art temporal graph models.
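The paper's exact architecture is not spelled out in this summary, but the core idea can be sketched: run a GRU over a node's alert sequence in both directions, concatenate the hidden states, and aggregate them with a Transformer-style self-attention layer into a single embedding. Everything below (dimensions, parameter initialization, the final mean-pooling step) is an illustrative assumption, not BiTA's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
D, H = 8, 16  # alert feature dim and GRU hidden dim (illustrative sizes)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    z = sigmoid(W[0] @ x + U[0] @ h + b[0])
    r = sigmoid(W[1] @ x + U[1] @ h + b[1])
    h_tilde = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])
    return (1 - z) * h + z * h_tilde

def make_gru_params(d_in, d_h):
    # three weight sets: one each for the z, r, and candidate computations
    W = rng.normal(0, 0.1, (3, d_h, d_in))
    U = rng.normal(0, 0.1, (3, d_h, d_h))
    b = np.zeros((3, d_h))
    return W, U, b

def run_gru(xs, params, reverse=False):
    """Run the GRU over a sequence; reverse=True gives the backward pass."""
    W, U, b = params
    h = np.zeros(W.shape[1])
    seq = xs[::-1] if reverse else xs
    out = []
    for x in seq:
        h = gru_step(x, h, W, U, b)
        out.append(h)
    if reverse:
        out = out[::-1]  # realign backward states with forward time order
    return np.stack(out)

def attention_aggregate(S, Wq, Wk, Wv):
    """Transformer-style self-attention over the states, then mean-pool."""
    Q, K, V = S @ Wq, S @ Wk, S @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A = A / A.sum(axis=1, keepdims=True)  # row-wise softmax
    return (A @ V).mean(axis=0)

# toy sequence of 5 alert feature vectors on one node's timeline
xs = rng.normal(size=(5, D))
fwd = run_gru(xs, make_gru_params(D, H))
bwd = run_gru(xs, make_gru_params(D, H), reverse=True)
S = np.concatenate([fwd, bwd], axis=1)  # (5, 2H) bidirectional states
Wq, Wk, Wv = (rng.normal(0, 0.1, (2 * H, 2 * H)) for _ in range(3))
emb = attention_aggregate(S, Wq, Wk, Wv)
print(emb.shape)  # one fixed-size embedding per node, here (32,)
```

In a temporal graph network, an embedding like `emb` would feed the downstream alert-prediction head; the bidirectional pass is what lets each alert's representation see both earlier and later alerts in the window.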

Tags
models