DEEP LEARNING–BASED INTRUSION DETECTION IN VEHICULAR NETWORKS: A REVIEW OF GATED RECURRENT UNIT APPROACHES
Abstract
The growth of vehicular networks has introduced notable security challenges that demand robust intrusion detection systems (IDSs) to protect road users and network infrastructure. This systematic review examines recent deep learning (DL) applications for anomaly-based intrusion detection in vehicular networks, with a focus on Gated Recurrent Unit (GRU) architectures. Following a structured literature search and screening protocol, peer-reviewed studies published between 2021 and 2025 were identified, evaluated, and synthesized. GRU networks enable real-time detection with efficient computation and show a strong capacity to capture temporal dependencies and sequential patterns in network traffic. The review finds that GRU-based systems achieve competitive performance with fewer parameters than comparable recurrent models while maintaining low computational cost, because their gating mechanism mitigates the vanishing gradient problem of conventional RNNs. Several studies reported accuracies exceeding 99% on benchmark datasets, including CICIDS2017, CICIDS2018, NSL-KDD, and UNSW-NB15, and hybrid GRU-CNN architectures routinely outperformed traditional detection algorithms. Evaluations using metrics such as precision, recall, F1 score, and false positive rate (FPR) confirm that GRUs can be deployed efficiently in resource-constrained vehicular contexts. GRU ensembles combined with bidirectional LSTM networks, attention mechanisms, and optimization techniques further improve detection of complex attacks such as DoS, blackhole, and zero-day exploits. The results indicate that GRUs are well suited to building accurate, lightweight IDSs for safeguarding transportation infrastructure and current vehicular networks.
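To illustrate the gating mechanism the abstract credits with mitigating vanishing gradients, the sketch below implements a single GRU step and folds a sequence of traffic feature vectors into a fixed-size state, read out as an anomaly score. This is a minimal, untrained NumPy illustration, not any reviewed system; all dimensions, parameter names, and the logistic readout are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, p):
    # Update gate z interpolates between old and candidate state;
    # this additive path is what eases vanishing gradients vs. plain RNNs.
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h_prev + p["bz"])
    # Reset gate r controls how much past state feeds the candidate.
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_prev + p["br"])
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev) + p["bh"])
    return (1.0 - z) * h_prev + z * h_tilde

def run_gru(seq, p, n_hid):
    # Fold a sequence of per-message feature vectors into a final state.
    h = np.zeros(n_hid)
    for x in seq:
        h = gru_cell(x, h, p)
    return h

def init_params(n_in, n_hid, rng):
    # Random (untrained) weights for the three gates/candidate.
    p = {}
    for g in "zrh":
        p["W" + g] = rng.normal(0.0, 0.1, (n_hid, n_in))
        p["U" + g] = rng.normal(0.0, 0.1, (n_hid, n_hid))
        p["b" + g] = np.zeros(n_hid)
    return p

rng = np.random.default_rng(0)
params = init_params(8, 16, rng)
seq = rng.normal(size=(20, 8))          # 20 timesteps of 8 traffic features
h = run_gru(seq, params, 16)            # compact summary of the sequence
score = sigmoid(rng.normal(size=16) @ h)  # untrained "attack probability"
```

A hybrid GRU-CNN IDS, as surveyed here, would typically insert convolutional feature extraction before the recurrence; the recurrent core, however, is the same gated update shown above.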
License
Copyright (c) 2026 Science World Journal

This work is licensed under a Creative Commons Attribution 4.0 International License.