Unified Insights into Graph Neural Networks: From Connectivity to Linear Architectures, and Time Series Applications
  • Author(s): Pratham Taneja; Sanskar Saxena; Divyansh Singh; Aditya Chauhan; Bhaumik Tyagi; Garv Kalia
  • Paper ID: 1705196
  • Page: 71-82
  • Published Date: 17-11-2023
  • Published In: Iconic Research And Engineering Journals
  • Publisher: IRE Journals
  • e-ISSN: 2456-8880
  • Volume/Issue: Volume 7 Issue 5 November-2023
Abstract

Time series are the predominant data modality for documenting measurements of dynamic systems, produced in abundance by physical sensors and online processes (virtual sensors). Recent advances in graph neural networks (GNNs) underscore the imperative role of time series analytics in extracting the wealth of information inherent in such data. GNNs have seen a notable surge in application to time series analysis owing to their capacity to explicitly model inter-temporal and inter-variable relationships, attributes that traditional methods and other deep neural network-based methodologies find challenging to capture. This paper comprehensively reviews graph neural networks for time series analysis (GNN4TS) across four fundamental dimensions: forecasting, classification, anomaly detection, and imputation. The overarching objective is to serve as a guiding resource for designers and practitioners, facilitating understanding, application development, and further research within GNN4TS. Linear architectures are also discussed in this work. Neural Architecture Search (NAS) has exhibited notable advances in optimizing GNNs, with the resulting NAS-GNNs outperforming manually designed GNN architectures. Nevertheless, challenges inherited from conventional NAS methods persist, including elevated computational costs and optimization complexities. Prior NAS approaches have also tended to overlook an inherent characteristic of GNNs: they possess expressive power even without training. One training-free approach that exploits this property, NAC, achieves up to a 200× acceleration in computational efficiency and a 19.9% improvement in accuracy over strong baseline methods.

Keywords

Graph neural networks, Linear architecture search, Time series applications.

Citations

IRE Journals:
Pratham Taneja, Sanskar Saxena, Divyansh Singh, Aditya Chauhan, Bhaumik Tyagi, Garv Kalia, "Unified Insights into Graph Neural Networks: From Connectivity to Linear Architectures, and Time Series Applications," Iconic Research And Engineering Journals, Volume 7, Issue 5, 2023, Page 71-82

IEEE:
P. Taneja, S. Saxena, D. Singh, A. Chauhan, B. Tyagi, and G. Kalia, "Unified Insights into Graph Neural Networks: From Connectivity to Linear Architectures, and Time Series Applications," Iconic Research And Engineering Journals, vol. 7, no. 5, pp. 71-82, 2023.