
Commit 2306a16

Commit message: added survey paper Jin et al
Parent: 046046d
1 file changed: 25 additions & 0 deletions

File changed: _data/publications.yaml (25 additions, 0 deletions)
@@ -29,6 +29,27 @@
   keywords:
     - graph neural networks
   abstract: In a broad range of real-world machine learning applications, representing examples as graphs is crucial to avoid a loss of information. Due to this in the last few years, the definition of machine learning methods, particularly neural networks, for graph-structured inputs has been gaining increasing attention. In particular, Deep Graph Networks (DGNs) are nowadays the most commonly adopted models to learn a representation that can be used to address different tasks related to nodes, edges, or even entire graphs. This tutorial paper reviews fundamental concepts and open challenges of graph representation learning and summarizes the contributions that have been accepted for publication to the ESANN 2023 special session on the topic.
+- title: "A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection"
+  links:
+    paper: https://arxiv.org/abs/2307.03759
+  venue: Preprint
+  year: 2023
+  authors:
+    - M. Jin
+    - H. Y. Koh
+    - Q. Wen
+    - id:dzambon
+    - id:calippi
+    - G. I. Webb
+    - I. King
+    - S. Pan
+  keywords:
+    - spatiotemporal graphs
+    - graph neural networks
+    - forecasting
+    - imputation
+    - anomaly detection
+  abstract: "Time series are the primary data type used to record dynamic system measurements and generated in great volume by both physical sensors and online processes (virtual sensors). Time series analytics is therefore crucial to unlocking the wealth of information implicit in available data. With the recent advancements in graph neural networks (GNNs), there has been a surge in GNN-based approaches for time series analysis. These approaches can explicitly model inter-temporal and inter-variable relationships, which traditional and other deep neural network-based methods struggle to do. In this survey, we provide a comprehensive review of graph neural networks for time series analysis (GNN4TS), encompassing four fundamental dimensions: forecasting, classification, anomaly detection, and imputation. Our aim is to guide designers and practitioners to understand, build applications, and advance research of GNN4TS. At first, we provide a comprehensive task-oriented taxonomy of GNN4TS. Then, we present and discuss representative research works and introduce mainstream applications of GNN4TS. A comprehensive discussion of potential future research directions completes the survey. This survey, for the first time, brings together a vast array of knowledge on GNN-based time series research, highlighting foundations, practical applications, and opportunities of graph neural networks for time series analysis."
 - title: Graph-based Time Series Clustering for End-to-End Hierarchical Forecasting
   links:
     paper: https://arxiv.org/abs/2305.19183
@@ -92,6 +113,7 @@
 - title: Taming Local Effects in Graph-based Spatiotemporal Forecasting
   links:
     paper: https://arxiv.org/abs/2302.04071
+    code: https://github.com/Graph-Machine-Learning-Group/taming-local-effects-stgnns
   venue: Advances in Neural Information Processing Systems
   year: 2023
   authors:
@@ -280,6 +302,7 @@
 - title: Sparse Graph Learning from Spatiotemporal Time Series
   links:
     paper: https://jmlr.org/papers/v24/22-1154.html
+    code: https://github.com/andreacini/sparse-graph-learning
   venue: Journal of Machine Learning Research
   year: 2023
   authors:
@@ -1001,6 +1024,8 @@
     code: https://github.com/danielegrattarola/ccm-aae
   venue: Applied Soft Computing
   year: 2019
+  keywords:
+    - graph neural networks
   authors:
     - id:dgrattarola
     - id:llivi
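The entries above mix two author formats: plain display names ("M. Jin") and `id:<key>` references, presumably resolved against a separate people registry by the site generator. A minimal sketch of how such an entry might be consumed, with the YAML mirrored as a Python dict and a hypothetical registry (the keys exist in the diff, but the resolved display names and the `resolve_authors` helper are assumptions, not taken from the repository):

```python
# One entry from _data/publications.yaml, mirrored as a Python dict.
entry = {
    "title": ("A Survey on Graph Neural Networks for Time Series: "
              "Forecasting, Classification, Imputation, and Anomaly Detection"),
    "links": {"paper": "https://arxiv.org/abs/2307.03759"},
    "venue": "Preprint",
    "year": 2023,
    "authors": ["M. Jin", "H. Y. Koh", "Q. Wen", "id:dzambon", "id:calippi",
                "G. I. Webb", "I. King", "S. Pan"],
    "keywords": ["spatiotemporal graphs", "graph neural networks",
                 "forecasting", "imputation", "anomaly detection"],
}

# Hypothetical people registry; display names are assumed, not from the repo.
people = {"dzambon": "D. Zambon", "calippi": "C. Alippi"}

def resolve_authors(pub, registry):
    """Replace 'id:<key>' author references with names from the registry."""
    resolved = []
    for author in pub["authors"]:
        if author.startswith("id:"):
            # Fall back to the raw reference if the key is unknown.
            resolved.append(registry.get(author[3:], author))
        else:
            resolved.append(author)
    return resolved

print(resolve_authors(entry, people))
```

The fallback on unknown keys keeps the build from failing if a registry entry is missing, which seems preferable for a static-site data file.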
