
Commit 0c28768

added preprint manenti2024learning
1 parent eedcad6 commit 0c28768

2 files changed: 41 additions & 12 deletions

_data/news.yml

Lines changed: 4 additions & 0 deletions
@@ -1,5 +1,9 @@
 - date: 2024/05
   text: 'Two papers accepted at <strong>ICML 2024</strong>! <a href="https://arxiv.org/abs/2305.19183">Graph-based Time Series Clustering for End-to-End Hierarchical Forecasting</a> (Cini et al.) and <a href="https://arxiv.org/abs/2402.10634">Graph-based Forecasting with Missing Data through Spatiotemporal Downsampling</a> (Marisca et al.).'
+- date: 2024/04
+  text: 'Paper <a href="https://arxiv.org/abs/2404.19508">Temporal Graph ODEs for Irregularly-Sampled Time Series</a> has been accepted at <strong>IJCAI 2024</strong>!'
+- date: 2024/01
+  text: 'Our paper <a href="https://openreview.net/forum?id=CAqdG2dy5s">Graph-based Virtual Sensing from Sparse and Partial Multivariate Observations</a> has been accepted at <strong>ICLR 2024</strong>!'
 - date: 2023/12
   text: 'Submit by Jan. 15 to our <strong>special sessions</strong> <a href="https://sites.google.com/view/dl4g-2024">Deep Learning for Graphs</a> at IEEE WCCI 2024 in Yokohama, Japan (Jun. 30-Jul. 5).'
 - date: 2023/09

_data/publications.yaml

Lines changed: 37 additions & 12 deletions
@@ -1,4 +1,29 @@
 ---
+- title: 'Learning Latent Graph Structures and their Uncertainty'
+  links:
+    paper: https://arxiv.org/abs/2405.19933
+  venue: preprint
+  year: 2024
+  authors:
+    - id:amanenti
+    - id:dzambon
+    - id:calippi
+  keywords:
+    - graph structure learning
+    - graph neural networks
+    - model calibration
+  abstract: Within a prediction task, Graph Neural Networks (GNNs) use relational information as an inductive bias to enhance the model's accuracy. As task-relevant relations might be unknown, graph structure learning approaches have been proposed to learn them while solving the downstream prediction task. In this paper, we demonstrate that minimization of a point-prediction loss function, e.g., the mean absolute error, does not guarantee proper learning of the latent relational information and its associated uncertainty. Conversely, we prove that a suitable loss function on the stochastic model outputs simultaneously grants (i) the unknown adjacency matrix latent distribution and (ii) optimal performance on the prediction task. Finally, we propose a sampling-based method that solves this joint learning task. Empirical results validate our theoretical claims and demonstrate the effectiveness of the proposed approach.
+  bibtex: >
+    @misc{manenti2024learning,
+      title = {Learning {{Latent Graph Structures}} and Their {{Uncertainty}}},
+      author = {Manenti, Alessandro and Zambon, Daniele and Alippi, Cesare},
+      year = {2024},
+      month = may,
+      number = {arXiv:2405.19933},
+      primaryclass = {cs, stat},
+      publisher = {arXiv},
+      archiveprefix = {arxiv}
+    }
 - title: 'Temporal Graph ODEs for Irregularly-Sampled Time Series'
   links:
     paper: https://arxiv.org/abs/2404.19508
@@ -39,7 +64,7 @@
     - spatiotemporal graphs
     - graph neural networks
     - imputation
-    - graph learning
+    - graph structure learning
   abstract: Virtual sensing techniques allow for inferring signals at new unmonitored locations by exploiting spatio-temporal measurements coming from physical sensors at different locations. However, as the sensor coverage becomes sparse due to costs or other constraints, physical proximity cannot be used to support interpolation. In this paper, we overcome this challenge by leveraging dependencies between the target variable and a set of correlated variables (covariates) that can frequently be associated with each location of interest. From this viewpoint, covariates provide partial observability, and the problem consists of inferring values for unobserved channels by exploiting observations at other locations to learn how such variables can correlate. We introduce a novel graph-based methodology to exploit such relationships and design a graph deep learning architecture, named GgNet, implementing the framework. The proposed approach relies on propagating information over a nested graph structure that is used to learn dependencies between variables as well as locations. GgNet is extensively evaluated under different virtual sensing scenarios, demonstrating higher reconstruction accuracy compared to the state-of-the-art.
   bibtex: >
     @inproceedings{defelice2024graphbased,
@@ -183,7 +208,7 @@
   keywords:
     - spatiotemporal graphs
     - state-space models
-    - graph learning
+    - graph structure learning
   abstract: The well-known Kalman filters model dynamical systems by relying on state-space representations with the next state updated, and its uncertainty controlled, by fresh information associated with newly observed system outputs. This paper generalizes, for the first time in the literature, Kalman and extended Kalman filters to discrete-time settings where inputs, states, and outputs are represented as attributed graphs whose topology and attributes can change with time. The setup allows us to adapt the framework to cases where the output is a vector or a scalar too (node/graph level tasks). Within the proposed theoretical framework, the unknown state-transition and the readout functions are learned end-to-end along with the downstream prediction task.
 - title: Taming Local Effects in Graph-based Spatiotemporal Forecasting
   links:
@@ -232,14 +257,14 @@
   year: 2023
   bibtex: >
     @article{efkarpidis2023peak,
-    title={Peak shaving in distribution networks using stationary energy storage systems: A Swiss case study},
-    author={Efkarpidis, Nikolaos A and Imoscopi, Stefano and Geidl, Martin and Cini, Andrea and Lukovic, Slobodan and Alippi, Cesare and Herbst, Ingo},
-    journal={Sustainable Energy, Grids and Networks},
-    volume={34},
-    pages={101018},
-    year={2023},
-    publisher={Elsevier}
-    }
+      title={Peak shaving in distribution networks using stationary energy storage systems: A Swiss case study},
+      author={Efkarpidis, Nikolaos A and Imoscopi, Stefano and Geidl, Martin and Cini, Andrea and Lukovic, Slobodan and Alippi, Cesare and Herbst, Ingo},
+      journal={Sustainable Energy, Grids and Networks},
+      volume={34},
+      pages={101018},
+      year={2023},
+      publisher={Elsevier}
+    }
   authors:
     - N. A. Efkarpidis
     - id:simoscopi
@@ -265,7 +290,7 @@
   keywords:
     - spatiotemporal graphs
     - state-space models
-    - graph learning
+    - graph structure learning
   abstract: State-space models constitute an effective modeling tool to describe multivariate time series and operate by maintaining an updated representation of the system state from which predictions are made. Within this framework, relational inductive biases, e.g., associated with functional dependencies existing among signals, are not explicitly exploited leaving unattended great opportunities for effective modeling approaches. The manuscript aims, for the first time, at filling this gap by matching state-space modeling and spatio-temporal data where the relational information, say the functional graph capturing latent dependencies, is learned directly from data and is allowed to change over time. Within a probabilistic formulation that accounts for the uncertainty in the data-generating process, an encoder-decoder architecture is proposed to learn the state-space model end-to-end on a downstream task. The proposed methodological framework generalizes several state-of-the-art methods and demonstrates to be effective in extracting meaningful relational information while achieving optimal forecasting performance in controlled environments.
 - title: Scalable Spatiotemporal Graph Neural Networks
   links:
@@ -418,7 +443,7 @@
     - id:calippi
   keywords:
     - spatiotemporal graphs
-    - graph learning
+    - graph structure learning
     - forecasting
   abstract: Outstanding achievements of graph neural networks for spatiotemporal time series analysis show that relational constraints introduce an effective inductive bias into neural forecasting architectures. Often, however, the relational information characterizing the underlying data-generating process is unavailable and the practitioner is left with the problem of inferring from data which relational graph to use in the subsequent processing stages. We propose novel, principled - yet practical - probabilistic score-based methods that learn the relational dependencies as distributions over graphs while maximizing end-to-end the performance at task. The proposed graph learning framework is based on consolidated variance reduction techniques for Monte Carlo score-based gradient estimation, is theoretically grounded, and, as we show, effective in practice. In this paper, we focus on the time series forecasting problem and show that, by tailoring the gradient estimators to the graph learning problem, we are able to achieve state-of-the-art performance while controlling the sparsity of the learned graph and the computational scalability. We empirically assess the effectiveness of the proposed method on synthetic and real-world benchmarks, showing that the proposed solution can be used as a stand-alone graph identification procedure as well as a graph learning component of an end-to-end forecasting architecture.
   bibtex: >
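Entries like the one added in this commit are easy to break by omitting a required key. A minimal pre-commit sanity check could look like the sketch below; the key set mirrors the new `_data/publications.yaml` entry, but the script itself is hypothetical and not part of the repository (the entry is written here as a Python dict equivalent of the YAML).

```python
# Hypothetical sanity check for _data/publications.yaml entries.
# The required key set mirrors the entry added in this commit.
REQUIRED_KEYS = {
    "title", "links", "venue", "year",
    "authors", "keywords", "abstract", "bibtex",
}

def missing_keys(entry: dict) -> set:
    """Return the required keys absent from a publication entry."""
    return REQUIRED_KEYS - entry.keys()

# The new preprint entry, as the Python equivalent of the YAML
# (abstract and bibtex abbreviated for brevity).
new_entry = {
    "title": "Learning Latent Graph Structures and their Uncertainty",
    "links": {"paper": "https://arxiv.org/abs/2405.19933"},
    "venue": "preprint",
    "year": 2024,
    "authors": ["id:amanenti", "id:dzambon", "id:calippi"],
    "keywords": ["graph structure learning",
                 "graph neural networks",
                 "model calibration"],
    "abstract": "Within a prediction task, Graph Neural Networks (GNNs) ...",
    "bibtex": "@misc{manenti2024learning, ...}",
}

print(missing_keys(new_entry))  # an empty set means the entry is complete
```

In a real setup one would parse the YAML file (e.g. with PyYAML) and run this check over every entry before committing.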
