@@ -1,6 +1,6 @@
 ---
 layout: particles_header
-title: Graph-based Processing of Spatiotemporal Time Series
+title: Graph Deep Learning for Spatiotemporal Time Series
 lead: Forecasting, Reconstruction and Analysis
 description: The GMLG tutorial on graph deep learning for time-series processing.
 venue: ECML PKDD 2023
@@ -76,6 +76,20 @@
     </div>
   </div>
 </section>
+<!-- Material section-->
+<section>
+  <div class="container px-4">
+    <div class="row gx-4 justify-content-center">
+      <div class="col-lg-8 text-center">
+        <h1>Material</h1>
+        <p>Download the slides used in our tutorial.</p>
+        <a href="./gdl4sts_handout.pdf">
+          <img src="{{site.url}}/assets/img/presentation-thumb.png" height="82px" />
+        </a>
+      </div>
+    </div>
+  </div>
+</section>
 <!-- Program section-->
 <section class="bg-light">
   <div class="container px-4">
@@ -85,34 +99,31 @@ <h1 class="text-center">Program</h1>
         <h5 class="mt-3 mb-2 fw-light"><span class="fw-bold me-2">Part 1</span> Graph-based processing of
           spatiotemporal time series.</h5>
         <ol>
-          <li><strong>Spatiotemporal time series with graph-side information</strong><br>
+          <li><strong>Spatiotemporal time series</strong><br>
             Definition of the problem settings. Introduction to common downstream tasks: forecasting and
             imputation.
           </li>
           <li><strong>Spatiotemporal graph neural networks (STGNNs)</strong><br>
             Presentation of the fundamental components of the general STGNN family of deep learning models
-            for STS. Recipes and strategies for building effective STGNNs are provided.
+            for STS. Recipes and strategies for building effective STGNNs, as well as architectures from the
+            literature, are provided.
           </li>
-          <li><strong>Global and local spatiotemporal models</strong><br>
+          <li><strong>Global and local models</strong><br>
             Discussion on the problem of local effects in spatiotemporal data. Review of the global and
             local modeling paradigms with their strengths and practical implications.
           </li>
-          <li><strong>Forecasting</strong><br>
-            Overview of model architectures from the literature.
+          <li><strong>Model quality assessment</strong><br>
+            Identification of time-space regions, e.g., specific sensors or periods of time, where
+            predictions can be improved.
           </li>
         </ol>
 
         <h5 class="mt-3 mb-2 fw-light"><span class="fw-bold me-2">Part 2</span> Challenges and tools.</h5>
         <ol>
-          <li><strong>Graph learning</strong><br>
+          <li><strong>Latent graph learning</strong><br>
             Why and how to learn a graph structure from data when relational information is unavailable,
             insufficient or unreliable.
           </li>
-          <li><strong>Statistical tools to test the optimality of predictive
-            models</strong><br>
-            Identification of time-space regions, e.g., specific sensors or periods of time, where
-            predictions can be improved.
-          </li>
           <li><strong>Learning in non-stationary environments</strong><br>
             Challenges and methods associated with modeling the evolution of spatiotemporal systems over
             time.
@@ -125,12 +136,11 @@ <h5 class="mt-3 mb-2 fw-light"><span class="fw-bold me-2">Part 2</span> Challeng
             multivariate
             time series imputation.
           </li>
-          <li><strong>Software</strong><br>
-            Overview of open-source Pytorch libraries for graph-based spatiotemporal data processing and
-            short demo
-            with Torch Spatiotemporal.
-          </li>
         </ol>
+
+        <h5 class="mt-3 mb-2 fw-light"><span class="fw-bold me-2">Demo</span> Coding Spatiotemporal GNNs.</h5>
+        <p>Overview of open-source PyTorch libraries for graph-based spatiotemporal data processing and a
+          short demo with Torch Spatiotemporal.</p>
       </div>
     </div>
   </div>