𝗗𝗮𝘆-𝟯𝟰𝟵 𝗖𝗼𝗺𝗽𝘂𝘁𝗲𝗿 𝗩𝗶𝘀𝗶𝗼𝗻 𝗟𝗲𝗮𝗿𝗻𝗶𝗻𝗴
Google AI Proposes Temporal Fusion Transformer (TFT): An Attention-Based DNN (Deep Neural Network) Model For Multi-Horizon Forecasting
Follow me for similar posts: 🇮🇳 Ashish Patel
-------------------------------------------------------------------
𝗜𝗻𝘁𝗲𝗿𝗲𝘀𝘁𝗶𝗻𝗴 𝗙𝗮𝗰𝘁𝘀 :
🔸 Paper: Temporal Fusion Transformers for Interpretable Multi-Horizon Time Series Forecasting
🔸 The paper first appeared on arXiv in 2019 and was published in the International Journal of Forecasting in 2021.
🔸 Google Research proposes the Temporal Fusion Transformer (TFT), an attention-based DNN model for multi-horizon forecasting. TFT explicitly aligns the model architecture with the general multi-horizon forecasting task, yielding greater accuracy and interpretability across a wide range of applications.
-------------------------------------------------------------------
𝗜𝗠𝗣𝗢𝗥𝗧𝗔𝗡𝗖𝗘
TFT is designed to efficiently build feature representations for each input type (static, known, or observed inputs). Its major components include:
🔥 Gating mechanisms: these skip over any unneeded model components, allowing flexible depth and network complexity to suit a wide range of datasets.
🔥 Variable selection networks: at each time step, these select the most relevant input variables. While conventional DNNs may overfit to irrelevant features, attention-based variable selection improves generalization by pushing the model to focus most of its learning capacity on the most important features.
🔥 Static covariate encoders: these integrate static features to condition the modeling of temporal dynamics.
🔥 Temporal processing: this learns both long- and short-term temporal relationships from observed and known time-varying inputs. Local processing is handled by a sequence-to-sequence layer, which benefits from its inductive bias for ordered information, while long-range dependencies are handled by a novel interpretable multi-head attention block.
This shortens the effective path length of information, since any past time step that carries relevant information can be attended to directly.
🔥 Prediction intervals: quantile forecasts indicate the range of likely target values at each prediction horizon, helping users understand the output distribution rather than just point forecasts.
-------------------------------------------------------------------
#computervision #artificialintelligence #innovation
-------------------------------------------------------------------
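On the prediction-interval point: quantile forecasts like TFT's are trained with the pinball (quantile) loss. A minimal sketch for scalar targets (the example values are hypothetical, and a head is trained per quantile in practice):

```python
def quantile_loss(y_true, y_pred, q):
    """Pinball loss for quantile q in (0, 1).

    Under-predictions are weighted by q and over-predictions by (1 - q),
    so minimizing this loss drives y_pred toward the q-th quantile
    of the target distribution rather than its mean.
    """
    err = y_true - y_pred
    return max(q * err, (q - 1) * err)

# One output per quantile (e.g. 0.1, 0.5, 0.9) gives a lower bound,
# a median forecast, and an upper bound at every prediction horizon.
quantiles = [0.1, 0.5, 0.9]
losses = [quantile_loss(10.0, 8.0, q) for q in quantiles]
```

Note the asymmetry: for q = 0.9, under-predicting by 2 costs 1.8, while over-predicting by 2 costs only 0.2, which pushes that head toward the upper end of the distribution.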
If you want a deeper explanation of how this model works, check out this post!! https://towardsdatascience.com/temporal-fusion-transformer-googles-model-for-interpretable-time-series-forecasting-5aa17beb621?sk=023993f7a9d124cf5b9a5fbb09cd5383
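On the gating point above: the core idea is a sigmoid gate that scales a candidate transform between 0 and 1, combined with a residual connection, so a near-zero gate effectively skips the component. A minimal NumPy sketch of this idea with hypothetical toy weights (an illustrative simplification, not the paper's full Gated Residual Network):

```python
import numpy as np

def gated_skip(x, W, b, Wg, bg):
    # Gated Linear Unit plus residual: a sigmoid gate in [0, 1] scales each
    # candidate feature; the residual carries the input past the block.
    gate = 1.0 / (1.0 + np.exp(-(x @ Wg + bg)))   # sigmoid gate
    candidate = x @ W + b                          # linear transform
    return x + gate * candidate                    # gate ~ 0 => block skipped

# Toy 2-d example with hypothetical weights: a strongly negative gate bias
# drives the gate to ~0, so the output collapses back to the input, i.e.
# the network has learned to bypass this component entirely.
x = np.array([1.0, -0.5])
W = np.eye(2); b = np.zeros(2)
Wg = np.zeros((2, 2))
closed = gated_skip(x, W, b, Wg, np.full(2, -50.0))  # gate ~ 0 -> closed ~ x
opened = gated_skip(x, W, b, Wg, np.full(2, 50.0))   # gate ~ 1 -> x + Wx + b
```

This learned on/off behavior is what lets one architecture adapt its effective depth to small, simple datasets as well as large, complex ones.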