Evaluating the Transformer model and Self-Attention mechanism for runoff prediction in the Yangtze River basin
Study region: The Yangtze River basin of China. Study focus: We applied a recently popular deep learning (DL) algorithm, the Transformer (TSF), and two commonly used DL methods, Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), to evaluate the performance of TSF in predicting runoff in the Yangtze River basin. We also added the main structure of TSF, Self-Attention (SA), to the LSTM and GRU models.
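To illustrate the idea of attaching Self-Attention to a recurrent model, the following is a minimal sketch (not the authors' implementation): an LSTM encoder whose hidden states are passed through a self-attention layer before a linear head predicts runoff. The hidden size, number of attention heads, and input features are illustrative assumptions.

```python
# Hypothetical sketch of an LSTM + Self-Attention (SA) runoff predictor.
# Layer sizes, attention heads, and input features are assumptions, not the paper's settings.
import torch
import torch.nn as nn

class LSTMSelfAttention(nn.Module):
    def __init__(self, n_features: int, hidden_size: int = 64, n_heads: int = 4):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Self-attention over the LSTM hidden states (the core block of the Transformer)
        self.attn = nn.MultiheadAttention(hidden_size, n_heads, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_features), e.g. antecedent hydro-meteorological sequences
        h, _ = self.lstm(x)            # (batch, time, hidden_size)
        h, _ = self.attn(h, h, h)      # self-attention: query = key = value
        return self.head(h[:, -1, :])  # runoff estimate from the last time step

# Example: a batch of 8 sequences, 30 daily steps, 5 input features
model = LSTMSelfAttention(n_features=5)
y_hat = model(torch.randn(8, 30, 5))   # -> shape (8, 1)
```

An analogous GRU variant would simply replace `nn.LSTM` with `nn.GRU`; the self-attention layer operates on the recurrent hidden states in the same way.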