Fractional Tensor Recurrent Unit (fTRU): A Stable Forecasting Model With Long Memory
Published in IEEE Transactions on Neural Networks and Learning Systems, 2023
Our new model, the fractional tensor recurrent unit (fTRU), is designed to strike a balance between the long memory property and model stability during training. We experimentally show that, compared with various advanced RNNs, the proposed model achieves competitive performance on several forecasting tasks while exhibiting both long memory and stable behavior.
Recommended citation: H. Qiu, C. Li, Y. Weng, Z. Sun and Q. Zhao, "Fractional Tensor Recurrent Unit (fTRU): A Stable Forecasting Model With Long Memory," in IEEE Transactions on Neural Networks and Learning Systems, doi: 10.1109/TNNLS.2023.3338696. https://ieeexplore.ieee.org/document/10361837