Publications

Fractional Tensor Recurrent Unit (fTRU): A Stable Forecasting Model With Long Memory

Published in IEEE Transactions on Neural Networks and Learning Systems, 2023

Our new model, the fractional tensor recurrent unit (fTRU), is designed to seek a trade-off point between the long memory property and model stability during training. We experimentally show that, compared with various advanced RNNs, the proposed model achieves competitive performance on several forecasting tasks while exhibiting both long memory and stable behavior.
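As a rough, non-authoritative illustration of the idea, the toy sketch below raises the hidden state to a fractional power `alpha` before the recurrent map. The elementwise power, the `tanh` nonlinearity, and all names are assumptions standing in for the paper's tensor-based formulation, in which the fractional degree similarly mediates the memory-stability trade-off.

```python
import numpy as np

def ftru_step(h, x, W, U, b, alpha):
    """One step of a toy fractional-power recurrent cell.

    Illustrative simplification only: the hidden state is raised
    elementwise to a fractional power `alpha` before the recurrent map,
    standing in for the tensor-power interaction in the paper. `alpha`
    trades long memory (higher effective degree) against stability.
    """
    # sign(h) * |h|^alpha keeps fractional powers real-valued for h < 0.
    h_pow = np.sign(h) * np.abs(h) ** alpha
    return np.tanh(W @ h_pow + U @ x + b)

# Toy usage: a short forecasting-style rollout.
rng = np.random.default_rng(0)
d, n = 8, 4                          # hidden size, input size
W = rng.normal(scale=0.3, size=(d, d))
U = rng.normal(scale=0.3, size=(d, n))
b, h = np.zeros(d), np.zeros(d)
for t in range(20):
    x = rng.normal(size=n)
    h = ftru_step(h, x, W, U, b, alpha=0.8)  # fractional degree in (0, 1]
```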

Recommended citation: H. Qiu, C. Li, Y. Weng, Z. Sun and Q. Zhao, "Fractional Tensor Recurrent Unit (fTRU): A Stable Forecasting Model With Long Memory," in IEEE Transactions on Neural Networks and Learning Systems, doi: 10.1109/TNNLS.2023.3338696. https://ieeexplore.ieee.org/document/10361837

On the Memory Mechanism of Tensor-Power Recurrent Models

Published in International Conference on Artificial Intelligence and Statistics, 2021

This work focuses on the long-term memory and stability of tensor-power recurrent models, developing a degree-differentiable model that benefits from long-term effects in a stable manner. It was carried out in cooperation with the RIKEN AIP Tensor Learning Team.
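For intuition, here is a minimal sketch of a degree-p tensor-power recurrent step, under the same caveats as above: the flattened outer-power construction, the `tanh` nonlinearity, and all names are illustrative assumptions rather than the paper's exact model. Raising p strengthens long-term dependencies but makes the dynamics easier to destabilize, which is the tension the degree-differentiable model addresses.

```python
import numpy as np

def tp_rnn_step(h, x, A, U, b, p=2):
    """One step of a toy degree-p tensor-power recurrent cell.

    Illustrative assumption: the hidden state enters through its p-fold
    tensor (outer) power, flattened and contracted with a weight matrix A.
    Larger p strengthens long-term interactions at the cost of stability.
    """
    h_pow = h
    for _ in range(p - 1):
        h_pow = np.outer(h_pow, h).ravel()  # build h^{(x)p} as a flat vector
    return np.tanh(A @ h_pow + U @ x + b)

# Toy usage with degree p = 2.
rng = np.random.default_rng(1)
d, n, p = 6, 3, 2
A = rng.normal(scale=0.1, size=(d, d ** p))  # contracts the degree-p power
U = rng.normal(scale=0.3, size=(d, n))
b, h = np.zeros(d), rng.normal(size=d) * 0.1
for t in range(10):
    h = tp_rnn_step(h, rng.normal(size=n), A, U, b, p=p)
```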

Recommended citation: Qiu, H., Li, C., Weng, Y., Sun, Z., He, X. and Zhao, Q., 2021, March. On the Memory Mechanism of Tensor-Power Recurrent Models. In International Conference on Artificial Intelligence and Statistics (pp. 3682-3690). PMLR. http://proceedings.mlr.press/v130/qiu21a.html