Incremental General Non-negative Matrix Factorization without Dimension Matching Constraints
In this paper, we propose General Non-negative Matrix Factorization based on the left Semi-Tensor Product (lGNMF) and General Non-negative Matrix Factorization based on the right Semi-Tensor Product (rGNMF), which factorize an input non-negative matrix into two non-negative matrices of lower rank using a gradient-based method. In particular, the proposed models remove the dimension-matching constraints required by conventional NMF models. Both theoretical derivation and experimental results show that conventional NMF is a special case of the proposed lGNMF and rGNMF. Through experiments on the baboon and Lenna images, we identify the settings under which lGNMF and rGNMF achieve the best image-restoration performance. Moreover, inspired by Incremental Non-negative Matrix Factorization (INMF), we propose Incremental lGNMF (IlGNMF) and Incremental rGNMF (IrGNMF). Experiments on the JAFFE and ORL databases show that IlGNMF and IrGNMF save storage space and reduce computation time in incremental facial training.
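As context for the special case the abstract refers to, the sketch below shows conventional NMF with the standard multiplicative update rules (Lee and Seung), which lGNMF and rGNMF generalize by replacing the ordinary matrix product with left/right semi-tensor products. The function name, iteration count, and initialization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def nmf_multiplicative(V, rank, iters=200, seed=0):
    # Conventional NMF: approximate V (m x n, non-negative) as W @ H,
    # where W is m x rank and H is rank x n, both non-negative.
    # Note the dimension-matching constraint: the inner dimensions of
    # W and H must agree, which is exactly what lGNMF/rGNMF relax.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 1e-4
    H = rng.random((rank, n)) + 1e-4
    eps = 1e-10  # avoid division by zero
    for _ in range(iters):
        # Multiplicative updates keep W and H non-negative by construction.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

These updates monotonically decrease the Frobenius reconstruction error, which is the usual gradient-derived baseline against which generalized variants are compared.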
Chen, Z., Li, L., Peng, H., Liu, Y., & Yang, Y. (2018). Incremental general non-negative matrix factorization without dimension matching constraints. Neurocomputing, 311, 344–352. https://doi.org/10.1016/j.neucom.2018.05.067