At the International Conference on Artificial Neural Networks (ICANN 2019), held in Munich, Germany, I presented the following two topics.
“Variational Deep Embedding with Regularized Student-t Mixture Model”
“Continual Learning Exploiting Structure of Fractal Reservoir Computing”
The first topic proposed a new variational autoencoder (VAE) that uses a regularized Student-t mixture model as its prior. With this prior, the VAE can also be regarded as a classifier in the latent space it extracts. The problem is that the resulting objective has no closed-form solution, so I applied three types of approximation methods; as a result, the model can be trained end-to-end by backpropagation.
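To make the idea concrete, here is a minimal PyTorch sketch of a VAE whose prior is a mixture of Student-t distributions. The architecture sizes, fixed degrees of freedom, and the use of a single Monte Carlo estimate for the intractable KL term are all my illustrative assumptions, not the paper's actual three approximation methods or its regularization scheme.

```python
import torch
import torch.nn as nn
import torch.distributions as D

class StudentTMixtureVAE(nn.Module):
    """Sketch of a VAE with a mixture-of-Student-t prior.
    Hypothetical sizes/parameters; the paper's three approximation
    methods are replaced by one Monte Carlo estimate of the KL term,
    which keeps the whole objective differentiable."""
    def __init__(self, x_dim=784, z_dim=10, n_components=10, df=5.0):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, 256), nn.ReLU(),
                                 nn.Linear(256, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, x_dim))
        # Trainable mixture-prior parameters: one Student-t per cluster.
        self.logits = nn.Parameter(torch.zeros(n_components))
        self.loc = nn.Parameter(torch.randn(n_components, z_dim))
        self.log_scale = nn.Parameter(torch.zeros(n_components, z_dim))
        self.df = df  # fixed degrees of freedom, for simplicity

    def prior_log_prob(self, z):
        # log p(z) = logsumexp_k [ log pi_k + log StudentT(z; nu, mu_k, s_k) ]
        comp = D.Independent(
            D.StudentT(self.df, self.loc, self.log_scale.exp()), 1)
        log_pi = torch.log_softmax(self.logits, dim=0)
        return torch.logsumexp(log_pi + comp.log_prob(z.unsqueeze(1)), dim=1)

    def forward(self, x):
        mu, log_sigma = self.enc(x).chunk(2, dim=-1)
        q = D.Independent(D.Normal(mu, log_sigma.exp()), 1)
        z = q.rsample()  # reparameterized sample, so gradients flow
        recon = self.dec(z)
        # Negative ELBO with a one-sample Monte Carlo KL estimate:
        # E_q[ log p(x|z) + log p(z) - log q(z|x) ]
        rec_ll = -nn.functional.binary_cross_entropy_with_logits(
            recon, x, reduction='none').sum(-1)
        elbo = rec_ll + self.prior_log_prob(z) - q.log_prob(z)
        return -elbo.mean()
```

The classifier view comes out of the mixture structure: the per-component responsibilities `softmax(log_pi + comp.log_prob(z.unsqueeze(1)), dim=1)` assign each latent code to a cluster, so cluster membership in the latent space plays the role of a class label.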
The second topic proposed a new reservoir computing structure that combines a sparsely firing activation function, a fractal network, and input gates. This structure contains modules corresponding to the given tasks, so each task can easily be memorized in its own module, and that memory is not forgotten while the other tasks are being performed. As a result, the proposed structure outperformed a randomly connected network on continual learning benchmarks.
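The following NumPy sketch shows the modular idea only, under my own simplifying assumptions: a block-diagonal recurrent matrix stands in for the paper's fractal connectivity, a hard task-indexed gate stands in for its input gates, and `tanh` replaces the sparsely firing activation. Module sizes and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

class ModularReservoir:
    """Sketch of a modular echo state network: one reservoir block per
    task, with an input gate that routes each task's input only to its
    own block, so the other blocks' states are left undisturbed."""
    def __init__(self, n_in, n_tasks, module_size=100, rho=0.9, leak=0.3):
        self.n_tasks, self.m = n_tasks, module_size
        n = n_tasks * module_size
        # Block-diagonal recurrent weights: modules do not interact.
        W = np.zeros((n, n))
        for k in range(n_tasks):
            B = rng.standard_normal((module_size, module_size))
            B *= rho / np.max(np.abs(np.linalg.eigvals(B)))  # set spectral radius
            s = slice(k * module_size, (k + 1) * module_size)
            W[s, s] = B
        self.W = W
        self.W_in = rng.uniform(-1, 1, (n, n_in))
        self.leak = leak
        self.x = np.zeros(n)

    def step(self, u, task):
        # Input gate: only the active task's module receives the input.
        gate = np.zeros_like(self.x)
        gate[task * self.m:(task + 1) * self.m] = 1.0
        pre = self.W @ self.x + gate * (self.W_in @ u)
        self.x = (1 - self.leak) * self.x + self.leak * np.tanh(pre)
        return self.x
```

A linear readout can then be fit on the collected states (ridge regression is the standard choice in reservoir computing). Because training only touches inputs routed to one block, learning a new task leaves the weights and dynamics serving earlier tasks unchanged, which is the mechanism behind the reduced forgetting.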