Affiliations: Department of Computer Science, University of Pisa, Largo B. Pontecorvo, Pisa, Italy
Correspondence: Alessio Micheli, Department of Computer Science, University of Pisa, Largo B. Pontecorvo 3, 56127 Pisa, Italy. E-mail: micheli@di.unipi.it.
Abstract: Recurrent Neural Networks (RNNs) represent a natural paradigm for modeling sequential data such as text written in natural language. Indeed, RNNs and their variants have long been the architecture of choice in many applications; in practice, however, they require elaborate architectures (such as gating mechanisms) and computationally heavy training. In this paper we address the question of whether it is possible to generate sentence embeddings through completely untrained recurrent dynamics, on top of which a simple learning algorithm is applied for text classification. This would make it possible to obtain models that are extremely efficient in terms of training time. We investigate the extent to which this approach can be used by analyzing its results on different tasks. Finally, we show that, within certain limits, it is possible to build extremely efficient models for text classification that remain competitive in accuracy with state-of-the-art reference models.
Keywords: Text classification, sentence embeddings, recurrent neural networks, echo state networks
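For illustration, a minimal sketch of the idea summarized in the abstract: a fixed, randomly initialized (echo-state-style) recurrence maps a sequence of pre-trained word vectors to a sentence embedding, and only a simple linear readout is trained for classification. The dimensions, weight scalings, and ridge-regression readout below are illustrative assumptions, not the exact configuration used in the paper.

# Minimal sketch: untrained recurrent encoder + trained linear readout (assumed setup).
import numpy as np

rng = np.random.default_rng(0)
emb_dim, res_dim = 50, 300                               # assumed input/reservoir sizes

# Untrained weights: random input projection and a recurrent matrix rescaled
# to spectral radius < 1 (a common echo state network stability heuristic).
W_in = rng.uniform(-0.1, 0.1, (res_dim, emb_dim))
W_hat = rng.uniform(-1.0, 1.0, (res_dim, res_dim))
W_rec = 0.9 * W_hat / np.max(np.abs(np.linalg.eigvals(W_hat)))

def sentence_embedding(word_vectors):
    # Run the fixed recurrence over the sentence; use the last state as the embedding.
    h = np.zeros(res_dim)
    for x in word_vectors:                               # x: pre-trained word embedding
        h = np.tanh(W_in @ x + W_rec @ h)
    return h

def train_readout(H, Y, reg=1e-3):
    # Ridge-regression readout, the only trained component:
    # H is (n_sentences, res_dim), Y is (n_sentences, n_classes) one-hot targets.
    return np.linalg.solve(H.T @ H + reg * np.eye(H.shape[1]), H.T @ Y)

At prediction time, a sentence is encoded with the same untrained recurrence and classified by taking the argmax of sentence_embedding(words) @ W_out, so training cost reduces to solving a single linear system.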