Cognitive Computation | 2019

Biological Neuron Coding Inspired Binary Word Embeddings


Abstract


Word embeddings are semantic representations of words. They are derived from large corpora and work well on many natural language tasks, with the downside of requiring large memory space. In this paper, we propose binary word embedding models inspired by biological neuron coding mechanisms: they convert the spike timing of neurons during specific time intervals into binary codes, reducing space and speeding up computation. We build three types of models to post-process the original dense word embeddings, namely, a homogeneous Poisson process-based rate coding model, a leaky integrate-and-fire neuron-based model, and an Izhikevich neuron-based model. We test our binary embedding models on word similarity and text classification tasks over five public datasets. The experimental results show that the brain-inspired binary word embeddings (which reduce space by approximately 68.75%) achieve results similar to the original embeddings on the word similarity task and better performance than traditional binary embeddings on the text classification task.
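
To make the rate-coding idea concrete, below is a minimal Python sketch of homogeneous Poisson rate coding applied to one dense embedding vector. The abstract does not give the paper's exact procedure, so the bin count, the min-max normalization, and the function name poisson_rate_code are illustrative assumptions, not the authors' method.

    import numpy as np

    def poisson_rate_code(embedding, n_bins=10, rng=None):
        """Sketch of homogeneous Poisson rate coding (assumed parameters).

        Each dimension's value is mapped to a firing rate; the observation
        window is split into n_bins intervals, and each interval becomes
        one bit (1 if at least one spike occurred, 0 otherwise).
        """
        rng = np.random.default_rng() if rng is None else rng
        v = np.asarray(embedding, dtype=float)
        # Normalize values to [0, 1] so they can serve as per-bin rates
        # (an assumption; the paper may scale differently).
        rates = (v - v.min()) / (v.max() - v.min() + 1e-12)
        # For a homogeneous Poisson process with rate r per interval,
        # P(at least one spike in the interval) = 1 - exp(-r).
        p_spike = 1.0 - np.exp(-rates)
        # Draw one Bernoulli bit per (dimension, bin).
        bits = rng.random((v.shape[0], n_bins)) < p_spike[:, None]
        return bits.astype(np.uint8)

    # Example: binarize a 300-d embedding into 300 x 10 bits.
    dense = np.random.randn(300)
    codes = poisson_rate_code(dense, n_bins=10)
    print(codes.shape)  # (300, 10)

With 10 bins per dimension, each 32-bit float is replaced by 10 bits, which would be consistent with the roughly 68.75% space reduction reported above (1 - 10/32 = 0.6875).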

Volume 11
Pages 676 - 684
DOI 10.1007/s12559-019-09643-1
Language English
Journal Cognitive Computation
