New blog post: using pre-trained word embeddings in Keras to solve a text classification problem http://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html
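A minimal sketch of the idea the post covers: parse pre-trained GloVe vectors into a matrix and use it to initialize a frozen Keras Embedding layer. The file name 'glove.6B.100d.txt', the toy corpus, the vocabulary cap, and the sequence length are assumptions for illustration, not the post's exact values.

import numpy as np
from keras.preprocessing.text import Tokenizer
from keras.layers import Embedding

EMBEDDING_DIM = 100
MAX_SEQUENCE_LENGTH = 1000

# Toy corpus stand-in; in practice this is your raw text dataset.
texts = ["replace with your own corpus", "of raw text documents"]
tokenizer = Tokenizer(num_words=20000)
tokenizer.fit_on_texts(texts)
word_index = tokenizer.word_index

# Parse the GloVe file into a word -> vector dict.
embeddings_index = {}
with open('glove.6B.100d.txt') as f:
    for line in f:
        values = line.split()
        embeddings_index[values[0]] = np.asarray(values[1:], dtype='float32')

# Build an embedding matrix aligned with the tokenizer's word indices.
embedding_matrix = np.zeros((len(word_index) + 1, EMBEDDING_DIM))
for word, i in word_index.items():
    vector = embeddings_index.get(word)
    if vector is not None:
        embedding_matrix[i] = vector  # words without a GloVe vector stay all-zeros

# Freeze the pre-trained vectors so only the layers on top are trained.
embedding_layer = Embedding(len(word_index) + 1,
                            EMBEDDING_DIM,
                            weights=[embedding_matrix],
                            input_length=MAX_SEQUENCE_LENGTH,
                            trainable=False)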
-
feel free to run the code on the "official" train/test split. I think it's likely to be better than what Andrew & Quoc tried
-
a deep convnet with global max pooling is a very strong sequence classification model; it generally outperforms LSTM+attention
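A hedged sketch of that kind of model: stacked Conv1D layers finished with GlobalMaxPooling1D before the classifier. The filter counts, layer depth, vocabulary size, sequence length, and the 20-class softmax output are illustrative assumptions, not the exact architecture from the post or the comparison runs.

from keras.models import Sequential
from keras.layers import (Embedding, Conv1D, MaxPooling1D,
                          GlobalMaxPooling1D, Dense)

model = Sequential([
    Embedding(20000, 100, input_length=1000),
    Conv1D(128, 5, activation='relu'),
    MaxPooling1D(5),
    Conv1D(128, 5, activation='relu'),
    MaxPooling1D(5),
    Conv1D(128, 5, activation='relu'),
    GlobalMaxPooling1D(),   # collapse the time dimension with a max over each filter
    Dense(128, activation='relu'),
    Dense(20, activation='softmax'),  # e.g. one unit per class
])
model.compile(loss='categorical_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

The Embedding layer here could be swapped for the frozen, GloVe-initialized layer from the earlier sketch to combine both ideas.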