Key features:
- Supports sparse outputs (int sequences), to be fed into an Embedding layer
- Supports dense outputs (binary, tf-idf, count)
- Built-in n-gram generation

Full credits to Mark Omernick for the code example and for doing much of the work on this project.
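A minimal sketch of those output modes, assuming the `tf.keras.layers.TextVectorization` API as it ships in current TensorFlow (the exact module path and mode names have shifted across releases):

```python
import tensorflow as tf

corpus = tf.constant(["the cat sat on the mat", "the dog ate my homework"])

# Sparse-style output: each string becomes a padded sequence of token indices,
# ready to feed into an Embedding layer.
to_ints = tf.keras.layers.TextVectorization(output_mode="int")
to_ints.adapt(corpus)       # builds the vocabulary from the corpus
ints = to_ints(corpus)      # shape: (batch, max_sequence_length), dtype int64

# Dense output: per-string token counts, here over unigrams and bigrams
# (passing an integer N to `ngrams` generates n-grams up to order N).
to_counts = tf.keras.layers.TextVectorization(output_mode="count", ngrams=2)
to_counts.adapt(corpus)
counts = to_counts(corpus)  # shape: (batch, vocabulary_size)
```

Swapping `output_mode` to `"tf_idf"` (or the binary/multi-hot mode) gives the other dense variants with the same `adapt`-then-call workflow.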
Such a layer makes your text-processing model end-to-end: it ingests raw strings and outputs class predictions, so you can deploy your model without worrying about an external preprocessing pipeline.
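A sketch of that end-to-end pattern, with a made-up toy dataset for illustration; in practice you would `adapt` on your real training corpus:

```python
import tensorflow as tf

# Hypothetical toy corpus; stands in for real training text.
texts = tf.constant(["good movie", "great film", "bad movie", "terrible film"])

vectorize = tf.keras.layers.TextVectorization(
    output_mode="int", output_sequence_length=4)
vectorize.adapt(texts)

# The model ingests raw strings: preprocessing lives inside the graph,
# so the exported model needs no separate tokenizer.
inputs = tf.keras.Input(shape=(1,), dtype=tf.string)
x = vectorize(inputs)
x = tf.keras.layers.Embedding(
    input_dim=vectorize.vocabulary_size(), output_dim=8)(x)
x = tf.keras.layers.GlobalAveragePooling1D()(x)
outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)
model = tf.keras.Model(inputs, outputs)

# Predictions come directly from strings.
preds = model(tf.constant([["an unseen movie"]]))
```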
Does it take a list of strings, or a list of lists? #realQuestions
It takes a tf.string tensor or ragged tensor.
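For the rank-1 `tf.string` tensor case, a minimal sketch (vocabulary and inputs are made up):

```python
import tensorflow as tf

layer = tf.keras.layers.TextVectorization(output_mode="int")
layer.adapt(tf.constant(["to be or not to be"]))

# A rank-1 tf.string tensor: one string per example.
out = layer(tf.constant(["to be", "not to be or to be"]))
# Rows are padded to the longest sequence in the batch;
# out-of-vocabulary tokens map to the OOV index.
```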
awesome! Reminds me of some of gensim’s preprocessing features
This is slick
Thank U!!!!!
This is beautiful! I'm glad we're moving toward a lot of preprocessing and data sanitization steps being done as layers.
I love this!!!
This is a nice improvement, though I hope the Keras team is looking at Lucene/SOLR for more good ideas on auto-processed text fields.