I *think* this is the most concise & readable seq2seq Transformer you'll find anywhere (that doesn't involve a pre-made model).
-
Any plans for genomics examples with keras?
-
François, thanks for this. Curious if transformers could be used to "complete the cart," i.e. given two items in a retail cart, predict the next two. I think it would look something like your example here: https://keras.io/examples/nlp/addition_rnn/
-
Sure, it could work. But a cart is more a set than a sequence, and with only 2-4 elements, you don't really need a Transformer for this. A simple MLP would work.
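A minimal sketch of the MLP approach suggested above, assuming a multi-hot encoding of the cart over a hypothetical catalog of `NUM_ITEMS` products (the catalog size, layer widths, and item IDs here are illustrative, not from the thread):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

NUM_ITEMS = 1000  # hypothetical catalog size

# Encode the cart as a multi-hot vector: order-free, matching the
# "a cart is more a set than a sequence" observation above.
model = keras.Sequential([
    layers.Input(shape=(NUM_ITEMS,)),
    layers.Dense(256, activation="relu"),
    # One sigmoid per catalog item: probability it gets added next.
    layers.Dense(NUM_ITEMS, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Toy inference: a cart containing (made-up) items 3 and 42.
cart = np.zeros((1, NUM_ITEMS), dtype="float32")
cart[0, [3, 42]] = 1.0
probs = model.predict(cart)  # shape (1, NUM_ITEMS)

# Recommend the top-2 items not already in the cart.
probs[0, [3, 42]] = 0.0
top2 = np.argsort(probs[0])[-2:]
```

Training would pair partial carts (multi-hot inputs) with the items added afterwards (multi-hot targets), treating completion as multi-label classification rather than sequence generation.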