New example on http://keras.io: a neural machine translation example with a seq2seq Transformer. Starts from raw data, and includes data preparation, building the Transformer (from scratch!) & training it, and inference. Less than 200 lines total. https://keras.io/examples/nlp/neural_machine_translation_with_transformer/
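For context, here is a condensed sketch of that kind of seq2seq Transformer assembled from stock Keras layers; it is not the linked example verbatim, and the vocabulary size, sequence length, and other hyperparameters below are illustrative assumptions (data preparation and inference are omitted).

```python
# Minimal seq2seq Transformer sketch (illustrative, not the keras.io example itself).
# Vocabulary size, sequence length, and hyperparameters are assumptions.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, sequence_length = 15000, 20
embed_dim, num_heads, ff_dim = 256, 8, 2048


class PositionalEmbedding(layers.Layer):
    """Token embedding plus a learned position embedding."""

    def __init__(self, seq_len, vocab, dim, **kwargs):
        super().__init__(**kwargs)
        self.token_emb = layers.Embedding(vocab, dim)
        self.pos_emb = layers.Embedding(seq_len, dim)

    def call(self, inputs):
        positions = tf.range(start=0, limit=tf.shape(inputs)[-1], delta=1)
        return self.token_emb(inputs) + self.pos_emb(positions)


def feed_forward(x):
    """Position-wise feed-forward block with a residual connection."""
    h = layers.Dense(ff_dim, activation="relu")(x)
    h = layers.Dense(embed_dim)(h)
    return layers.LayerNormalization()(x + h)


# Encoder: self-attention over the source sentence.
source = keras.Input(shape=(None,), dtype="int64", name="source")
x = PositionalEmbedding(sequence_length, vocab_size, embed_dim)(source)
attn = layers.MultiHeadAttention(num_heads, embed_dim)(x, x)
x = layers.LayerNormalization()(x + attn)
encoder_output = feed_forward(x)

# Decoder: causal self-attention over the target prefix, then cross-attention
# to the encoder output, then a softmax over the target vocabulary.
target = keras.Input(shape=(None,), dtype="int64", name="target")
y = PositionalEmbedding(sequence_length, vocab_size, embed_dim)(target)
self_attn = layers.MultiHeadAttention(num_heads, embed_dim)(y, y, use_causal_mask=True)
y = layers.LayerNormalization()(y + self_attn)
cross_attn = layers.MultiHeadAttention(num_heads, embed_dim)(y, encoder_output)
y = layers.LayerNormalization()(y + cross_attn)
y = feed_forward(y)
predictions = layers.Dense(vocab_size, activation="softmax")(y)

transformer = keras.Model([source, target], predictions)
transformer.compile("rmsprop", "sparse_categorical_crossentropy", metrics=["accuracy"])
```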
Sure, it could work. But a cart is more a set than a sequence, and with only 2-4 elements, you don't really need a Transformer for this. A simple MLP would work.
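For concreteness, a minimal sketch of that suggestion, treating the cart as a set: embed each item, pool the embeddings in an order-invariant way, then put a small MLP on top. The catalogue size, cart size, and next-item target below are assumptions, since the thread doesn't specify them.

```python
# Order-invariant "cart as a set" baseline: embed items, average-pool, MLP head.
# Catalogue size, cart size, and the next-item target are illustrative assumptions.
from tensorflow import keras
from tensorflow.keras import layers

num_items, max_cart_size = 10_000, 4

cart = keras.Input(shape=(max_cart_size,), dtype="int64")      # item ids, 0 = padding
x = layers.Embedding(num_items, 64, mask_zero=True)(cart)      # per-item embeddings
x = layers.GlobalAveragePooling1D()(x)                         # pooling ignores item order
x = layers.Dense(128, activation="relu")(x)
next_item = layers.Dense(num_items, activation="softmax")(x)   # e.g. predict the next item

mlp = keras.Model(cart, next_item)
mlp.compile("adam", "sparse_categorical_crossentropy")
```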
Gotcha. I work on big carts, and there's a time element too, so someone might purchase 100-1000 items over a week or month. Item order is somewhat fungible within the same day, but less so over time.
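One way to encode that structure, sketched purely under stated assumptions (nothing below comes from the thread): give each purchase a day-offset embedding instead of a strict positional one, so same-day items share a "position" while cross-day ordering is preserved, and let a self-attention block mix them.

```python
# Sketch: long purchase histories where same-day order is interchangeable.
# Each item gets an item embedding plus a day-offset embedding; one
# self-attention block mixes them. All sizes and the next-item target are
# illustrative assumptions; padding/masking is omitted for brevity.
from tensorflow import keras
from tensorflow.keras import layers

num_items, num_days, max_len, dim = 50_000, 60, 1000, 128

item_ids = keras.Input(shape=(max_len,), dtype="int64")   # purchased item ids
day_idx = keras.Input(shape=(max_len,), dtype="int64")    # day offset of each purchase

x = layers.Embedding(num_items, dim)(item_ids)
x = x + layers.Embedding(num_days, dim)(day_idx)          # day-level "position"

attn = layers.MultiHeadAttention(num_heads=4, key_dim=dim)(x, x)
x = layers.LayerNormalization()(x + attn)
x = layers.GlobalAveragePooling1D()(x)
next_item = layers.Dense(num_items, activation="softmax")(x)

model = keras.Model([item_ids, day_idx], next_item)
model.compile("adam", "sparse_categorical_crossentropy")
```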