Demo of Megatron-CTRL. Text generation adapts seamlessly to dynamic human input through keywords. We also add retrieval from an external knowledge base to improve consistency. pic.twitter.com/S0yy5rOZ1z
91% of our stories from Megatron-CTRL are successfully controlled by new keywords, and 93% are consistent, according to MTurk evaluations. This builds on the Megatron project from
@NVIDIAAI where we trained an 8-billion-parameter model using model parallelism on 512 GPUs. https://arxiv.org/abs/1909.08053
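The tweet's claim about training with model parallelism refers to Megatron's tensor-parallel split of each transformer MLP block. A minimal numpy sketch of the idea, simulating GPUs as array shards (all sizes and names here are illustrative, not from the paper): the first weight matrix is split column-wise across ranks, the second row-wise, so a single sum (the all-reduce) recovers the unsharded result.

```python
import numpy as np

# Hedged sketch of Megatron-style tensor model parallelism for one MLP block.
# "Ranks" are simulated as plain array shards; sizes are toy values.

rng = np.random.default_rng(0)
n_parallel = 4                    # number of simulated GPUs
d_model, d_ff = 8, 16             # d_ff must be divisible by n_parallel

x = rng.standard_normal((2, d_model))      # a batch of activations
A = rng.standard_normal((d_model, d_ff))   # first linear layer
B = rng.standard_normal((d_ff, d_model))   # second linear layer

def gelu(z):
    # tanh approximation of GeLU; elementwise, so it commutes with the column split
    return 0.5 * z * (1 + np.tanh(np.sqrt(2 / np.pi) * (z + 0.044715 * z**3)))

# Reference: unsharded forward pass.
full = gelu(x @ A) @ B

# Sharded: each rank holds a column slice of A and the matching row slice of B.
A_shards = np.split(A, n_parallel, axis=1)
B_shards = np.split(B, n_parallel, axis=0)
partial = [gelu(x @ A_i) @ B_i for A_i, B_i in zip(A_shards, B_shards)]
sharded = np.sum(partial, axis=0)          # the all-reduce across ranks

assert np.allclose(full, sharded)
```

Because GeLU is elementwise, it can be applied independently on each column shard, which is why this layout needs only one communication step per MLP block.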
Interesting work.
Very interesting, but not real-time. I've often wondered if something like Wikipedia could be an example of "external knowledge" but even that can be controversial, and not up to date.
Would this be synonymous with "context stuffing" in GPT-3? "Finally, a generator takes the story context, as well as the top-ranked knowledge sentences as input, and generates the next sentence in the story. The output sentence is appended to the story context ... "
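The sentence quoted in that reply describes a retrieve-then-generate loop. A minimal Python sketch of that loop, where `rank_knowledge` and `generate_sentence` are hypothetical stand-ins (the real system uses a learned ranker and the 8B conditional language model):

```python
# Sketch of the loop from the quoted sentence: retrieve knowledge relevant to
# the current story context, condition generation on context + knowledge, and
# append the generated sentence back to the context.

def rank_knowledge(context, knowledge_base, top_k=2):
    # Toy ranker: score each knowledge sentence by word overlap with the context.
    ctx_words = set(" ".join(context).lower().split())
    scored = sorted(knowledge_base,
                    key=lambda s: len(ctx_words & set(s.lower().split())),
                    reverse=True)
    return scored[:top_k]

def generate_sentence(context, knowledge):
    # Toy generator standing in for the conditional LM.
    return (f"[next sentence conditioned on {len(context)} context sentences "
            f"and {len(knowledge)} knowledge sentences]")

def tell_story(prompt, knowledge_base, n_sentences=3):
    context = [prompt]
    for _ in range(n_sentences):
        knowledge = rank_knowledge(context, knowledge_base)
        sentence = generate_sentence(context, knowledge)
        context.append(sentence)   # "The output sentence is appended to the story context"
    return context

story = tell_story("A knight set out at dawn.",
                   ["Knights wore armor.", "Dawn follows night.", "Cats purr."])
```

In this reading it is close to context stuffing, except that the stuffed knowledge is re-ranked at every step rather than fixed up front.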