A recent discussion from Reddit: https://www.reddit.com/r/MachineLearning/comments/dfky70/discussion_exfiltrating_copyright_notices_news/
thanks.
End of conversation
New conversation
The GPT-2 paper itself has some analysis on this! pic.twitter.com/jdAD6BQTiB
"model copies speech for a while before drifting" - sounds about right
New conversation
I have not studied it, but here's my take on it from a few months ago: https://twitter.com/fchollet/status/1129129679955169280
New conversation
People have managed to make it spit out large chunks of text word for word from the training data
You need to look at the temperature and top_k settings:
https://github.com/openai/gpt-2/blob/master/src/interactive_conditional_samples.py#L32
https://github.com/openai/gpt-2/blob/master/src/generate_unconditional_samples.py#L36
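As a rough sketch of what those two knobs do during sampling (a NumPy stand-in for illustration, not the repo's actual TensorFlow code; the function name and defaults here are my own):

```python
import numpy as np

def sample_token(logits, temperature=1.0, top_k=40, rng=None):
    """Sample a token id from raw logits with temperature and top-k filtering.

    temperature < 1 sharpens the distribution (more repetition/copying),
    temperature > 1 flattens it; top_k keeps only the k most likely tokens.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k > 0:
        k = min(top_k, len(logits))
        # Mask out everything below the k-th largest logit.
        kth = np.sort(logits)[-k]
        logits = np.where(logits < kth, -np.inf, logits)
    # Softmax over the surviving logits (max-shifted for stability).
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

With `top_k=1` this degenerates to greedy decoding, which is exactly the setting where verbatim regurgitation of training text is most likely to show up.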
Section 4 of the GPT-2 paper. Definitely worth a read.
Maybe one for @gdb or @Miles_Brundage perhaps