I’d like to see GPT-3 do interpolations. As in, I supply first and last lines of a story, plus maybe a few waypoints, and it fills in the rest plausibly. What I’ve seen so far is a tradeoff between directedness and coherence.
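The interpolation setup described above — fixed first and last lines plus a few waypoints — could be sketched as a simple prompt builder. This is a hypothetical illustration of how such a prompt might be assembled, not an actual GPT-3 API call:

```python
def build_interpolation_prompt(first_line, last_line, waypoints=()):
    """Assemble a fill-in-the-middle style prompt: ask the model to
    write a story that starts and ends with the given lines and
    passes through each waypoint in order."""
    parts = [
        "Write a short story that begins and ends with the exact "
        "lines below and passes through every waypoint in order.",
        f"First line: {first_line}",
    ]
    # Waypoints constrain the middle of the story without dictating it.
    for i, wp in enumerate(waypoints, 1):
        parts.append(f"Waypoint {i}: {wp}")
    parts.append(f"Last line: {last_line}")
    parts.append("Story:")
    return "\n".join(parts)

prompt = build_interpolation_prompt(
    "The lighthouse went dark at midnight.",
    "By dawn, the keeper finally slept.",
    waypoints=["A ship appears offshore.", "The keeper climbs the stairs."],
)
print(prompt)
```

The directedness/coherence tradeoff mentioned above would then show up in how strictly the model honors the waypoints versus how naturally the connecting text reads.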
-
I think human writing comes from such “full-stack” cognition; it’s not mere wordplay.
-
This is my solution (roughly), which I'll explain in a thread later: pic.twitter.com/yzfwDhK5pA
-
This might be true for creating truly novel fiction, but wouldn’t readers expect the medium to follow established convention in 99% of cases?
-
The biggest thing GPT-3 lacks for fiction generation is context for direction. There are ways to solve that.
-
, and that the little circly things are “wheels” but beyond self-driving app context.