Reminder: language serves a variety of purposes -- transmitting information, acting on the world to achieve specific goals, serving as a social lubricant, etc. Language cannot be modeled as a statistical distribution independent of these purposes.
This is akin to modeling the appearance of animals as a statistical distribution while ignoring the environment in which they live. You could use such a model to generate plausible-looking animals, but don't expect them to survive in the wild -- they lack environmental fitness.
Animals evolved to fit their environment -- everything about them (us) is a product of environmental constraints. Likewise, language is a construct that evolved to fit a specific set of functions, and you cannot model it independently of this context.
New conversation
Seems like an unfair test. A person couldn't provide accurate troubleshooting for some random product based on just a few prompts either.
Currently, GPT-3 can't be trained on existing support conversations or docs to "learn" about a product or business. My understanding is that the OpenAI team is working on letting customers do that, though, which would make it a lot more useful for this use case.
New conversation
The problem is people falling for the hype train and assuming GPT-3 has really learnt something and can "replace" humans in tasks. If you change the question to "Could GPT-3 *help* write your customer service answers?", you will get a different answer.