
Love this example via @thepromptking - this whole thing is a prompt (when conditioning chatbots toward dialogue it's often necessary to give them a whole bunch of example dialogue)
Quote Tweet
Wow prompt engineering. From the Gopher paper: "we include the complete prompt used to condition Gopher towards dialogue ...this prompt consumes roughly 800 tokens of the 2048-token context... In practice this leaves plenty of room for subsequent dialogue." this is the prompt:
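The arithmetic above (an ~800-token conditioning prompt inside a 2048-token window) can be sketched as a simple budget check. This is only a back-of-the-envelope illustration: the word-to-token ratio below is a rough heuristic I'm assuming, and a real deployment would count with the model's own tokenizer.

```python
def approx_token_count(text: str) -> int:
    # Rough heuristic: English averages about 0.75 words per token,
    # so this is an estimate, not a real BPE count.
    return int(len(text.split()) / 0.75)

def remaining_context(conditioning_prompt: str, context_window: int = 2048) -> int:
    # Tokens left for subsequent dialogue after the fixed
    # conditioning prompt is prepended (Gopher's used ~800 of 2048).
    return context_window - approx_token_count(conditioning_prompt)
```

With an 800-token conditioning prompt this would report roughly 1,200 tokens left for the actual conversation, matching the paper's "plenty of room" remark.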
This one is clever: teach GPT-3 how to handle more complex questions by showing it an example that both asks and answers relevant follow-up questions
Prompt:

Question: When was the founder of craigslist born?
Are follow up questions needed here: Yes.
Follow up: Who was the founder of craigslist?
Intermediate answer: Craigslist was founded by Craig Newmark.
Follow up: When was Craig Newmark born?
Intermediate answer: Craig Newmark was born on December 6, 1952.
So the final answer is: December 6, 1952
Question: What is the capital of the country where the Taj Mahal is at?
Are follow up questions needed here: Yes.
Follow up: 

GPT-3 response starts here:

Where is the Taj Mahal?
Intermediate answer: The Taj Mahal is in Agra, India.
Follow up: What is the capital of India?
Intermediate answer: The capital of India is New Delhi.
So the final answer is: New Delhi
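The pattern above is mechanical enough to script: prepend the worked example, append the new question ending at "Follow up: " so the model continues the decomposition, then pull the answer off the final line. A minimal sketch, assuming a hypothetical model-calling function supplies the completion (the helpers here are illustrative names, not any library's API):

```python
# One worked example that both asks and answers its own follow-ups,
# copied from the prompt shown above.
SELF_ASK_EXAMPLE = """Question: When was the founder of craigslist born?
Are follow up questions needed here: Yes.
Follow up: Who was the founder of craigslist?
Intermediate answer: Craigslist was founded by Craig Newmark.
Follow up: When was Craig Newmark born?
Intermediate answer: Craig Newmark was born on December 6, 1952.
So the final answer is: December 6, 1952
"""

def build_self_ask_prompt(question: str) -> str:
    # End at "Follow up: " so the model's completion starts by
    # posing the first subquestion itself.
    return (SELF_ASK_EXAMPLE
            + f"\nQuestion: {question}\n"
            + "Are follow up questions needed here: Yes.\n"
            + "Follow up: ")

def extract_final_answer(completion: str) -> str:
    # The completion ends with "So the final answer is: <answer>".
    marker = "So the final answer is:"
    idx = completion.rfind(marker)
    if idx == -1:
        return completion.strip()  # model broke the format; return raw text
    return completion[idx + len(marker):].strip()
```

Feeding the GPT-3 response shown above into `extract_final_answer` yields "New Delhi"; the same "Follow up:" structure is what lets a search engine be slotted in to answer each subquestion.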
Quote Tweet
We've found a new way to prompt language models that improves their ability to answer complex questions. Our Self-ask prompt first has the model ask and answer simpler subquestions. This structure makes it easy to integrate Google Search into an LM. Watch our demo with GPT-3 🧵⬇️
Sorry, no URL, but there are some examples of image/text interplay, e.g. taking phrases used in DALL-E/Stable Diffusion ("art deco"), then goading GPT-3 into explaining their meaning with respect to style, content, materials, topics etc., and using that info for improved image prompting. Human very much in the loop.
That is one heck of a prompt. It would be nice if this type of prompt could be baked in via a fine-tune rather than supplied on each invocation, but besides the question of whether that can actually be done, there is also the issue of reuse.