"talking to the void" can feel freeing bc expressing conscious state into tokenized thought is an act of exploration. chaotic territory is transmuted into an orderly map. u r walking into the desert but u feel stationary b/c of a lack of landmarks
https://twitter.com/AbstractFairy/status/1280386973740470272?s=20
there is a fear that you will simply get lost, that there is no one else out there, or that there simply isn't anything worthwhile past the Edge besides sand and storm
pic.twitter.com/4b3LHb8voO
🐈 Retweeted Adele Dewey-Lopez
gpt-3 acts like a highly saturated liquid. the crystal that forms depends on the shape of the thing that is thrust into it. it's not that different from an actual brain that reacts based on triggers
https://twitter.com/AdeleDeweyLopez/status/1411473967001395203?s=20
🐈 added,
when using a low temperature param for gpt-3, it likely acts less like crystal growth, which has a lot of randomness, & more like a bubble solution finding the deterministic(?) minimal surface of a wire frame
https://en.wikipedia.org/wiki/Minimal_surface#:~:text=.%20physical%20models%20of%20area-minimizing%20minimal%20surfaces%20can%20be%20made%20by%20dipping%20a%20wire%20frame%20into%20a%20soap%20solution%2C%20forming%20a%20soap%20film%2C%20which%20is%20a%20minimal%20surface%20whose%20boundary%20is%20the%20wire%20frame
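a minimal sketch of what the temperature knob actually does (assuming numpy; function name is mine, not anything official): logits get divided by temperature before the softmax, so as temperature goes to 0 the sampling collapses toward a deterministic argmax — the soap film settling on one surface instead of random crystal growth.

```python
import numpy as np

def sample_with_temperature(logits, temperature=1.0, rng=None):
    """Sample a token index from logits scaled by temperature.

    As temperature -> 0 this approaches greedy argmax (deterministic);
    high temperature flattens the distribution (more randomness).
    Hypothetical illustration, not gpt-3's actual implementation.
    """
    rng = rng or np.random.default_rng()
    if temperature <= 1e-6:
        # effectively zero temperature: pick the single most likely token
        return int(np.argmax(logits))
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()          # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()            # softmax over the scaled logits
    return int(rng.choice(len(probs), p=probs))
```

at temperature 0 the same prompt always yields the same next token; at higher temperatures the tail of the distribution starts getting sampled.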
TL;DR: hooman brains are no different from bubble solution or those clicky heat pack things
just saw github copilot described as fancy markov chain autocomplete, a description that works for general usage as well
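for reference, a toy sketch of what "markov chain autocomplete" means (names are my own): build a table mapping each context to the words that followed it in a corpus, then generate by repeatedly drawing from that table. a transformer is vastly more than this, but the autocomplete framing fits this shape.

```python
import random
from collections import defaultdict

def build_chain(words, order=1):
    """Map each length-`order` context to the words that followed it."""
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, seed, n=10, rng=random):
    """Autocomplete: extend `seed` by sampling from observed continuations."""
    out = list(seed)
    for _ in range(n):
        key = tuple(out[-len(seed):])      # current context window
        followers = chain.get(key)
        if not followers:                  # context never seen: stop
            break
        out.append(rng.choice(followers))
    return " ".join(out)
```

the chain has no notion of meaning — only observed adjacency — which is exactly the weakness the next tweets poke at.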
thinking about this more, there may be a weakness in this model b/c language isn't a static state independent of the reader. it makes technical sense as an internal model, just like u can consider the input/output a long alphanumeric string
but the actually useful understanding is a much smaller space. blank spaces demarcate words so a string of words is better. words express ideas so input/output as an “idea” is even better
gpt-3 after all is trained on “ideas of the internet” but this makes gpt-3 feel too much like a search engine. you put in an input to specify some kind of unique state, and then it gives you adjacent states like a librarian fetching a book
in practice, what is expressed is not a specific idea with an adjacency of other ideas but a STRUCTURE or a function that can take in new parameters and pump out entirely new results
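one way to picture that "structure, not idea" claim (a hypothetical sketch — the function name and examples are mine): a few-shot prompt template is a reusable function whose fixed scaffolding carries the structure, while new parameters pump out entirely new completions.

```python
def analogy_prompt(concept_a, concept_b):
    """Treat a prompt template as a function: fixed structure,
    swappable parameters, new results. Illustrative only."""
    return (
        "Q: how is a neuron like a transistor?\n"
        "A: both gate the flow of signal based on a threshold.\n"
        f"Q: how is {concept_a} like {concept_b}?\n"
        "A:"
    )
```

feeding `analogy_prompt("gpt-3", "a soap film")` to the model reuses the same structure on parameters the training data never paired — that's the difference from a librarian fetching an existing book.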
🐈 Retweeted 🐈
the similar structure in regular language is the analogy or metaphor. once you hear a good analogy, it's like you have a whole new tool to think with, just like having a whole new function in code
https://twitter.com/a_yawning_cat/status/1390158309202677763?s=20
🐈 added,
🐈 Retweeted 🐈
that is... once you create a "woo portal" gpt-3 can act as a stabilizer that keeps the portal open that would allow my army of demo... er... angels of lights and goodness to step through
https://twitter.com/a_yawning_cat/status/1341959274964996096?s=20
🐈 added,