Here's another batch sampled from the same model, but with the top_k truncation set to 0 instead of 40. This allows the model to use lower-probability predictions. They are noticeably lower-probability. pic.twitter.com/QRae3Vy4UM
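(For anyone curious what top_k truncation actually does: keep only the k most probable next tokens, renormalize, and sample from those. A minimal NumPy sketch of the idea — not the actual generation code:)

```python
import numpy as np

def top_k_sample(probs, k, rng=np.random.default_rng(0)):
    """Sample a token index, keeping only the k most probable tokens.
    k=0 (or k >= vocab size) disables truncation entirely."""
    probs = np.asarray(probs, dtype=float)
    if k and k < len(probs):
        cutoff = np.sort(probs)[-k]          # k-th largest probability
        probs = np.where(probs >= cutoff, probs, 0.0)
    probs = probs / probs.sum()              # renormalize over survivors
    return rng.choice(len(probs), p=probs)

# With k=1 only the single most probable token can ever be chosen;
# with k=0 every token, however unlikely, stays in play.
```

So turning truncation off (k=0) is what lets those weird low-probability words through.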
It’s Hang in Wind! Humans cannot comprehend what it does. https://twitter.com/emersTweeting/status/1147691670059782145
Another batch of patterns, this time with truncation top_k = 60, a compromise that produces weird names but more-doable patterns than with truncation turned off. pic.twitter.com/c6joztWg9k
If I don't give it a specific prompt, sometimes the neural net starts generating things that it remembers from its original training. I love its recovery method: twist the conversation back toward crochet hats. pic.twitter.com/Wi5N4PvJAv
I’m not kidding about the all-conversations-lead-to-crochet-hats thing either. My prompt is the bit in bold. The neural net did the rest. pic.twitter.com/2DwWBSJ7ue
Meanwhile Brim Hat Pattern #1708 is revealing itself to be an all-devouring eldritch horror https://twitter.com/persipan/status/1147750225752264706?s=21
And if you can wear Crochet Cap Pattern #3615, you are already lost to the mortal realms https://twitter.com/manyartsofem/status/1147888187609669632
Here are some more hats from truncation top_k=60. I suspect that some of these don't result in hats? I probably should add more truncation to rein in the creativity a bit, but I'm enjoying these names so much. pic.twitter.com/OdQ38sEQmf
Why do all the neural net's hats end up ballooning into universe-devouring monstrosities? https://twitter.com/joannastar/status/1147925989005176832
Decided to try a new method for inducing the neural net to generate interesting pattern titles: a conservative k=40 truncation but prompted with the phrase "Lord Vader!" Had the desired effect on the titles, as well as unintended effects on some of the patterns themselves. pic.twitter.com/LVHsjPEktF
oh no it's going to be space-devouring hyperbolic ripples again, isn't it? all of these patterns seem to be traps https://twitter.com/MiniGirlGeek/status/1147961366243893248
tried to avoid hyperbolic chaos by reducing the sampling temperature from 1.0 to 0.8. now the neural net is playing it safer (arguably TOO safe in those 1st two patterns), but is it still going hyperbolic? pic.twitter.com/dYEeMBEJIV
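(Background for the temperature knob: it divides the model's logits before the softmax, so lower values sharpen the distribution toward the safest choices. An illustrative sketch, not the actual code:)

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Lower temperature sharpens the distribution toward the top choice;
    temperature 1.0 leaves the model's probabilities unchanged."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                 # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

hot = softmax_with_temperature([2.0, 1.0, 0.0], temperature=1.0)
cool = softmax_with_temperature([2.0, 1.0, 0.0], temperature=0.8)
# At temperature 0.8 the top token's probability is higher than at 1.0 —
# hence the blander, "safer" patterns.
```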
Update: these patterns totally are going hyperbolic. If anything, faster. https://twitter.com/Totesmyname/status/1148000360612581376
Apparently exploding hats are nearly inevitable. @robin_h_p explains it well in the linked thread: each small excess increase ends up blowing up because it's multiplied by later rounds. Meanwhile HAT3000 thinks it's still matching human patterns really well. https://twitter.com/robin_h_p/status/1148036891729629184
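(The compounding is easy to see with toy arithmetic — numbers invented for illustration, not from any actual pattern. A hat whose rounds each carry even a modest percentage more stitches than the last grows exponentially, where a sane hat would level off:)

```python
def stitch_counts(start, growth, rounds):
    """Stitch count per round when each round has `growth` times
    as many stitches as the round before it."""
    counts = [start]
    for _ in range(rounds - 1):
        counts.append(round(counts[-1] * growth))
    return counts

# A flat-topped hat increases for a few rounds and then holds steady;
# a 15% increase every single round never stops ballooning.
runaway = stitch_counts(6, 1.15, 20)
print(runaway)  # the last rounds dwarf the first ones
```

That's the hyperbolic brain-coral effect in one line of multiplication.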
I trained HAT3000 for another 85 iterations to see if it might be able to fix its explodey hat problem. (sampled here with temperature 1.0, top_k 60, and an eccentric text prompt) The one called "The End..." looks dangerous even to me. pic.twitter.com/LN4Y4L2d07
Tom Boy Pocket Bamboozles did not explode! It does have tentacles though https://twitter.com/lbreeggem/status/1148620603332730881
Meanwhile, Lumpy Top is another hyperbolic brain coral. @PaulaSimone calculates 1605 stitches in each of the outermost rounds. https://twitter.com/PaulaSimone/status/1148806597138759683
Over 3500 stitches in one row. This hyperbolic neural net crochet pattern is now eating both yarn and sanity. https://twitter.com/Persipan/status/1149242833360097281
Results from later training checkpoints (281 & 320). Prompting with eccentric text ("Nobody expected stegosaurs in spaceships." etc.) seemed to induce it to generate interesting titles. Same temp=1 & top_k=60. I don’t know how it decided to try counting piglets instead of stitches. pic.twitter.com/5R5uyoE7kT
Brim Hat Pattern #1708 is still devouring yarn. Starting to become mildly terrifying. https://twitter.com/persipan/status/1153041634885591041?s=21
I exported a huge batch of unfiltered HAT3000 patterns - you can read the draft here: https://docs.google.com/document/d/1FjvuBpAeAEk3q5jQ_Q111G4kZV6GWZKTBYqz5z09v6I/edit?usp=sharing Note that some of the patterns appear to be, um, explicit. HAT3000 is a descendant of GPT-2, which Saw Some Shit on the internet. Prompt suggestions welcome
The larger GPT-2-774 model has been released, but I don't have a way to finetune it on crochet yet. Still, I found that when I prompt it with a title + 2 lines of an existing crochet pattern, it has seen enough crochet on the internet that it already knows what to do. pic.twitter.com/6Thj22BMgu
Lumpy top looks fabulous with red edge! https://twitter.com/paulasimone/status/1169349210036006912?s=21
I summarized the (faintly disquieting) story of HAT3000 here: https://twitter.com/JanelleCShane/status/1169310779289456640?s=20