Image descriptions on @twitter are great!
But I've been thinking about how an auto-generated description -- that you must edit for acceptance -- could triangulate on good and useful descriptions and an awesome training dataset for machines (and guide for people).
Yea, I can see that. Narrowing the training set on Twitter would certainly be difficult (e.g., the Tay bot). But if the set of <(tweet, image), description> examples came from a11y twitter, that could work. I'd love to see something like @HemingwayApp for descriptions, because I'm bad at them.
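The edit-to-accept flow above could be sketched as a tiny human-in-the-loop collector. This is purely illustrative; every name here (DescriptionCollector, draft_description, etc.) is hypothetical, not a real Twitter API:

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical sketch: the model drafts a description, the user must
# review/edit it before the image posts, and each accepted
# <(tweet, image), description> pair is logged as a training example.

@dataclass
class TrainingExample:
    tweet_text: str
    image_id: str
    description: str  # the human-accepted (possibly edited) description


@dataclass
class DescriptionCollector:
    # model stub: (tweet_text, image_id) -> draft description
    draft_description: Callable[[str, str], str]
    dataset: List[TrainingExample] = field(default_factory=list)

    def post_image(self, tweet_text: str, image_id: str,
                   user_edit: Callable[[str], str]) -> str:
        """Draft a description, force a human pass over it, log the result."""
        draft = self.draft_description(tweet_text, image_id)
        accepted = user_edit(draft)  # acceptance requires the user's edit/confirm step
        self.dataset.append(TrainingExample(tweet_text, image_id, accepted))
        return accepted


# Example: model drafts a generic caption; the user sharpens it before posting.
collector = DescriptionCollector(draft_description=lambda t, i: "a photo")
accepted = collector.post_image("Sunset tonight!", "img_001",
                                user_edit=lambda d: d + " of a sunset over water")
```

Each accepted pair then doubles as supervision for the model and as worked examples for people writing their own descriptions.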
-
-
Also, I doubt it's possible at Twitter scale, unless it's a model instantiated on-device (mobile/desktop). I don't think such a model could possibly run at auto-crop net speeds (https://blog.twitter.com/engineering/en_us/topics/infrastructure/2018/Smart-Auto-Cropping-of-Images.html).
-
In any case, thanks!
End of conversation
New conversation -
-
-
I generally find the absolute certainty with which ML folks refer to Tay's implications troubling… to me, all Tay was was an example of why experimental design and HCI matter, given that Tay was just kind of released w/out much (or any) thought
-
e.g., we had a Facebook bot that asked FB users to answer questions for blind ppl; ppl often wrote kind of offensive things… but when we instead asked for volunteers willing to have questions auto-asked from their account on behalf of another person, the answers were great! https://www.cs.cmu.edu/~jbigham/pubs/pdfs/2015/socialmicrovolunteering.pdf
End of conversation