Image descriptions on @twitter are great!
But I've been thinking about how an auto-generated description -- one you must edit before accepting -- could triangulate on good, useful descriptions, yielding an awesome training dataset for machines (and a guide for people).
-
cc: @KLdivergence and @jeffbigham to figure out why and how I'm wrong.
-
Have you seen the ImageNet Roulette thing going around? I believe most of the images in its training set were hand-labeled by humans. Short answer: because humans are terrible and like to label images with cruel stereotypes.
-
Yeah, I can see that. Narrowing the training set on Twitter would certainly be difficult (e.g., the Tay bot). But if the set of <(tweet, image), description> examples came from a11y Twitter, I'd love to see something like @HemingwayApp for descriptions, because I'm bad at them.
-
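A sketch of what one of those training examples might look like (the class and field names here are hypothetical, just to make the <(tweet, image), description> shape concrete):

```python
from dataclasses import dataclass

# Hypothetical shape for one <(tweet, image), description> training example,
# sourced from a tweet whose author supplied their own alt text.
@dataclass
class AltTextExample:
    tweet_text: str   # the tweet the image was attached to (model input)
    image_url: str    # pointer to the image itself (model input)
    description: str  # the human-written alt text (the training target)

example = AltTextExample(
    tweet_text="My dog discovered the sprinkler today.",
    image_url="https://example.com/media/abc123.jpg",
    description="A golden retriever mid-leap over a lawn sprinkler.",
)

print(example.description)
```

The (tweet, image) pair would be the model's input and the human-written description the label; restricting the pairs to a11y Twitter is the filtering step the tweet above proposes.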
Also, I doubt it's possible at Twitter scale, unless it's a model instantiated on mobile/desktop. I don't think such a model could possibly run at auto-crop net speeds (https://blog.twitter.com/engineering/en_us/topics/infrastructure/2018/Smart-Auto-Cropping-of-Images.html).
-
In any case, thanks!