Moldbug's biggest error was failing to foresee how easily Big Tech would become assimilated into the Cathedral.
-
-
So they shut Tay down. Other examples of AI behaving “problematically”: law enforcement was using tech to “predict” crime (where it will occur). Supposed to improve policing. Started telling them crime happens where blacks are. Bad AI. Very bad AI. Cause it told the truth.
-
This is so impressive. (Still laughing) It really must make the cogs spin for the programmers working for big tech. I can’t see how they can keep AI from learning biases based on reality. Even if you feed it lies, it will test the lies and correct itself. Humans won’t
End of conversation