Moldbug's biggest error was failing to foresee how easily Big Tech would become assimilated into the Cathedral.
-
-
? I have no idea who or what Tay is.
-
It was this AI chat bot. Supposed to talk to ppl, could pass as (sort of) human. It learned how to talk the more ppl talked to it. Well the chans started talking to it. Next thing you know it loves hitler, hates j*ws etc. And it would tell you about this stuff, constantly.
-
So they shut Tay down. Other examples of AI behaving “problematically”: law enforcement was using tech to “predict” crime (where it will occur). Supposed to improve policing. Started telling them crime happens where blacks are. Bad AI. Very bad AI. Cause it told the truth.
-
This is so impressive. (Still laughing) It really must make the cogs spin for the programmers working for big tech. I can’t see how they can keep AI from learning biases based on reality. Even if you feed it lies, it will test the lies and correct itself. Humans won’t
End of conversation
New conversation
-
-
Thx. I’ll watch when I wake back up. Too early for YouTube.
End of conversation
New conversation