The one thing I'd agree with you on is that the Wikipedia model is not in fact generalizable at all. It remains the n=1 proof point of too many arguments.
[Tweet deleted by author]
I think the future is knowledge being embedded in context-aware information toolchains. TVTropes points to the future better than Wikipedia does, though it captures a not-quite-functional kind of knowledge.
[Tweets deleted by author]
I'm all for that kind of polycentric-narrative, post-canonicity, local-ground-truth world. I'd rather be an eager, great citizen of TVTropes than a reluctant, coerced one of the local town hall overrun by NIMBYs.
[Tweets deleted by author]
I think that is not an ideal world. I think what you call lolcow-watching is actually the beginning of a default, serious new cognitive mode. It's only bunnyholing if there's a main canonical path to get distracted from. Ironic bunnyholing = recognizing that your canon is local.
