I'm not sure, but I suspect Markov chains can simulate any computable process. So Markov chain speech generators are just primitive AGIs.
@MakerOfDecision They cannot simulate a deterministic process defined as follows: output the opposite of what a Markov-chain learner predicts.
@FrameOfStack I'm not sure that's correct. Do you mean a Turing machine that simulates a Markov-chain program and does the opposite?
@MakerOfDecision I mean negate a Markov-chain LEARNER, for instance PPM. The result diagonalizes against any Markov model.
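The diagonalization idea above can be sketched in a few lines. This is a minimal illustration, not PPM itself: a toy order-1 Markov learner over bits, and a deterministic process that always emits the opposite of the learner's prediction, so the learner's accuracy on the resulting stream is exactly zero. The names `MarkovLearner` and `adversary` are mine, for illustration only.

```python
from collections import defaultdict

class MarkovLearner:
    """Toy order-1 Markov predictor over bits: predict the bit seen
    most often immediately after the current context bit."""
    def __init__(self):
        self.counts = defaultdict(lambda: [0, 0])  # context bit -> [count of 0s, count of 1s]
        self.context = None

    def predict(self):
        if self.context is None:
            return 0  # deterministic default before any data
        c0, c1 = self.counts[self.context]
        return 1 if c1 > c0 else 0

    def observe(self, bit):
        if self.context is not None:
            self.counts[self.context][bit] += 1
        self.context = bit

def adversary(n):
    """Deterministic process: at each step, output the opposite of what
    the learner predicts. By construction the learner is always wrong."""
    learner = MarkovLearner()
    out = []
    for _ in range(n):
        bit = 1 - learner.predict()  # negate the prediction
        out.append(bit)
        learner.observe(bit)
    return out

seq = adversary(20)
```

Because both the learner and the adversary are deterministic, a fresh copy of the same learner fed `seq` makes the same predictions and gets every single bit wrong.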
@FrameOfStack Thanks! A diagonalization proof seems sufficient to change my mind ;) I'd like to see it - link?
@MakerOfDecision The real answer is automata theory: Markov chains lack a stack (so they can't recognize context-free languages) or a tape (so they can't simulate an arbitrary Turing machine).
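The automata-theory point can be made concrete with the classic context-free language of balanced parentheses. Recognizing it requires an unbounded counter (the simplest use of a stack), which no finite-state model, and hence no finite Markov chain, can provide. This is a sketch of my own, not from the thread:

```python
def balanced(s):
    """Recognize balanced parentheses with an unbounded counter.
    This single integer plays the role of a stack of '(' symbols."""
    depth = 0
    for ch in s:
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
            if depth < 0:       # a ')' with no matching '('
                return False
    return depth == 0           # every '(' was closed

# Why no finite-state (Markov) model suffices: a machine with k states
# cannot distinguish nesting depths 0..k -- by pigeonhole two depths
# share a state, so it must misclassify some string.
```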
@FrameOfStack Lesson learned: I've only taken one class in finite automata, state machines, and complexity. I should take more.