The idea that there is such a thing as analogue computation is a classic oxymoron. The logic of the digital is computation and vice versa. (Now let's see how quickly this thread devolves.)
Eh? How else do you want to root analog computers in discrete math, then? You could make an argument about branches being discrete, for example, though in my experience I didn't really use any branches at all. You'd totally get emerging bifurcations, though.
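The "emerging bifurcations" point can be made concrete with a minimal sketch (the system and parameters here are illustrative, not anything from the thread): Euler-integrating the pitchfork system dx/dt = r·x − x³ shows how a continuous system's qualitative behavior jumps as a parameter crosses a threshold, with no explicit branch in the program.

```python
def settle(r, x0=0.1, dt=0.01, steps=20000):
    """Euler-integrate dx/dt = r*x - x**3 and return the final state."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x**3)
    return x

# Below r = 0 the only stable equilibrium is x = 0; above it, two
# stable branches at +/- sqrt(r) emerge -- a pitchfork bifurcation.
print(round(settle(-1.0), 3))  # -> 0.0
print(round(settle(1.0), 3))   # -> 1.0
```

Note there is no `if` anywhere: the "branching" lives in the dynamics, not in the control flow, which is the sense in which an analog computer can bifurcate without discrete branches.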
I just think you think discrete math is like a pure black-and-white movie. It is not. Once you have a thesis about effective computation, you can equip it with fuzzy logics of all sorts. That's not a problem.