The idea that there is such a thing as analogue computation is a classic oxymoron: the logic of the digital is computation, and vice versa. (Now let's see how quickly this thread devolves.)
eh? How else do you want to root analog computers in discrete math, then? You could make an argument about branches being discrete, for example, though in my experience I didn't really use any branches at all. You'd totally get emergent bifurcations, though; see the sketch below.
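A minimal sketch of that "bifurcations without branches" point, assuming the classic pitchfork system dx/dt = r*x - x^3 (my choice of example; the thread names no specific system):

```python
# Sketch: continuous dynamics with no control-flow branches still
# "pick a side" once a parameter crosses a threshold: a pitchfork
# bifurcation in dx/dt = r*x - x**3.
import numpy as np

def settle(r, x0=0.1, dt=0.01, steps=20_000):
    """Euler-integrate dx/dt = r*x - x**3 and return the final state."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x**3)
    return x

for r in np.linspace(-1.0, 1.0, 9):
    print(f"r = {r:+.2f} -> settles at x = {settle(r):+.4f}")
# For r <= 0 the state decays toward 0 (slowly at r = 0); for r > 0 it
# settles near +sqrt(r): a discrete-looking outcome emerging from
# smooth dynamics, with no `if` anywhere in the model.
```

The integration loop itself never branches on the state; the qualitative split in outcomes comes entirely from the continuous dynamics.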
-
I just think you think discrete math is like a pure black-and-white movie. It is not. Once you have a thesis about effective computation, you can equip it with fuzzy logics of all sorts (a sketch follows). That's not a problem.
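For what it's worth, the standard Zadeh connectives make the "fuzzy logics of all sorts" point concrete. A minimal sketch; the min/max/1-a operator family is my pick, it's just the most common one:

```python
# Sketch: Zadeh's fuzzy connectives. Truth values live in [0, 1]
# rather than {0, 1}, yet the definitions are perfectly ordinary math.
def f_and(a, b): return min(a, b)   # t-norm (conjunction)
def f_or(a, b):  return max(a, b)   # t-conorm (disjunction)
def f_not(a):    return 1.0 - a     # standard negation

warm, bright = 0.7, 0.4
print(f_and(warm, bright))   # 0.4
print(f_or(warm, bright))    # 0.7
# De Morgan survives the move away from black and white:
assert f_not(f_and(warm, bright)) == f_or(f_not(warm), f_not(bright))
```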
-
I was using "discrete" as a stand-in for "digital" because I can count only integers (and, with certain tricks, rationals; one such trick is sketched below) on my fingers (digits). Maybe yours can count reals, in which case, yeah, cool: go ahead and call your fuzzy logics digital!
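One concrete version of the "certain tricks" for rationals, assuming the Calkin-Wilf enumeration (my pick; the thread doesn't name a trick): it visits every positive rational exactly once, so finger-counting reaches them all in principle.

```python
# Sketch: counting the positive rationals one digit at a time via the
# Calkin-Wilf recurrence x' = 1 / (2*floor(x) - x + 1), which
# enumerates every positive rational exactly once.
from fractions import Fraction
from math import floor

def calkin_wilf(n):
    """Yield the first n terms of the Calkin-Wilf enumeration."""
    x = Fraction(1)
    for _ in range(n):
        yield x
        x = 1 / (2 * floor(x) - x + 1)

print([str(q) for q in calkin_wilf(10)])
# ['1', '1/2', '2', '1/3', '3/2', '2/3', '3', '1/4', '4/3', '3/5']
```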