The idea that there is such a thing as analogue computation is a classic oxymoron. The logic of the digital is computation and vice versa. (Now let's see how quickly this thread devolves.)
Replying to @NegarestaniReza
Replying to @Josh86480104
But the term 'computation' in such cases is used colloquially, in an extremely loose sense.
Replying to @NegarestaniReza
It's fair to say that it's a sort of retroactive projection, but haven't you then implicitly defined computation in terms of digitality? For most of its history, the terms 'computation' and 'computer' primarily referred to analogue processes. So it's a question of picking your definition.
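What those historical analogue processes looked like can be made concrete. The sketch below is a loose illustration, not anything from the thread itself: it digitally emulates the kind of task a differential analyser performed, namely integrating an ODE (here dy/dt = -y). The function name, step size, and choice of equation are all illustrative assumptions.

    # A minimal sketch of what "computation" meant for an analogue device
    # such as a differential analyser: the machine tracked the solution of
    # dy/dt = -y continuously via mechanical integrators. A digital emulation
    # can only approximate that continuous process with discrete steps.

    def emulate_analogue_integrator(y0: float, t_end: float, dt: float = 1e-3) -> float:
        """Forward-Euler approximation of the continuous integration of dy/dt = -y."""
        y, t = y0, 0.0
        while t < t_end:
            y += -y * dt  # each discrete step stands in for continuous integration
            t += dt
        return y

    if __name__ == "__main__":
        import math
        approx = emulate_analogue_integrator(1.0, 1.0)
        exact = math.exp(-1.0)  # the trajectory the analogue machine would track directly
        print(f"digital approximation: {approx:.6f}, continuous solution: {exact:.6f}")

The gap between the two printed values is the point: the digital emulation approximates, step by step, what the analogue device enacted as a continuous physical process.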
Replying to @Josh86480104
It's like saying that the early theory of osmotic exchange was already there. Yes, but it was too broad to have any scientific significance. Only when it was formulated under theories of molar concentration, osmotic pressure, etc. did it become an actual scientific topic ...
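For reference, the operationalisation gestured at here is, plausibly, the van 't Hoff relation for dilute solutions, which tied osmotic exchange to measurable quantities (the formula and symbols are the textbook ones, not taken from the thread):

    \Pi = c R T

where \Pi is the osmotic pressure, c the molar concentration of solute, R the gas constant, and T the absolute temperature. Only once 'osmotic exchange' was pinned to variables like these could it support quantitative prediction.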
Replying to @NegarestaniReza
And one can say the same with, e.g., atoms. But when you operationalise a concept like that, there is a sleight of hand going on: a sense in which you trim the edges off the original meaning, so the two are not entirely commensurable. The Church-Turing thesis is somewhat circular.
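The circularity charge can be unpacked a little. A standard formulation of the Church-Turing thesis (the textbook statement, not quoted from the thread) is:

    f \text{ is effectively calculable} \iff f \text{ is Turing-computable}

Since 'effectively calculable' is an informal, pre-theoretic notion with no independent formal definition, the biconditional cannot be proved, only adopted as a definition, which is the sense in which the identification can feel circular.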
Replying to @Josh86480104
No, what is the sleight of hand in this? The very definition of the concept is about trimming and calibrating the inferential nodes. Do you want a mallet to hammer a needle into the furniture of the world, or a small rubber hammer?
You seem to be confusing common-sense concepts with scientific ones.