Things I wish I had learned at university:
- measuring the efficiency of an algorithm in watts
- difference between classical math and computation
- relationship between geometry and number theory
Replying to @Plinz
Are there systematic ways to do 1 formally, based on the Landauer Principle or something?
Replying to @vgr
In practice it often depends on things like implementing your multiplication with a lookup table. I think the Landauer limit would matter if we did best effort computing and could harness the actual amount of determinism we can get from the substrate. We are far from that.
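A minimal sketch of the bound mentioned here: the Landauer limit k_B · T · ln 2 is the theoretical minimum energy dissipated per erased bit. The `assumed_energy_per_op` figure below is a hypothetical placeholder for a real gate's energy budget, not a measured value.

```python
# Minimal sketch: the Landauer bound k_B * T * ln(2) is the theoretical
# minimum energy dissipated when one bit of information is erased.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(temperature_kelvin: float = 300.0) -> float:
    """Minimum energy (in joules) to erase one bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

if __name__ == "__main__":
    bound = landauer_bound(300.0)     # ~2.9e-21 J per erased bit
    assumed_energy_per_op = 1e-15     # hypothetical per-operation energy, for comparison only
    print(f"Landauer bound at 300 K: {bound:.3e} J/bit")
    print(f"Assumed energy per operation: {assumed_energy_per_op:.0e} J "
          f"(~{assumed_energy_per_op / bound:.1e}x the bound)")
```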
Replying to @Plinz
I recall seeing some literature on a concept called “reversible computing” that seemed to be about this too. I guess that’s the synthesis side of power efficiency analysis.
Replying to @vgr
Reversible computing builds on the realization that entropy results from deleting bits. By building gates that permute bits only, you may be more efficient (but you'll still have to flush out the superfluous bits at some point). Best effort computing is largely unrelated.
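To make the "gates that permute bits only" point concrete, here is a small sketch using the Toffoli (controlled-controlled-NOT) gate: it is a bijection on 3-bit states and its own inverse, so no information is deleted and the Landauer cost is deferred until bits are actually flushed.

```python
# Sketch: the Toffoli (CCNOT) gate permutes 3-bit states rather than deleting bits,
# so it is invertible and loses no information.
from itertools import product

def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Flip c iff both controls a and b are 1; a and b pass through unchanged."""
    return a, b, c ^ (a & b)

states = list(product([0, 1], repeat=3))
images = [toffoli(*s) for s in states]

# A permutation: every input maps to a distinct output...
assert sorted(images) == sorted(states)
# ...and applying the gate twice recovers the original state (self-inverse).
assert all(toffoli(*toffoli(*s)) == s for s in states)
print("Toffoli is a bijection on 3-bit states: no bits are erased.")
```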
You may really enjoy this short introduction to the ideas of Best Effort Computing by Dave Ackley: https://www.youtube.com/watch?v=I4flQ8XdvJM
Replying to @Plinz
Okay, I got lost around the quarks point, and this is generally above my CS paygrade, but it does look fascinating. So the stochastic cellular automaton simulation is a model of something like a non-robust memory with probabilistically guaranteed state persistence under failure?
Replying to @vgr
Yes, it is an attempt at building a software paradigm that does not rely on guaranteed determinism of the substrate, but on adaptively recruiting as much determinism as needed for the task. This would allow us to reduce energy consumption by many orders of magnitude.
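An illustrative sketch of "recruiting as much determinism as needed" (my own toy example, not Ackley's actual framework): rerun an unreliable operation on a noisy substrate and add redundancy only until the answer is decisive enough for the task. The function names here are hypothetical.

```python
# Illustrative sketch (not Ackley's framework): add redundancy only until
# a majority vote over an unreliable operation becomes decisive.
import random
from collections import Counter

def unreliable_add(a: int, b: int, flip_prob: float = 0.2) -> int:
    """Correct sum most of the time; occasionally corrupted by the 'substrate'."""
    result = a + b
    if random.random() < flip_prob:
        result ^= 1 << random.randrange(8)  # flip a random low bit
    return result

def best_effort_add(a: int, b: int, min_votes: int = 3, max_votes: int = 31) -> int:
    """Recruit more repetitions only while no value has a clear majority."""
    votes = Counter()
    for n in range(1, max_votes + 1):
        votes[unreliable_add(a, b)] += 1
        value, count = votes.most_common(1)[0]
        if n >= min_votes and count > n / 2:
            return value
    return votes.most_common(1)[0][0]

print(best_effort_add(40, 2))  # almost always 42, using only as many retries as needed
```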
Replying to @Plinz
The part I really didn’t get is the hardware. Was it special crappy/unreliable hardware at the physical design level?
Our current computers run on hardware that stacks the probabilities via over-engineered transistors and error-correcting codes until we lose practically no bits over gazillions of operations. A synapse will lose a bit in every second operation.
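A back-of-envelope sketch of how "stacking the probabilities" works with a simple repetition code and majority voting: small per-bit error rates can be driven down rapidly by adding copies, while at a synapse-like error rate of one bit in two, repetition alone buys nothing.

```python
# Back-of-envelope: a k-copy repetition code with majority voting fails only
# if more than half of the copies are corrupted.
from math import comb

def majority_failure(p: float, k: int) -> float:
    """Probability that a majority of k independent copies are wrong."""
    return sum(comb(k, i) * p**i * (1 - p)**(k - i) for i in range((k // 2) + 1, k + 1))

for p in (0.5, 0.01):          # 0.5 ~ the "synapse" regime, 0.01 a mildly flaky bit
    for k in (1, 3, 9, 27):
        print(f"p={p:4}  k={k:2}  effective error ~ {majority_failure(p, k):.2e}")
# At p=0.01 the failure rate plummets with k; at p=0.5 it stays at 0.5,
# which is one way to see why synapse-grade hardware needs a different
# software strategy than brute redundancy.
```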
Ackley wants to open the door to designing software that is robust enough to run on extremely crappy hardware, like a brain. At the moment, he simulates the indeterminism on deterministic hardware, of course.
Also check out Dave's T2 project, where he is actively creating a tile-based computing system based on many of his prior ideas: https://www.youtube.com/watch?v=pO2CA1RilLA
End of conversation