Things I wish I had learned at university:
- measuring the efficiency of an algorithm in watts
- difference between classical math and computation
- relationship between geometry and number theory
Replying to
Are there systematic ways to do #1 formally, based on Landauer's principle or something?
Replying to
In practice it often depends on things like implementing your multiplication with a lookup table. I think the Landauer limit would matter if we did best-effort computing and could harness the actual amount of determinism we can get from the substrate. We are far from that.
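For scale, Landauer's principle puts a floor of kT ln 2 joules on erasing a single bit. A quick back-of-the-envelope comparison (the CMOS figure below is my own illustrative assumption, not a measurement):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k*T*ln(2) joules.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

landauer_j_per_bit = k_B * T * math.log(2)
print(f"Landauer limit at 300 K: {landauer_j_per_bit:.3e} J/bit")  # ~2.9e-21 J

# Rough assumed figure for a current CMOS switching event, for comparison only.
cmos_j_per_bit = 1e-15
print(f"Headroom above the Landauer floor: ~{cmos_j_per_bit / landauer_j_per_bit:.0e}x")
```

The point of the comparison is the gap: conventional hardware sits several orders of magnitude above the theoretical floor, which is why the limit doesn't bite in practice yet.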
Replying to
I recall seeing some literature on a concept called “reversible computing” that seemed to be about this too. I guess that’s the synthesis side of power efficiency analysis.
Replying to
Reversible computing builds on the realization that entropy is generated when bits are deleted. By building gates that only permute bits, you may be more efficient (though you'll still have to flush out the superfluous bits at some point). Best-effort computing is largely unrelated.
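A toy way to see the "permute, don't delete" idea (my own sketch, not from the thread): the Toffoli gate maps 3-bit states one-to-one, so no information is destroyed and the gate is its own inverse, whereas an ordinary AND collapses distinct inputs onto the same output:

```python
from itertools import product

def toffoli(a, b, c):
    # Controlled-controlled-NOT: flip c only when both controls are set.
    return a, b, c ^ (a & b)

states = list(product((0, 1), repeat=3))

# Bijective: every 3-bit state maps to a distinct 3-bit state...
assert sorted(toffoli(*s) for s in states) == states
# ...and applying it twice is the identity, so the gate is reversible.
assert all(toffoli(*toffoli(*s)) == s for s in states)

# AND is not reversible: 4 input pairs collapse onto 2 outputs, and the
# erased distinctions are what must eventually be paid for as entropy.
print(len({a & b for a, b in product((0, 1), repeat=2)}))  # prints 2
```

The "superfluous bits" mentioned above show up here too: the Toffoli gate carries its control bits through unchanged, and those garbage outputs still have to be uncomputed or flushed eventually.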
You may really enjoy this short introduction to the ideas of Best-Effort Computing by Dave Ackley.
Replying to
Okay, I got lost around the quarks point, and this is generally above my CS pay grade, but it does look fascinating. So the stochastic cellular-automaton simulation is a model of something like a non-robust memory with probabilistically guaranteed state persistence under failure?
Replying to
Yes, it is an attempt at building a software paradigm that does not rely on guaranteed determinism of the substrate, but instead adaptively recruits as much determinism as the task needs. This would allow us to reduce energy consumption by many orders of magnitude.
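A minimal sketch of that "recruit determinism as needed" idea (my own toy model with an assumed flip probability, not Ackley's actual simulation): store one logical bit across many unreliable cells, and periodically repair the ensemble toward its majority value.

```python
import random

FLIP_P = 0.05  # assumed per-cell chance of a spontaneous bit flip per step

def step(cells):
    # The substrate is unreliable: any cell may flip on any step.
    return [c ^ (random.random() < FLIP_P) for c in cells]

def repair(cells):
    # Best-effort redundancy: pull every copy back to the majority value.
    majority = int(sum(cells) > len(cells) / 2)
    return [majority] * len(cells)

random.seed(0)
cells = [1] * 25           # one logical bit stored as 25 physical copies
for _ in range(1000):      # 1000 noisy steps, repairing after each
    cells = repair(step(cells))

majority = int(sum(cells) > len(cells) / 2)
print("logical bit survived:", majority == 1)
```

With 25 copies, a majority of them would all have to flip within a single step to lose the bit, which at a 5% flip rate is astronomically unlikely; fewer copies would mean less recruited determinism and a shorter expected lifetime for the stored bit.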
Replying to
The part I really didn't get is the hardware. Was it specially crappy/unreliable hardware at the physical-design level?
Replying to
Our current computers run on hardware that stacks the probabilities in our favor by over-engineering transistors and adding error-correcting codes until we lose practically no bits over gazillions of operations. A synapse, by contrast, will lose a bit in every second operation.
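That probability-stacking can be shown with the simplest error-correcting code there is, a repetition code with majority vote (the raw error rate below is an illustrative assumption, not a real transistor figure):

```python
from math import comb

def majority_error(p, n):
    # Probability that more than half of n independent copies flip,
    # i.e. that a majority vote decodes the stored bit wrongly.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 1e-4  # assumed raw per-copy bit-flip probability
for n in (1, 3, 5, 7):
    print(f"{n} copies: decode-error probability {majority_error(p, n):.1e}")

# At a synapse-like flip rate of 0.5, redundancy buys you nothing:
print(f"p=0.5 with 7 copies: {majority_error(0.5, 7):.2f}")  # prints 0.50
```

Each extra pair of copies multiplies the error exponent, which is the "stacking" at work; note that it only works when the raw error rate is already below 1/2, which is exactly where a coin-flip synapse sits.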
Ackley wants to open the door to designing software that is robust enough to run on extremely crappy hardware, like a brain. At the moment, he simulates the indeterminism on deterministic hardware, of course.

