Obviously I’m not a Strong AI guy, and am pretty much in the David Chalmers camp on the hard problem.
I’m not saying this quite right. An intelligence exists within a thermodynamic boundary that separates it from the environment but does not *isolate* it. The nature of the intelligence is entangled with the specific environment, and the boundary actually embodies much of it.
Put philosophically: intelligence is a lens that focuses information in the environment. Contemporary statistical inference AIs live in the informational equivalent of an industrial farming monoculture. Yes, they have access to infinite corn, but not to the variety of a real environment.
Didn’t you have a framework for how it’s more important to think about the right things than to think more effectively (in my sloppy half-remembered summary version)?
The boundary intelligence thread linked in the next thread is sort of along those lines
This?
1/ I'd like to make up a theory of intelligence based on a 2-element ontology: boundary and interior intelligence
Yes exactly! This is one of the big things AI folk miss about consciousness. They treat it as an emergence, the crowning achievement of a sufficiently advanced system. But neurologically it’s really just a hypertrophied boundary filter.
I think those of us who start out in the more situated end of thinking about AI tend to get to this view more usefully. It’s generally an East-coast way of thinking, far from network effects and things going whoosh driven by Moore’s law scaling parameters
Totally. One of the few places in AI that works on this is Interactive Machine Learning, which started in robotics. Active Learning (trying to identify samples that maximize marginal learning) comes from trying to avoid boring a human interaction partner: burrsettles.com/pub/settles.ac
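A minimal sketch of what that query rule can look like in practice: pool-based uncertainty sampling, one standard Active Learning strategy in the Settles survey tradition. The toy data, the logistic-regression learner, and the loop structure here are illustrative assumptions, not anything specified in the thread.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical toy setup: a small labeled seed set and a large unlabeled pool.
rng = np.random.default_rng(0)
X_pool = rng.normal(size=(500, 2))
y_pool = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)

labeled_idx = list(rng.choice(len(X_pool), size=10, replace=False))
unlabeled_idx = [i for i in range(len(X_pool)) if i not in labeled_idx]

model = LogisticRegression()

for _ in range(20):
    # Fit on whatever the human partner has labeled so far.
    model.fit(X_pool[labeled_idx], y_pool[labeled_idx])

    # Uncertainty sampling: ask about the pool point the model is least sure of,
    # i.e. the example expected to yield the most marginal learning per question.
    probs = model.predict_proba(X_pool[unlabeled_idx])
    uncertainty = 1.0 - probs.max(axis=1)
    query = unlabeled_idx[int(np.argmax(uncertainty))]

    # In a real interactive loop a human would supply this label;
    # here we just read it from y_pool.
    labeled_idx.append(query)
    unlabeled_idx.remove(query)

print("labels used:", len(labeled_idx))
print("accuracy on full pool:", model.score(X_pool, y_pool))
```

The query rule is doing the same thing as the human-boredom framing above: spend each question on the example the current model can learn the most from, rather than asking about things it already knows.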
Another frame: brains are evolution’s way of letting individuals keep adapting a little bit after birth. But most adaptation (aka learning) is still encoded in genes. ML is like trying to go straight from organic chemistry to brains without DNA and millions of years of selection
That’s a good point. The equivalent would be the evolutionary history of semiconductor hardware, which they usually kinda ignore on the specifics. To me that’s one of the most fascinating threads.