Just struck me that an important AI problem, formal logic reasoning, is actually defined at the wrong level. The problem is not to get a computer to do logic, but to get it to conclude logic is in fact a thing to do, and invent/discover the *idea* of logic via ML
Replying to
Logic is easy. Discovering that logic is a thing you can do, uncovering its rules, and deciding when to use them, is the hard part.
There is like zero line of sight to how to do this in current leading-edge ML research.
Or to generalize, the problem isn't to synthesize GOFAI and deep learning. The problem is to get deep learning to discover/invent GOFAI. While embodied as a robot.
The best approach I can think of would be to make "rules" a learning domain somehow. You need to dataify rules. Like maybe learn off a huge corpus of rules of varying rigor for a domain, from superstitions to heuristics to rules with formal properties
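One way to picture "dataifying" rules: tag each rule with where it sits on the rigor spectrum, so rigor itself becomes a learnable label. A minimal sketch of what such a corpus entry might look like (the field names, example rules, and four-level scale are all my own illustrative assumptions, not an established dataset):

```python
from dataclasses import dataclass

# Hypothetical ordinal scale from least to most rigorous.
RIGOR_LEVELS = ["superstition", "heuristic", "empirical_law", "formal_rule"]

@dataclass
class Rule:
    domain: str     # e.g. "weather", "logic"
    statement: str  # the rule in natural language
    rigor: str      # one of RIGOR_LEVELS

# Toy corpus spanning the spectrum from superstition to formal rule.
CORPUS = [
    Rule("weather", "Red sky at night, sailor's delight", "superstition"),
    Rule("weather", "If the barometer falls fast, expect a storm", "heuristic"),
    Rule("physics", "Pressure drops precede frontal storms", "empirical_law"),
    Rule("logic",   "From P and P->Q, infer Q (modus ponens)", "formal_rule"),
]

def rigor_score(rule: Rule) -> int:
    """Map a rule's rigor label to an ordinal score a learner could target."""
    return RIGOR_LEVELS.index(rule.rigor)

# Walk the corpus from least to most rigorous.
for r in sorted(CORPUS, key=rigor_score):
    print(rigor_score(r), r.statement)
```

The point of the sketch is just that "how rigorous is this rule?" becomes an ordinary supervised signal, rather than something handled by a separate formal-logic module.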
I think the mistake made in the past is to distinguish strongly between informal and rigorous reasoning, elevate the latter, and solve it in isolation. It's a spectrum and necessarily so. Watertight formal logic doesn't work in a useful way when divorced from superstitions.
This sort of thing seems like it's the right idea twitter.com/alexsteer/stat
Replying to
I’ve actually been working on something like this for probabilistic reasoning, not logic
Replying to
It’s pretty interesting to go back in human history and see when humans started becoming more analytical / logical
It all happened when coffee replaced alcohol as the daily drink (water would get you sick, but fermenting/heating killed bacteria) — the scientific method was born



