Conversation

Replying to
Yes, that's fair, and it seems like it, but we can prove it easily: it can explain existing black-box (bb) models and show that the explainable output is the only possible solution.
Replying to
Hmm. By definition, if you can produce a simple narrative about how an AI made a decision, to the degree of rigor you claim, then there's no need for the AI as such. Might be a good thing, but count me skeptical.
Replying to and
That is a very astute observation. The tech itself is an advanced algorithm that produces perfectly explainable models. By accident, we discovered that it can also perfectly explain any existing feed-forward model. One of the first needs was to "illuminate" existing black-box (bb) models.
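For context, and not as the poster's proprietary method: a minimal sketch of one generic way to "illuminate" an existing feed-forward black-box model is surrogate extraction, where a small, human-readable model is fit to the black box's own predictions. The dataset, model sizes, and scikit-learn estimators below are illustrative assumptions only.

```python
# Sketch: explain a feed-forward "black box" by fitting an interpretable surrogate
# to its predictions. Not the tech discussed above; a generic illustration.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

# Stand-in black-box feed-forward model (hypothetical data and architecture)
X, y = make_classification(n_samples=2000, n_features=6, random_state=0)
black_box = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
black_box.fit(X, y)

# Query the black box, then train a shallow decision tree on its outputs
bb_predictions = black_box.predict(X)
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, bb_predictions)

# Fidelity: how closely the readable surrogate mimics the black box,
# followed by the human-readable rules themselves
print("fidelity:", surrogate.score(X, bb_predictions))
print(export_text(surrogate, feature_names=[f"x{i}" for i in range(6)]))
```

A shallow tree like this usually only approximates the original network; the claim in the thread (a perfect, provably unique explanation) would require something stronger than this kind of surrogate.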