Presumably, a Shintoist knows a property that distinguishes living animals from dead ones, so not all things are alive in that sense. Shintoists may instead be referring to a different property, one shared by all things in their ontology, for example state evolution.
-
-
There is another kind of attention that relates to tracking identities during task execution, for instance in the transformer architecture in AI/ML: https://towardsdatascience.com/transformers-141e32e69591 … It may not scale towards consciousness, because it is not reflexive, and representations are not unified.
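For readers unfamiliar with the mechanism the linked article covers, here is a minimal NumPy sketch of scaled dot-product attention, the core operation in the transformer. The function name, toy shapes, and variable names (Q, K, V) are illustrative assumptions, not taken from this thread.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight each value vector by how strongly its key matches each query."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of every query to every key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V  # blend values according to attention weights

# Toy usage: 3 "tokens", each with a 4-dimensional representation.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)
```

Each output row is a mixture of the value vectors, which is the sense in which this kind of attention "tracks" which items are relevant to which during a task.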
-
-
-
Wait. Are you saying human attention requires (or presupposes) memory? My guess is not, except for selective attention.
-
-
-
Looks cool, however I'm not a computationalist or much of a neural net guy (beyond general principles). Where in the computational sequence would you put "attention" or its equivalent? Would you say "attention" is the sum of a series of steps? Other?
-
In Zen (for example) there are conventionally two minds: Neurobio (Small) mind and Big Mind. The reality of small mind has no necessary relationship to the reality of Big Mind. The instance of consciousness in the human brain creates Human Reality.