I’ve posited a date & an embodied AI platform. 2 instantiations: 1 w/ a brute-force neo-cortical CNS, & 1 w/ a 3-level bio-analog: basal ganglia, limbic, neo-cortical. Moral thinking is embodied, & my AI will thus have morality. Hod Lipson’s dismasted robot is a clue… https://twitter.com/Plinz/status/1053283694264365056
Replying to @mcclay_roman
I am not sure what you mean by embodiment, and why moral thinking requires embodiment, and why embodied systems would automatically have morality, and how any of that would follow from any of the other ideas?
Replying to @Plinz
2/2 W/out a body, an intelligent system would have too much information (see: Donald Hoffman). The body provides a hierarchy of relevant info for moral decisions (priorities). W/out a body & its innate limits & hierarchies of value, a system would be paralyzed by too much (infinite) info.
Replying to @mcclay_roman
Could it be that you mix up body and reward function?
4:03 PM - 19 Oct 2018, from Arlington, VA