Which often brings me to the conclusion that simulating emotions in AI might elicit bias.
-
Everything in our mind is a "feeling" of some sort. You are implying that emotions are some special mental class that introduces bias into the "real/rational" classes. Everything elicits bias. Vision is a biasing generated feeling, agency is a biasing generated feeling, etc.
- 6 more replies
New conversation
emotions ARE the perceived reward landscape
-
I don't think that is true. I perceive many of my anticipated rewards without having emotions about them.
- 2 more replies
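An aside to make the disagreement above concrete: a minimal Python sketch, assuming a toy agent in which value estimates and an affect-like signal are distinct quantities. All names here are hypothetical illustrations, not anyone's actual model.

```python
# Toy sketch: an agent can represent anticipated rewards (a value
# estimate per option) without any affect-like signal attached.
# The "emotion" is modeled, hypothetically, as a separate scalar
# derived from the value landscape (here, its spread), rather than
# being the landscape itself.

from statistics import pstdev

def anticipated_rewards(options: dict[str, float]) -> dict[str, float]:
    """Plain value estimates: 'perceiving' rewards with no affect."""
    return dict(options)  # nothing emotional here, just numbers

def affect_signal(values: dict[str, float]) -> float:
    """A hypothetical emotion-like summary: arousal as value spread."""
    return pstdev(values.values()) if len(values) > 1 else 0.0

options = {"stay": 1.0, "explore": 3.0, "risky_bet": -2.0}
print(anticipated_rewards(options))  # the perceived reward landscape
print(affect_signal(options))        # a distinct, derived signal
```

On this framing, the first print is the "perceived reward landscape" and exists whether or not the second, emotion-like quantity is ever computed, which is roughly the shape of the objection in the reply.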
New conversation
It does seem to reduce the cost of making decisions on certain problems, though. So perhaps it increases the capacity for agency overall?
-
I observe that once people learn the functional structure of the thing that was emotionally regulated, they stop having emotions about it (because the emotions would introduce unjustified bias and make things worse). Using a functional model is not intrinsically more expensive, I think.
End of conversation
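A toy sketch of the claim above, assuming hypothetical scoring functions: an "emotional" regulator applies a blanket bias, while a functional model prices the same risk explicitly, and both cost a single arithmetic pass per option, so neither is intrinsically more expensive.

```python
# Toy sketch (all functions and numbers hypothetical): the emotional
# regulator penalizes anything flagged as scary by a fixed amount,
# while the functional model weighs an explicit failure probability.

def emotional_choice(options, fear_bias=5.0):
    """Blanket penalty on anything flagged as scary."""
    return max(options, key=lambda o: o["value"] - fear_bias * o["scary"])

def functional_choice(options, failure_cost=4.0):
    """Explicit model: penalize by estimated failure probability."""
    return max(options, key=lambda o: o["value"] - failure_cost * o["p_fail"])

options = [
    {"name": "cross_bridge", "value": 10.0, "scary": 1, "p_fail": 0.05},
    {"name": "long_detour",  "value": 6.0,  "scary": 0, "p_fail": 0.01},
]
print(emotional_choice(options)["name"])   # fear picks the detour
print(functional_choice(options)["name"])  # the model picks the bridge
```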
New conversation
This Tweet is unavailable.
-
I think they might automate some portion of agency, and only when those routines are not well matched to reality would they detract from agency.
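A minimal sketch of that idea, under the assumption that a routine is just a cached situation-to-action mapping (everything here is hypothetical):

```python
# Toy sketch of "automated agency": a cached routine answers
# instantly while it matches reality, and only starts detracting
# once the world drifts away from the cache.

world = {"door": "push"}            # current reality
routine_cache = {"door": "push"}    # habit learned earlier

def deliberate(situation: str) -> str:
    return world[situation]          # slow path: consult reality

def act(situation: str) -> str:
    # Automated portion of agency: reuse the cached routine if present.
    if situation in routine_cache:
        return routine_cache[situation]
    return deliberate(situation)

print(act("door"))                   # "push": cache matches reality
world["door"] = "pull"               # reality changes, cache does not
print(act("door"))                   # "push": stale routine misfires
```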