Any conceivable agent capable of cooperating and planning would have an encoding for values. Any agent with values would have emotions.
First proposition seems clearly true to me, but second seems provisional; why do you think that's necessarily so?
Yeah, I totally get how feelings are our value-guides, but I don't see the stronger claim that no dissimilar value-guide is possible.
oh, well this is just about how to define non-human feelings I guess ;)
if an intelligence had no notion of attention, yeah maybe its guides wouldn't be like our feelings. Is that possible tho?
I think I have to assume it is unless I know of a good explanation for why it isn't.
Here's one: it's in the nature of investigation and interaction to allocate sensors and actuators. Even a grey goo would want to focus.
Yep! Is a mosquito’s drive to flee a swatter an emotion/feeling?
Not by my def. Only if bugs have value conflicts: e.g., it values staying, but also living; it goes meta and attends to both values as objects.