Tyndmyr wrote: While you are absolutely correct on our lack of present understanding, it can obviously be programmed in.
Not so fast. It has yet to be demonstrated that humans are Turing complete.
How could we not be? Granted, we have limited memory, so that's a handicap in an absolute sense, but designed computers have the same limitation, so for the practical purposes of this question, who cares? Sure, fully simulating a human would require a certain amount of storage space (and it's not trivial), but it's not impossible.
I do not doubt that we will, one day, be able to create machines that react in unexpected ways, and have their own desires that are complex enough to be worthy of the shortcut word "emotion". What I think will never happen is that we will have anything like an "emotion chip" (a la Star Trek), because emotions are not like that.
Meh. It's easier to make a chatbot behave as a person does when angry and insulting than otherwise, because angry speech is usually less complex. There's nothing magic about emotions.
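To make that concrete, here's a hypothetical toy sketch (names and templates are mine, not any real chatbot): faking anger takes a handful of canned templates and zero model of emotion, because anger excuses incoherence in a way that calm, on-topic conversation doesn't.

```python
import random

# Toy "angry chatbot": canned insults plus a crude keyword echo.
# The point: convincing anger needs far less machinery than a reasoned reply.
ANGRY_TEMPLATES = [
    "Oh, {topic}? You clearly know NOTHING about {topic}.",
    "I can't believe you'd even bring up {topic}.",
    "{topic}?! Are you serious right now?",
]

def angry_reply(message: str) -> str:
    # Grab the last word of the message as the "topic" -- crude, but
    # angry speech doesn't have to make much sense to be believable.
    words = message.rstrip("?!.").split()
    topic = words[-1] if words else "that"
    return random.choice(ANGRY_TEMPLATES).format(topic=topic)

print(angry_reply("What do you think about emotions?"))
```

Every template just parrots the topic back with hostility; there is no understanding anywhere in it, which is rather the point.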
And emotions are certainly not the output of the logical part of the brain, which is what I mean by being "inherently irrational". There is a (social) calculus involved in all relationships, but the kind of devotion credited to "love" is not a rational decision. There may be people who stay together as a devoted couple (or more) based on their version of utility theory, but that's not what I call love. And love can be overruled by the rational part too; one can recognize that one is falling in love with "the wrong person" and do something about it. But that's not the same thing either.
There isn't one half of the brain labeled logic and one half labeled emotion. Any such labeling system is ludicrously over-simplified garbage. At best, we're hooking electrodes to people's heads and observing that, hey, in most people, this area lights up more in these circumstances. Imagine trying to describe a computer by looking at disk activity on the hard drive. It's informative, perhaps, but incredibly limited, and missing a great deal.
ahammel wrote: What I mean is that the idea that there is a part of the brain that deals with logic which is completely distinct from the part that deals with emotion, and which is somehow more computer-like than the rest, is false.
Sure, if your bar is "completely distinct"; it all comes from the same wetware. However, it does seem to me (and it's supported by the fact that animals don't seem to do propositional calculus but do form emotional bonds) that what we call "higher functions" (the ability to do propositional calculus, for example) occurs in a different part of the brain - one that has developed much more in humans than in other creatures. In animals (and people), emotions came first. Logical reasoning came later. In computers it's the other way around, it seems.
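For what it's worth, "propositional calculus" here means exactly the kind of mechanical symbol-pushing a few lines of code can do. A minimal sketch (function names are mine): brute-force every truth assignment to check whether a formula is a tautology.

```python
from itertools import product

def is_tautology(formula, num_vars: int) -> bool:
    # Try every combination of True/False for the variables.
    # If the formula holds under all of them, it's a tautology.
    return all(formula(*values)
               for values in product([False, True], repeat=num_vars))

def implies(a: bool, b: bool) -> bool:
    # Material implication: "a -> b" is false only when a is true and b is false.
    return (not a) or b

# (p AND q) -> p is a tautology; p -> (p AND q) is not.
print(is_tautology(lambda p, q: implies(p and q, p), 2))  # True
print(is_tautology(lambda p, q: implies(p, p and q), 2))  # False
```

It's about as far from emotional bonding as a cognitive task gets, which is why it makes a useful dividing line in this argument.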
Clearly, since some people don't do propositional calculus, for instance, they are obviously missing this part of the brain. Which MUST be distinct, because reasons.
This is indeed nonsense. Animals can and do solve fairly difficult physics problems sometimes, and have a degree of intelligence. How much varies wildly depending on the species and even the individual, but it's really, really difficult to communicate with them on abstract topics. The idea that the order in which things developed matters for computers is ludicrous, and the idea that developing something at a different time means it must be discrete is also clearly wrong.
There isn't even a single skill called "logic". It's a term used to describe a whole bunch of different things. Intelligence is a wild collection of capabilities that is terribly complicated; it can't really be dealt with accurately by simply tossing everything into "logic" and "emotion" buckets.