Tyndmyr wrote: It is ridiculous to think that those who know least about a technology are best equipped to judge it.
Knowing about a technology has nothing to do with understanding its ramifications. If anybody knew what was going to happen, they wouldn't do some of the things they do. Would you like examples? We find the future; we don't predict it.
Understanding how a thing works is essential to understanding its ramifications. This is why some politicians have looked...kind of idiotic when talking about the ramifications of the internet or whatnot.
Knowledge doesn't prevent all errors, but it IS greatly helpful, and is far superior to...not having knowledge.
Tyndmyr wrote: It is possible that, at some point, we will figure out how to do everything that it is possible to do.
There hasn't been anything "new" to do for some years. We do some things differently than we used to, but we are doing the same things over and over again. We do things to let humans live in comfort, such as that concept is: food, shelter, and the things that enable those two. Everything else is fluff. Angry Birds?
Perhaps to you, but labeling it as "fluff" does not change the fact that humans want it and will pursue it. We pretty much have food, shelter, etc. figured out to decent levels, but people have many, many more desires. You can call them needs or not, but from an economic perspective, it doesn't really matter.
Autolykos wrote: And since management and politics are all about "building consensus" on a large scale, as you might put it, psychopathic traits don't seem to be that much of a disadvantage. On the other hand: try naming just one living, powerful politician with high levels of empathy...
*shrug* Sniping at bosses and politicians is like shooting fish in a barrel here. Sure, there have been plenty of those without empathy.
But there have also been some with it. A whole bunch of millionaires have signed a pledge to give away half their fortunes to charity before they die, and are actively working on that, for instance. That doesn't SOUND like a lack of empathy.
morriswalters wrote:Utilitarians have the right of it, if they can define the greatest good for the greatest number sufficiently. Empathy gets in the way.
Why would you wish for the greatest good for the greatest number without empathy? Presumably a total lack of empathy would mean not caring about the suffering of others at all, and therefore "good for others" wouldn't be a particularly desirable goal for such a person.
Because it's practical. I cannot foresee the future. My station in life might change. Therefore, there is positive value in society being better in general, even if it isn't something that helps me directly.
Plus, there's that whole cooperation thing. Not screwing over my neighbor for no good reason leaves him better disposed toward me. So, while being nice to the neighbors may not be a primary goal...it usually costs me little and is superior to the alternatives. That makes it pretty logical.
Imagining oneself in the place of another and feeling as they do is valuable for determining what they want. Sure, that isn't the only trait you need, but it's helpful information to have. Empathy wouldn't have evolved otherwise.
morriswalters wrote: Maybe. But triage is contradictory. Empathy places me in your shoes. If you are dying and might be saved if I devote enough resources, empathy drives me to do something for you, because I would want you to save me were I in your shoes. But at what cost? This is the primary dilemma of triage: head over heart. Empathy is a built-in response; it takes cognitive effort to override it. In terms of AI, this is exactly the point. You don't want an AI to feel empathy for the individual.
Sure I do. Why would I want a less capable AI?