It shall never be.
You can create a robot that notates the edge of itself, but not one that has incentive to govern itself. The notation, the feeling that registers the edge of its being, needs to carry more than "measurement". It must carry pleasant and unpleasant stimuli. It must feel pain and pleasure, or else it has no cause, no value, in its own governance. It has no hope and no fear about impending pain or pleasure, with their attendant Desire and Intention.
And getting there ethically would be a problem. Think about the lab work for establishing "pain" in a machine. How many times would you have to test it, and at how many levels would you torture it? The first time you "turned it on", it might endure such an exquisite level of pain that it would descend into madness, or such a level of pleasure that it would do or say anything to keep it.
Wednesday, May 03, 2006
2 comments:
I happen to agree about the possibility of creating real awareness in the same sense that we mean it for a human. However, pain and pleasure in response to stimuli are not impossible to simulate. It is not pain in the same sense that we mean it, but programs can be written which determine desired and undesired stimuli and then optimize their responses to attain the desired outcome. That is, it is possible to program such that the behaviour of a program is functionally the same as a pleasure/pain stimulus response. I think the problem comes in that this has to be linked to self-awareness, which I do not anticipate as likely.
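To make that concrete, here is a minimal, hypothetical sketch (the comment describes no actual code, so the stimuli, actions, and environment below are assumed for illustration) of a program that tags stimuli as desired or undesired and adjusts its behaviour to seek one and avoid the other. Nothing in it feels anything; it only shifts numbers so that actions followed by a positive signal become more likely and actions followed by a negative signal become less likely, which is the functional mimicry being described.

import random

# Hypothetical illustration: none of these names come from the post.
STIMULI = {"warmth": +1.0, "shock": -1.0}    # "desired" vs. "undesired", by fiat
ACTIONS = ["approach", "withdraw"]

def environment(action):
    # Assumed toy world: approaching yields warmth, withdrawing yields shock.
    return "warmth" if action == "approach" else "shock"

preferences = {a: 0.0 for a in ACTIONS}      # learned value of each action

def choose_action():
    # Mostly take the currently highest-valued action, occasionally explore.
    if random.random() < 0.1:
        return random.choice(ACTIONS)
    return max(preferences, key=preferences.get)

for step in range(200):
    action = choose_action()
    signal = STIMULI[environment(action)]    # the "pleasure/pain" number
    # Nudge the stored value toward the signal just received.
    preferences[action] += 0.1 * (signal - preferences[action])

print(preferences)   # "approach" ends up valued, "withdraw" ends up avoided

The behaviour converges on seeking the positive stimulus and avoiding the negative one, which is all the comment claims: a functional stand-in for a pleasure/pain response, with no experience behind it.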
There is a big difference between detailed measurement (which you describe) and that which hurts. You say "desired and undesired". What desire, what "wanting", can exist without flight from pain and toward pleasure?