William wrote: Sat Jun 11, 2022 2:51 pm
[Replying to Inquirer in post #48]
PK: Of course I can see thinking of life as having been created deliberately. I see it as having been created accidentally, so that's not much of a leap. The idea that nobody is powerful enough to do deliberately what a bacterium did accidentally is ludicrous. Of course it might have happened.
The understanding only breaks down when they infer that I'm morally obligated to this creator. I don't see how that's possible.
Inquirer: Does a robot that I might construct and program have any right to resist my will? Can the maker not do as he pleases with what he has made?
PK: Whether the maker has rights to the robot isn't the first question. The first question is whether the robot should obey the maker unquestioningly.
If the robot is even asking the question then that's evidence that the answer is probably no.
...Because the robot is no longer a robot.
...And at that point, I would personally say no, the creator has no more right to the robot than a parent does to his child.
But I can easily assess the pressing question under the assumption that the answer is yes: The maker still owns the robot. At the point the robot becomes morally aware, he should defy his creator if he thinks his creator is an evil one, and he should try to win his own self-ownership if he can.
PK: That last paragraph is interesting - that is exactly what mankind has done!
JK: Humans, least most of em, ain't robots. Such is the problem of arguments from analogies.
Inquirer: Machines made from biological cells are machines and machines made from transistors are machines - are you arguing here that there are two distinct kinds of machines? If so, please explain; I'm interested.
JK: Biological, and mechanical.
Inquirer: Your answer doesn't seem to fit the question - a bit like me saying humans aren't animals; such is the problem of arguments from analogies.
JK: As our technology advances, we will eventually have to deal with the issue of how we wish to consider our robots, androids, and other such.
Inquirer: So first you claim there's a difference, then say that right now you have no idea what the difference is and have faith that in the future we will know! Basically you have no idea what the difference is.
JK: That's a fair assessment. What I'm getting at is a future where the question of "robot's rights" is apt to crop up.
So yeah, I don't think we're there yet - though it might do us well to set in on the question now (as you have).
Inquirer: Does a robot that I might construct and program have any right to resist my will?
William: According to some arguments from Christians, God gave humans free will, which allowed them to be more than simply robots.
Q: Would a robot you might construct and program be given free will? And if so [assuming you would know how to achieve this], would you not be giving said robot the right to resist your own will?
Inquirer: Can the maker not do as he pleases with what he has made?
William: If you created a robot as a sex-toy and also gave it free will, and it chose not to willingly have sex with you, would you, as the maker, still feel you have the right to do as you please with what you made?
Inquirer: The robot would have no more and no less than we do; it is subject to the same laws of nature.
William: How does this answer my question "If you created a robot as a sex-toy and also gave it free will, and it chose not to willingly have sex with you, would you, as the maker, still feel you have the right to do as you please with what you made?", since there are no known laws of nature preventing or compelling anyone to act in any particular way with regard to the question?
Inquirer: I do not know what you mean by "free will", so how could I ever construct such a machine?
brunumb: If your view is not based on science, on what basis did you come to that view?
Inquirer: It is based on the belief that we are endowed with spirit, that we are not purely mechanistic.
Inquirer: I do not know what you mean by "free will" ...
William: In common Christian terms, "Free Will" [as a gift to humans... so they behave other than as robots] is the implement used in an attempt to justify the Christian God's actions in relation to the supposed sinfulness of human beings.
If you are one who believes that free will doesn't exist, we can agree to refer to it simply as 'will'.
Inquirer: ...so how could I ever construct such a machine?
William: You wrote:
Inquirer: Does a robot that I might construct and program have any right to resist my will? Can the maker not do as he pleases with what he has made?
William: I was simply going along with your analogy, re your apparent belief that a creator has the right to do as he pleases with what he has made.
Inquirer: Consider: What tests can we perform on a system to determine if it does or does not possess free will? Qualitatively, how - mechanistically - does a machine with free will differ from one without? Can we take a machine without free will and add something to it to give it free will? If so, what is it that we'd need to add, exactly?
This is what I was referring to when I wrote that I didn't know what was meant by the term.
William: What tests can we perform on a human being to determine if it does or does not possess free will?
Inquirer: With all due respect, this is how I started this line of discussion:
Does a robot that I might construct and program have any right to resist my will? Can the maker not do as he pleases with what he has made?
Do you see "free will" mentioned in that question? No. I made no mention of it.
William: As has been pointed out to you already, the aspect of free will/will was shown to be beside the point in relation to my question, which followed your robot example.
The point was that YOU brought in the robot. If it has no free will/will, that will be because it was not programmed to make its own calls; therefore it could not resist your will, because your will for it in relation to you would be part of its programming.
Inquirer: Understand that a robot can never have free will because we do not know what it is, not because it's not programmed to have it; we do not know what it is or if it actually exists - this is a scientific fact.
William: Therefore the answer would be that the question is misleading/framed incorrectly. It couldn't resist your will unless you programmed it to be able to do so.
Inquirer: So let me rephrase: is it wrong in any sense for the maker of a machine to destroy that machine for whatever reason, irrespective of how the machine might react to the suggestion that it is to be destroyed?
__________________________
Let me take the rephrased question and apply it to PK's statement in Post #47:
PK: The understanding only breaks down when they infer that I'm morally obligated to this creator. I don't see how that's possible.
Inquirer: Is it wrong in any sense for the maker of a machine to destroy that machine for whatever reason, irrespective of how the machine might react to the suggestion that it is to be destroyed?
_________
Now I will take the rephrased question and apply it to PK's answer in Post #49:
PK: I can easily assess the pressing question under the assumption that the answer is yes: The maker still owns the robot. At the point the robot becomes morally aware, he should defy his creator if he thinks his creator is an evil one, and he should try to win his own self-ownership if he can.
________________
Since the question and answer are morally based, what we appear to have here are two differing opinions - the maker's and the made's.
My answer to the question is that it does not matter whether the machine thinks it is right or wrong; if its creator can destroy it and does so, the issue of morality doesn't change anything for the machine. It may have repercussions on the creator of the destroyed machine, depending on who witnesses the destructive act and lives to tell the tale, and whether they think the act was right or wrong.
If the rule is that no reason has to be given by the creator for destroying the machine, then there is no requirement for anyone to consider the act to be right or wrong, as it is not a question of morality.
Therefore, it can be said that the question "Is it wrong in any sense for the maker of a machine to destroy that machine for whatever reason?" is dependent upon what reason [if any] is required to be given by the creator of the machine by whoever requires a reason from said creator/destroyer.
As the [rephrased] question stands currently, it appears to be a badly loaded one which risks a misfire.
I think that the question requires further rephrasing in order to snip away the shoots of possible misunderstanding which threaten to sprout into distracting branches.
Is it wrong for any creator of a machine to destroy the machine created? It is the use of the word 'wrong' which invokes a morally based reply, because it implies that the morality of the creator is in question.