Trolley Problem

For the serious discussion of weighty matters and worldly issues. No off-topic posts allowed.

Moderators: Azrael, Moderators General, Prelates

User avatar
TheGrammarBolshevik
Posts: 4878
Joined: Mon Jun 30, 2008 2:12 am UTC
Location: Going to and fro in the earth, and walking up and down in it.

Re: Trolly Problem

Postby TheGrammarBolshevik » Fri Apr 10, 2015 6:44 pm UTC

Cradarc wrote:Okay so my choice of the word "whim" was not good. How about "volition"?

I don't think that helps things, since switch-pullers also don't think you can kill people just because you have a volition to kill people.

Cradarc wrote:
it should be something that people are actually inclined to accept as true, and shouldn't be something with obvious counterexamples.

"People" = who? I'm not a person? If I find a single other person on the planet that agrees with me, that makes people.

Seeing as how this is a discussion, I would think that the goal would be to appeal to principles that most people would find plausible, or perhaps the people you're talking to. The usual idea is that, if you have a belief, P, that most people don't think is plausible, you come up with some things that most people do think are plausible, and show how those things lend support to P. Instead, what you're doing is taking a belief that nobody thinks is plausible, and then cooking up "reasons" for that belief which are no more plausible than the original belief. And, along the way, saying lots of absurd stuff about what people with opposing positions think ("Switch-pullers think they can kill people whenever they will to kill people").

Cradarc wrote:
Why shouldn't the very low bar that you're setting for discussion also apply to the Flat Earth Society?

It can, which is why they exist and can make a logical argument for it. It's just that when it comes to the shape of the earth, everybody here (I think) shares the same position. So it is common ground.

You can't force someone to make the same fundamental assumptions as you. There's no logical reason why your assumptions are superior to theirs. It's a matter of faith. Every person lives with some fundamental assumptions which they deem "correct". Everything else they believe in then builds off those assumptions. One can be fully convinced that one is right while still conceding that there is no logical basis for thinking that.
ex.
Is reality a simulation? We don't know, but most of us assume it is not and build knowledge off of that assumption.

Above, you contemplated that the bar for a good argument need be no higher than that the argument appear convincing to the person actually making the argument. Here, you say that the belief in a round earth is no more reasonable than a belief in a flat earth, but is instead distinguished by a matter of faith.

When you find yourself saying such things, perhaps it is time to reconsider your standards for what counts as a good reason for belief.
Last edited by TheGrammarBolshevik on Fri Apr 10, 2015 7:22 pm UTC, edited 1 time in total.
Nothing rhymes with orange,
Not even sporange.

Tyndmyr
Posts: 10119
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: Trolly Problem

Postby Tyndmyr » Fri Apr 10, 2015 6:44 pm UTC

Cradarc wrote:Forest Goose,
Right now I am demonstrating a hole in your argument. You have already demonstrated the unprovability of my argument and now I will do the same for yours.

You agreed with the statement: "There are times when a person has the right to end the life of another."
But in order for your "proof" to hold, you must show that everyone who uses logic must agree with said statement.

I am asserting that it is possible for a logical person to disagree with that statement.
^Do you agree with me on this specific assertion?


That's...not logic. That's just agreement. Agreement or disagreement may happen for logical or other reasons, and in any case, a mere assertion would not be a disproof. You would wish to show that either the argument is structurally unsound, or that the argument is based on faulty premises.

User avatar
TheGrammarBolshevik
Posts: 4878
Joined: Mon Jun 30, 2008 2:12 am UTC
Location: Going to and fro in the earth, and walking up and down in it.

Re: Trolly Problem

Postby TheGrammarBolshevik » Fri Apr 10, 2015 6:46 pm UTC

Cradarc wrote:You agreed with the statement: "There are times when a person has the right to end the life of another."
But in order for your "proof" to hold, you must show that everyone who uses logic must agree with said statement.

Why should anyone accept this bizarrely stringent requirement on the quality of an argument? Especially coming from someone who thinks that the Flat Earth Society makes reasonable assumptions...
Nothing rhymes with orange,
Not even sporange.

User avatar
Forest Goose
Posts: 377
Joined: Sat May 18, 2013 9:27 am UTC

Re: Trolly Problem

Postby Forest Goose » Fri Apr 10, 2015 6:47 pm UTC

Cradarc wrote:Forest Goose,
Right now I am demonstrating a hole in your argument. You have already demonstrated the unprovability of my argument and now I will do the same for yours.

You agreed with the statement: "There are times when a person has the right to end the life of another."
But in order for your "proof" to hold, you must show that everyone who uses logic must agree with said statement.

I am asserting that it is possible for a logical person to disagree with that statement.
^Do you agree with me on this specific assertion?


I never said anything like what you are asserting. Elaborate or quote me.

I'm not even sure what a "Logical person" is in a justifying sense. This seems to be your own nonsense.
Forest Goose: A rare, but wily, form of goose; best known for dropping on unsuspecting hikers, from trees, to steal sweets.

Chen
Posts: 5096
Joined: Fri Jul 25, 2008 6:53 pm UTC
Location: Montreal

Re: Trolly Problem

Postby Chen » Fri Apr 10, 2015 6:48 pm UTC

Cradarc wrote:You agreed with the statement: "There are times when a person has the right to end the life of another."
But in order for your "proof" to hold, you must show that everyone who uses logic must agree with said statement.


Uh what?

There was no proof in the statement. It was an assertion. You can give reasoning on why you believe that assertion is correct though, giving examples that support it, such as self-defense or defense of another. Or, perhaps more relevant to this thread, ending one person's life to save multiple other lives. Now clearly these examples in and of themselves can be broken down further: why is killing in self-defense acceptable, why are many lives better than one, etc. I'm pretty sure these have already been discussed in the thread though.

User avatar
gmalivuk
GNU Terry Pratchett
Posts: 25454
Joined: Wed Feb 28, 2007 6:02 pm UTC
Location: Here and There
Contact:

Re: Trolly Problem

Postby gmalivuk » Fri Apr 10, 2015 7:15 pm UTC

Cradarc wrote:You agreed with the statement: "There are times when a person has the right to end the life of another."
But in order for your "proof" to hold, you must show that everyone who uses logic must agree with said statement.
That is not how proofs or logic work.

For a proof to hold, everyone who logically infers things from the same consistent set of premises must agree with the conclusion, but the premises and the inferences themselves are crucial elements.
---
It's kind of like having the conclusion of a mathematical proof be, "Therefore x is irrational," and you come along and claim you can demonstrate a hole in the proof by finding a logical person who believes the statement, "x is rational," without worrying about whether they're talking about the same x.
Unless stated otherwise, I do not care whether a statement, by itself, constitutes a persuasive political argument. I care whether it's true.
---
If this post has math that doesn't work for you, use TeX the World for Firefox or Chrome

(he/him/his)

morriswalters
Posts: 6553
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: Trolly Problem

Postby morriswalters » Fri Apr 10, 2015 7:20 pm UTC

Cradarc wrote:Morriswalters,
That is an interesting question. In that scenario, by following the orders of the father, I am consciously making myself a proxy for whatever he is doing.
He chose to pull the lever to kill himself in order to save his daughter, but he could not successfully perform that action without involving me.

So the question is "Is it right for me to do something I think is immoral if someone else thinks it is moral?". The answer is no, because I think my morality trumps his morality. This is not arrogance, but follows logically. If I thought his morality is better, why would I have my own sense of morality? I would simply adopt his. Whatever my moral code is, it is by definition, what I think is the best.
Now if he had direct access to the lever, I would think his decision is immoral, but I would not stop him. This is because the moral principles that govern my own decisions are different than the moral principles that govern how I respond to other people's decisions.
So if you don't control the lever, it is ethical for you not to interfere with the agency of others. Is that a correct interpretation of your statement? My own answer to the question would refer to my belief that we should be able to choose our own destiny. My goal would be to allow him to choose rather than to make the choice for him. Of course, that isn't a better solution; it is simply mine.

So a new question. If instead there were two levers, one in the hands of the person you were about to strike and one in your hands in the trolley, would you let him move the lever if you knew his intent was to commit murder?

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Fri Apr 10, 2015 7:29 pm UTC

gmalivuk wrote:For a proof to hold, everyone who logically infers things from the same consistent set of premises must agree with the conclusion, but the premises and the inferences themselves are crucial elements.


Exactly. In mathematics, there is consensus about what axioms to start from. In ethics there isn't. I started from a set of premises: X and Forest Goose (and perhaps yourself) started from a different set of premises: Y.

You guys were asking me to justify the premises in X. I couldn't do that for you. I am now demonstrating you guys cannot justify the premises in Y. It sounds stupid, and it is, but that is indeed what this thread has devolved into.
This is a block of text that can be added to posts you make. There is a 300 character limit.

Tyndmyr
Posts: 10119
Joined: Wed Jul 25, 2012 8:38 pm UTC

Re: Trolly Problem

Postby Tyndmyr » Fri Apr 10, 2015 7:35 pm UTC

Cradarc wrote:
gmalivuk wrote:For a proof to hold, everyone who logically infers things from the same consistent set of premises must agree with the conclusion, but the premises and the inferences themselves are crucial elements.


Exactly. In mathematics, there is consensus about what axioms to start from. In ethics there isn't. I started from a set of premises: X and Forest Goose (and perhaps yourself) started from a different set of premises: Y.

You guys were asking me to justify the premises in X. I couldn't do that for you. I am now demonstrating you guys cannot justify the premises in Y. It sounds stupid, and it is, but that is indeed what this thread has devolved into.


Trolley levers are not at an "irreducible axiom" level. In fact, the whole point of it is why you make the choice. It's intended to give an example to discuss differing ethics, and explain why you believe differently. My answers were not the same as many others, but I understand why they selected what they did, and the ethical framework behind their choices makes sense, even if I do not hold to that particular system myself.

Yours does not make sense to me, and you appear to be refusing to actually defend it, instead making use of diversionary tactics, arbitrary claims, and so forth. This (and an apparently similar lack of understanding on the part of others) has made the thread fairly hard to follow.

User avatar
Forest Goose
Posts: 377
Joined: Sat May 18, 2013 9:27 am UTC

Re: Trolly Problem

Postby Forest Goose » Fri Apr 10, 2015 7:41 pm UTC

Cradarc wrote:Exactly. In mathematics, there is consensus about what axioms to start from. In ethics there isn't. I started from a set of premises: X and Forest Goose (and perhaps yourself) started from a different set of premises: Y.

You guys were asking me to justify the premises in X. I couldn't do that for you. I am now demonstrating you guys cannot justify the premises in Y. It sounds stupid, and it is, but that is indeed what this thread has devolved into.


I still maintain you haven't presented anything resembling a justification.

Also, that assumes there is only one type of justification; the exact same argument applies to anything that is not mathematics, which, obviously, is nonsense.

I think the problem is that you don't have any real notion of the subject you are discussing, but have an answer and are attached to it - that would be fine for "Discuss this over beers", it really isn't fine for a serious discussion of the subject. No offense, not everyone knows the same subjects - that you don't even appear sure what exactly everyone is arguing (and we're not all arguing the same thing) and that you have yet to put forward any clear argument is what makes me believe that.

And, still, none of that precludes there being an answer, nor does it preclude positions being reasonable. Hell, by your standard, science doesn't exist.

*Oh, and no, there is not a consensus about axioms in mathematics - I study all sorts of different systems, and most of them aren't even comparable. (That even holds in specific areas of mathematics...it's actually a branch of foundations...)
Forest Goose: A rare, but wily, form of goose; best known for dropping on unsuspecting hikers, from trees, to steal sweets.

User avatar
TheGrammarBolshevik
Posts: 4878
Joined: Mon Jun 30, 2008 2:12 am UTC
Location: Going to and fro in the earth, and walking up and down in it.

Re: Trolly Problem

Postby TheGrammarBolshevik » Fri Apr 10, 2015 7:47 pm UTC

Cradarc wrote:Exactly. In mathematics, there is consensus about what axioms to start from. In ethics there isn't.

The situations are not as different as you think they are. Once you go past the axioms of ZFC, there is disagreement over what axioms are proper foundations for mathematics. Forest Goose will probably be able to go into more detail about this.

However, while people may not agree on, say, the existence of Woodin cardinals, there are some things that they do agree on, such as the axioms of ZFC. So, we don't need complete agreement in order to do math. We can do math by reasoning from those assumptions on which there is consensus.

Something like that is true in ethics. Can you get everyone to agree with the premises of, say, Kant's foundational argument in the Critique of Practical Reason? No, but you don't need consensus on every matter in order to have some solid starting points. There is much wider agreement on claims like:
  • Happiness is better than suffering.
  • Some amount of selfishness is too much.
  • You shouldn't lie to people without good reason.
  • It's wrong to punish someone for a crime she did not commit.
  • If it's wrong to torture people for the sake of national security, it's wrong to torture people for fun.
  • It's wrong to torture people for fun.
Further, while any one of these claims is debatable, each one is at least initially plausible; their negations are not. An argument that says "Happiness is better than suffering; therefore, X" is a much better argument than one that says "Suffering is better than happiness; therefore, not-X."

So, there are points of substantial consensus in ethics that warrant our preference for some starting points over others. It is not the case that, when I write an ethics paper, I am equally entitled to open with "You shouldn't lie to people without good reason" or "You should lie to people whenever you feel like it." If I just take the latter claim as an undefended assumption, then I'm going to have a shitty paper, and I can't tell my professor "Sorry, I think that premise is just as good as every other premise."

If, then, you are defending your view by appealing to the arbitrariness of philosophical assumptions, you should reconsider.
Nothing rhymes with orange,
Not even sporange.

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Fri Apr 10, 2015 7:50 pm UTC

Forest Goose wrote:I still maintain you haven't presented anything resembling a justification.

For what? I have justified my choice, except it eventually led to an axiom I could not justify. The reason I couldn't justify it is because you hold a different set of axioms.

Forest Goose wrote:I think the problem is that you don't have any real notion of the subject you are discussing

Please enlighten me. I thought I was discussing the relative morality of certain actions. Neither of us has established a common definition of morality, so that might be the problem.

Bolshevik,
I'm not defending my own view by saying ethics is arbitrary. I'm saying Forest Goose cannot say with logical certainty that his/her view is correct because ethics is arbitrary.
Remember the analogy with the mathematical conjecture? I am convinced it is false and Forest Goose is convinced it is true.

Forest Goose is saying the equivalent of (correct me if I'm wrong):
"The conjecture is true because I have proved it. I have proved it because you can't prove I didn't prove it."
The problem is I could (and probably did) say the exact same thing about my position:
"The conjecture is false because I have proved it. I have proved it because you can't prove I didn't prove it."

Which led to the new argument:
- "But I can prove that you didn't prove it."
- "No you can't. What you said wasn't a proof."
- "How is that not a proof? Can you prove that wasn't a proof?"
etc.

I would drop the subject if we can all agree nobody can say with objective certainty what the "best" course of action would be.
Last edited by Cradarc on Fri Apr 10, 2015 8:07 pm UTC, edited 1 time in total.
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
Forest Goose
Posts: 377
Joined: Sat May 18, 2013 9:27 am UTC

Re: Trolly Problem

Postby Forest Goose » Fri Apr 10, 2015 8:01 pm UTC

Cradarc wrote:For what? I have justified my choice, except it eventually led to an axiom I could not justify. The reason I couldn't justify it is because you hold a different set of axioms.


You are the one that made me state a position; I didn't need one to show you weren't saying anything (remember when you got all whipped up in a tizzy and started moaning about holes in analogies and Turing Machines? I do.)

The problem is not that we have different axioms, the problem is that you have an answer, can't support it, and, now, since you can't be "Just as right", you want everyone else to be "Just as wrong". In other words, the general mark of someone bitter and feeling entitled to "their voice" in academic subjects that they don't seem to know.


Cradarc wrote:Please enlighten me. I thought I was discussing the relative morality of certain actions. Neither of us has established a common definition of morality, so that might be the problem.


Yes, you've done it, in that single sentence you've convinced me, I've forgotten all the nonsense and bad analogies and confusion over the nature of debate and justification and etc., that preceded it.
Forest Goose: A rare, but wily, form of goose; best known for dropping on unsuspecting hikers, from trees, to steal sweets.

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Fri Apr 10, 2015 8:12 pm UTC

Forest Goose,
Call me stupid, but I cannot see how the things you said in your most recent post are related to the quotes you appear to be replying to.

This is what it sounds like:
"I have clearly proven this conjecture is true. You cannot disprove it. Even though you know this, you are still desperately trying to disprove it because you are bitter that I showed you couldn't prove the conjecture is false. The problem is you know nothing about this conjecture. You feel like you are capable of discussing it with us, but you don't."
Last edited by Cradarc on Fri Apr 10, 2015 8:23 pm UTC, edited 1 time in total.
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
gmalivuk
GNU Terry Pratchett
Posts: 25454
Joined: Wed Feb 28, 2007 6:02 pm UTC
Location: Here and There
Contact:

Re: Trolly Problem

Postby gmalivuk » Fri Apr 10, 2015 8:20 pm UTC

Cradarc wrote:I started from a set of premises: X
So you say, but you haven't yet actually told us what those premises are, you've just repeated more and more conclusions you draw from them.
Unless stated otherwise, I do not care whether a statement, by itself, constitutes a persuasive political argument. I care whether it's true.
---
If this post has math that doesn't work for you, use TeX the World for Firefox or Chrome

(he/him/his)

User avatar
TheGrammarBolshevik
Posts: 4878
Joined: Mon Jun 30, 2008 2:12 am UTC
Location: Going to and fro in the earth, and walking up and down in it.

Re: Trolly Problem

Postby TheGrammarBolshevik » Fri Apr 10, 2015 8:24 pm UTC

Cradarc wrote:Bolshevik,
I'm not defending my own view by saying ethics is arbitrary. I'm saying Forest Goose cannot say with logical certainty that his/her view is correct because ethics is arbitrary.

...

I would drop the subject if we can all agree nobody can say with objective certainty what the "best" course of action would be.

First, ethics isn't arbitrary, as I already discussed at some length in my previous post.

Second, you are mistaken if you think that "logical certainty" or "objective certainty" is the standard to which people have been holding your arguments.

But, if it makes you feel any better, I can reaffirm that nobody can say with objective certainty what the best course of action would be.
Nothing rhymes with orange,
Not even sporange.

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Fri Apr 10, 2015 8:29 pm UTC

Gmalivuk,
Cradarc wrote:Whenever you get rid of something, you are implicitly saying "its existence is at my whim".
Why must the above be true? Well, it doesn't have to be. That is a fundamental belief.

If that doesn't sound like a premise, what does? I explicitly stated it is a fundamental belief.

If either you or Forest Goose had said
"Ah, okay. Well, I don't accept that premise. That's why we are having problems understanding each other. We have fundamentally different ideas about human life and the act of killing."
the issue would be over.

But that's not what happened, is it?

Bolshevik,
If everyone can agree nobody is certain, I think it's more productive to put forth different ways to consider the problem rather than ask people to prove the ideas they are putting forth. Nobody can prove anything if nobody is certain.
Last edited by Cradarc on Fri Apr 10, 2015 8:39 pm UTC, edited 1 time in total.
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
TheGrammarBolshevik
Posts: 4878
Joined: Mon Jun 30, 2008 2:12 am UTC
Location: Going to and fro in the earth, and walking up and down in it.

Re: Trolly Problem

Postby TheGrammarBolshevik » Fri Apr 10, 2015 8:38 pm UTC

Sorry, isn't "I don't accept that premise" exactly what I said?

Sure, instead of saying that we have "fundamentally different ideas about human life" and leaving it at that, I actually explained why the premise is false. But so what? It's not like you're allowed to just decide whether other people get to criticize your premises or not.
Nothing rhymes with orange,
Not even sporange.

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Fri Apr 10, 2015 8:42 pm UTC

TheGrammarBolshevik wrote:No, that simply isn't true, nor is it true in the case of killing one to save many.

That is not exactly "I don't accept that premise." It sounds more like "Nobody can accept that premise".

The key nuance is the way you phrased it made it seem like you are arguing for something universal. The justification did not sound like an explanation of why you disagree, but an argument for why nobody would agree.
If you were making a universal argument, I am obligated to make a rebuttal because I am included in that claim. It's like me saying I like the color blue, and you saying "blue is simply an unlikeable color because..." instead of saying "I don't like the color blue because...".
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
Forest Goose
Posts: 377
Joined: Sat May 18, 2013 9:27 am UTC

Re: Trolly Problem

Postby Forest Goose » Fri Apr 10, 2015 8:50 pm UTC

Nothing is absolutely certain, therefore we can challenge nothing, therefore you cannot challenge me, therefore I can say what I want unchallenged.

Sorry, you're not looking for philosophy, you're looking to not be questioned - I have snarky things to say, but, honestly, I don't really care anymore.

Look, my problem isn't "I disagree" (I stress this, if you are right, then I agree with you, that follows from the core premise I put forward), my problem is that you are just stating what you want; you haven't supported anything. I guess, if you want to say, "I assume this, that is all, I have nothing else to support it", then fine, be my guest, but don't expect anyone else to agree, don't expect people to not criticize your idea anyways, and don't expect anyone, in general, to care. If you want to conduct this in a vacuum, do so, live that life, but, then, why are you still lingering around here, grinding your ax and trying to drag logic and ethics down to the murky depths of your need to "win"?
Forest Goose: A rare, but wily, form of goose; best known for dropping on unsuspecting hikers, from trees, to steal sweets.

User avatar
gmalivuk
GNU Terry Pratchett
Posts: 25454
Joined: Wed Feb 28, 2007 6:02 pm UTC
Location: Here and There
Contact:

Re: Trolly Problem

Postby gmalivuk » Fri Apr 10, 2015 8:59 pm UTC

Cradarc wrote:Gmalivuk,
Cradarc wrote:Whenever you get rid of something, you are implicitly saying "its existence is at my whim".
Why must the above be true? Well, it doesn't have to be. That is a fundamental belief.

If that doesn't sound like a premise, what does? I explicitly stated it is a fundamental belief.

That is a "fundamental belief" in much the same way that "the square root of two is irrational" is a fundamental belief.

Which is to say, it's not at all. You might start a new proof with its irrationality as a given, because you assume everyone reading the proof has already been convinced by other proofs that it is, in fact, irrational, but that doesn't make it a fundamental belief, nor does simply proclaiming "that is a fundamental belief".
Unless stated otherwise, I do not care whether a statement, by itself, constitutes a persuasive political argument. I care whether it's true.
---
If this post has math that doesn't work for you, use TeX the World for Firefox or Chrome

(he/him/his)

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Fri Apr 10, 2015 11:18 pm UTC

I do not feel a need to win. In fact, earlier I apologized for getting mad even though I thought you guys were the ones being aggressive. I don't expect an apology or concession from anyone; I just want to be able to propose different ways to see the problem without someone constantly saying "Stop repeating the same thing, you're wrong".

I'm lingering around here to look at different viewpoints. I think a person can gain knowledge from conflicting viewpoints while still holding tightly to their own. I am interested in learning about perspectives different than mine, and expect others to be interested in learning about mine. I like to share my ideas to encourage thought in others and (hopefully) they will then share theirs.
People who enjoy quashing the expression of controversial ideas make me angry. If you don't think that person is giving you any new insight, then ignore them. They still might trigger new thoughts in others. There's no need to shut someone down simply because you think they're full of crap.
Me putting forth my ideas is not a threat to yours. It's much more productive (not to mention respectful) to expand on your own ideas rather than shutting down those of others.

That being said, if you really think I am not a reasonable person, then you can either ignore me or leave. I have created a different thread where you can conduct essentially the same conversation with the guarantee that I won't be posting my opinion. Or, if you prefer, I could migrate to another thread and you can stay on this one. I don't care.
It's obvious that the mere presence of my viewpoint annoys you, so this would be the most practical solution short of a mod censoring one person over another.
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
gmalivuk
GNU Terry Pratchett
Posts: 25454
Joined: Wed Feb 28, 2007 6:02 pm UTC
Location: Here and There
Contact:

Re: Trolly Problem

Postby gmalivuk » Fri Apr 10, 2015 11:26 pm UTC

Questioning your viewpoint is not "quashing" it.
Unless stated otherwise, I do not care whether a statement, by itself, constitutes a persuasive political argument. I care whether it's true.
---
If this post has math that doesn't work for you, use TeX the World for Firefox or Chrome

(he/him/his)

User avatar
Qaanol
The Cheshirest Catamount
Posts: 3031
Joined: Sat May 09, 2009 11:55 pm UTC

Re: Trolly Problem

Postby Qaanol » Fri Apr 10, 2015 11:38 pm UTC

Cradarc wrote:Morriswalters,
That is an interesting question. In that scenario, by following the orders of the father, I am consciously making myself a proxy for whatever he is doing.
He chose to pull the lever to kill himself in order to save his daughter, but he could not successfully perform that action without involving me.

So the question is "Is it right for me to do something I think is immoral if someone else thinks it is moral?". The answer is no, because I think my morality trumps his morality. This is not arrogance, but follows logically. If I thought his morality is better, why would I have my own sense of morality? I would simply adopt his. Whatever my moral code is, it is by definition, what I think is the best.
Now if he had direct access to the lever, I would think his decision is immoral, but I would not stop him. This is because the moral principles that govern my own decisions are different than the moral principles that govern how I respond to other people's decisions.

Just to be abundantly clear, it sounds like you are saying that what almost everyone would call “noble self-sacrifice” on the part of the father to save the daughter is immoral to you.

Before continuing, I’ll make a meta-comment here and recommend that you, Cradarc, read about the backfire effect. It is a real thing, and worth being aware of. Worth reading before the rest of my post, or even instead of my post!

Now back on the original trolley problem, your objection as I understand it boils down to the fact that pulling the lever puts a person in danger who was not in danger before. But what if we modify it slightly?

Spoiler:
Suppose there are, as usual, 5 people on the primary track and 1 person on the alternate track, and you know this quite certainly. However, they are all around the next bend, so you cannot see who is on which track and you have no way to find out. As far as you know, each person has a 5/6 chance of being on the primary track, and a 1/6 chance of being on the other track.

For each person on the track, if you pull the lever you reduce their risk from an 83% chance of dying to a 17% chance of dying. The act of pulling the lever makes each of them individually 5 times safer. Using the best information available to you, the most you can conclude is that every person on the track will be 5 times less likely to get run over by a trolley if you pull the lever.

Heck, let’s name the people. Allison, Brian, Claire, Dustin, Eleanor, and Francis are stuck on the track. You know that five of them are on the main track and one of them is on the other track, but you don’t know who is where. The most accurate determinations you can make are:

If you pull the lever, Allison will be 5 times more likely to live.
If you pull the lever, Brian will be 5 times more likely to live.
If you pull the lever, Claire will be 5 times more likely to live.
If you pull the lever, Dustin will be 5 times more likely to live.
If you pull the lever, Eleanor will be 5 times more likely to live.
If you pull the lever, Francis will be 5 times more likely to live.

To the best of your knowledge, the act of pulling the lever makes each of them safer. These six people are the only ones who will be affected.

If your moral system prohibits you from taking an action which categorically reduces the risk of everyone dying across the board, with no other side effects, then I reject the idea that it can properly be called a “moral” system. So I will assume you agree that pulling the lever is the right thing to do—or at least not the wrong thing to do—in the case where you don’t know which person is where.

From there, it is hardly even a modification to suppose that you can see the people, but they are a set of identical sextuplets. Again we can say you know all of their names, but you cannot tell them apart. So you don’t know which one is on the alternate track. To the best of your knowledge, for each individual member of the sextuplets, pulling the lever lowers their risk from 83% to 17%. In all cogent regards, this situation is identical to the previous. The only difference is that your line of sight extends to the people on the track—but they are all indistinguishable anyway!
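
As a quick sanity check of those numbers (just illustrative Python on my part; the names and the five-on-the-main-track, one-on-the-side-track setup are the ones from the scenario above):

Code:
from fractions import Fraction

people = ["Allison", "Brian", "Claire", "Dustin", "Eleanor", "Francis"]
p_main = Fraction(5, 6)   # chance a given person is on the primary track
p_side = Fraction(1, 6)   # chance a given person is on the alternate track

for name in people:
    # Doing nothing leaves the trolley on the primary track;
    # pulling the lever diverts it to the alternate track.
    print(f"{name}: P(die | do nothing) = {float(p_main):.0%}, "
          f"P(die | pull lever) = {float(p_side):.0%}")

# Every individual's risk drops from ~83% to ~17%, i.e. each person is
# five times more likely to survive if the lever is pulled.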

In any case, I have greatly enjoyed this thread, and it has in fact helped me to clarify and coalesce my own idea of what morality entails. And I have concluded that:

Morality is the way we would like strangers to treat one another.
wee free kings

elasto
Posts: 3052
Joined: Mon May 10, 2010 1:53 am UTC

Re: Trolly Problem

Postby elasto » Sat Apr 11, 2015 12:20 am UTC

Cradarc: I would also like to know your views on the two scenarios I put forward here that it's possible you missed.

To summarize:

(1) In a few decades we will have self-driving cars everywhere. One thing that will have to be coded is how they react in an emergency. I outlined a scenario where the brakes have failed and the car is about to plow through a red light and hit dozens of people who are crossing the road. Or it could steer onto the sidewalk and hit a single pedestrian. If you were in charge of coding the car's AI, would you really code it not to turn, to shrug its metaphorical shoulders and let physics take its course?

(2) You are flying a plane that has suffered a near total systems failure. If you do nothing you'll crash into a high-density area and have a high probability of killing a high number of people. If you lower a flap you'll send the plane into a spin that will crash into a low-density area and have a high probability of killing a low number of people. Do you refuse to lower the flap?

And, finally, what answers would you prefer that other people make to these questions? Would you really prefer that others make the choice to 'let nature take its course' even though that means that, on average, greater numbers die in each scenario - meaning there's a greater chance you'll die?

morriswalters
Posts: 6553
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: Trolly Problem

Postby morriswalters » Sat Apr 11, 2015 12:56 am UTC

I would suggest that Cradarc's list is consistent, if you start with a belief that you can't value one life over any other and nothing else. A conscientious objector might make a similar argument. The fact that Cradarc takes it to an extreme is something else. But if there is no cost, i.e., he never has to actually do it, then it is what it is. Making the choice something other than a binary may expose an inherent inconsistency, I don't know. I was trying to find out. This type of argument is a poster child for an anti-abortion position. In a choice between the mother and the child, the mother would always lose, even if there was a risk to life. Cradarc ruled this out in his argument, however.

For the record I have a family member who believes something very close to this. He spent 2 years in a Federal Penitentiary for refusing the draft after he couldn't get CO status. I took that as a sign that he really, really believed in it. I suspect that had you threatened his wife and child he would have been sorely tested though. Qaanol I'm glad you got something out of it.

Elasto's number 1 is my favorite alternate scenario because it is truly relevant; that is, someone will have to make the choice before self-driving cars hit the road.

User avatar
TheGrammarBolshevik
Posts: 4878
Joined: Mon Jun 30, 2008 2:12 am UTC
Location: Going to and fro in the earth, and walking up and down in it.

Re: Trolly Problem

Postby TheGrammarBolshevik » Sat Apr 11, 2015 1:26 am UTC

Cradarc wrote:If everyone can agree nobody is certain, I think it's more productive to put forth different ways to consider the problem rather than ask people to prove the ideas they are putting forth. Nobody can prove anything if nobody is certain.

I think you are misunderstanding what other people are asking for. You are right that it would be unproductive to ask for proof, or to look for certainty in our conclusions. However, besides proving that something is true, it's also possible to rationally support it. What most of us expect in a discussion is rational support for one's positions. This doesn't mean that you have to prove what you're saying. But it does mean that you can't just say whatever you think and expect that to be good enough for the sake of discussion.

Cradarc wrote:The key nuance is the way you phrased it made it seem like you are arguing for something universal. The justification did not sound like an explanation of why you disagree, but an argument for why nobody would agree.
If you were making a universal argument, I am obligated to make a rebuttal because I am included in that claim. It's like me saying I like the color blue, and you saying "blue is simply an unlikeable color because..." instead of saying "I don't like the color blue because...".

OK, I think you're right that it would be absurd to argue that way about the color blue, but switch the example up.

Suppose someone were to say "I don't think smoking causes cancer." Wouldn't it make sense to say "No, you're wrong" and then show them the evidence that they're wrong? Wouldn't it be silly for them to say "No, you don't understand - for me smoking doesn't cause cancer, but I don't mean that as a universal statement"?

Liking blue or not is something about me. If I like the color blue, then nothing else about the world can make it wrong for me to like the color blue. But whether smoking causes cancer isn't about me. If smoking causes cancer, then it's going to cause cancer whether or not I believe it does. If I were to say that smoking doesn't cause cancer, I would just be wrong. There's no way to take that statement as just something about my own beliefs or preferences; it is, to use your terminology, a universal claim.

What I'm getting out of this is that you think moral claims are like "I don't like blue," and not like "Smoking causes cancer." They're things that might be true or false for each individual person, depending on their moral beliefs, rather than things - like smoking and cancer - that depend for their truth and falsehood on something besides what the people making the claims think.

However, why assume that morality is like this? Isn't it worth considering the possibility that moral claims are true or false independently of what people think about them?

If you are interested in this, I'd be happy to point to some books and articles which discuss the issue.
Nothing rhymes with orange,
Not even sporange.

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Sat Apr 11, 2015 2:35 am UTC

First let me put out my definition of morality:
Morality describes the consistency of a thought or action with respect to a set of principles and values furnished by a judge.
The "judge" is the entity that is evaluating the morality.

A computer program can be used to define morality. You would need a way to feed it information, and it would need a way to reach a decision: moral, immoral, or amoral. In the case of a computer, the principles and values are most likely influenced by those of the programmer.
Every human (with a notion of morality) has a set of personal principles and values upon which they establish what is moral and what is not. Being human, those principles and values can change over time. In addition, a person's interpretation of their own morality can be clouded by emotions (i.e., we choose to tell ourselves "this is moral" when the actual comparison yielded "immoral").
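
To make the shape of that definition concrete, here is a toy sketch; the judge function, the principle, and the action encoding below are placeholder names I am making up purely for illustration, nothing more:

Code:
from typing import Callable, List, Optional

# A principle inspects a described action and returns True (consistent),
# False (inconsistent), or None (the principle simply does not apply).
Principle = Callable[[dict], Optional[bool]]

def judge(action: dict, principles: List[Principle]) -> str:
    """Classify an action relative to the judge's own principles and values."""
    verdicts = [p(action) for p in principles]
    applicable = [v for v in verdicts if v is not None]
    if not applicable:
        return "amoral"    # no principle speaks to this action at all
    return "moral" if all(applicable) else "immoral"

# One hypothetical principle for one hypothetical judge:
def do_not_willingly_cause_death(action: dict) -> Optional[bool]:
    if "causes_death" not in action:
        return None
    return not action["causes_death"]

print(judge({"causes_death": True}, [do_not_willingly_cause_death]))        # immoral
print(judge({"description": "tie shoes"}, [do_not_willingly_cause_death]))  # amoral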

Please read the above and digest it thoroughly. Hopefully this will save people some time/effort when trying to understand my position.



Qaanol,
I'm glad you got something out of this mess. Thanks for the note about the backfire effect. I guess I am too caught up in the argument as well.

About your scenario:
My principles dictate I should not willingly cause the death of a person. If I don't flip the lever, who/whatever caused the trolley to go down the track caused the death of some number of people. If I flip the lever, I will cause the death of either 1 or 5 people. So to stay consistent with my principle, I will not flip the lever.

The question people keep asking me is: Why aren't you responsible for deaths when you consciously chose to not flip the lever?
The reason is this premise: If I am responsible for an event, then that event would not have occurred if I did not exist.
This idea should be pretty intuitive for anyone into scientific study. An object falls to the ground in the dark -> gravity is responsible. An object falls to the ground in sunlight, is sunlight now responsible? No. Sunlight is clearly part of the system now, but it is not doing anything to affect the trajectory of the object.


Elasto,
For #1:
Disclaimer: I'm not a supporter of self-driving cars that do not have manual override.
Unless there are people literally everywhere, there must be somewhere I can turn that would increase the chance of not killing anybody.
I would make the AI seek out the direction that will cause the least harm.
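
A minimal sketch of what "seek out the direction that will cause the least harm" could look like; the candidate maneuvers and the casualty estimates below are invented placeholders, not a real controller:

Code:
def least_harm_maneuver(options):
    """options: list of (maneuver, expected_casualties) pairs; pick the least harmful."""
    return min(options, key=lambda option: option[1])

candidates = [
    ("continue straight", 12.0),    # plow through the crowd at the crossing
    ("swerve onto sidewalk", 1.0),  # hit the single pedestrian
    ("engine-brake in lane", 3.5),  # hypothetical partial mitigation
]
print(least_harm_maneuver(candidates))  # -> ('swerve onto sidewalk', 1.0)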

For #2:
This is similar to the car scenario. The choice is not binary. I will steer and try to land the plane to minimize the loss of life.

I think the trolley equivalent is:
I am asked to flip the switch back and forth many times without knowing what the switch does. I do so and leave it at a random position. I am then shown the situation. What do I do?
Because I interacted with the switch, I have injected myself as a player in the system. I would then be responsible regardless of whether I flip the switch or not.
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
TheGrammarBolshevik
Posts: 4878
Joined: Mon Jun 30, 2008 2:12 am UTC
Location: Going to and fro in the earth, and walking up and down in it.

Re: Trolly Problem

Postby TheGrammarBolshevik » Sat Apr 11, 2015 2:44 am UTC

Cradarc wrote:First let me put out my definition of morality:
Morality describes the consistency of a thought or action with respect to a set of principles and values furnished by a judge.
The "judge" is the entity that is evaluating the morality.

Why assume that morality is like this? Isn't it worth considering the possibility that moral claims are true or false independently of what people think about them?
Nothing rhymes with orange,
Not even sporange.

User avatar
Forest Goose
Posts: 377
Joined: Sat May 18, 2013 9:27 am UTC

Re: Trolly Problem

Postby Forest Goose » Sat Apr 11, 2015 2:54 am UTC

Cradarc wrote:The reason is this premise: If I am responsible for an event, then that event will not occur if I did not exist.


A thousand people will die a horrible, slow, painful death by radiation poisoning unless you push a button; pushing the button does nothing but save the people. You walk away and do not push the button: did you do something wrong? (Note: This is not "What would you do?", but "Are you immoral if you do this?")

By your logic, you are not responsible - if you want to remain consistent with what is stated, as stated, you can, at most, say, "It would not be morally right"; you do not seem able to say, "It is morally wrong to not push it". So, would you agree that there is nothing morally wrong with just walking away in the above case?

Cradarc wrote:This idea should be pretty intuitive for anyone into scientific study. An object falls to the ground in the dark -> gravity is responsible. An object falls to the ground in sunlight, is sunlight now responsible? No. Sunlight is clearly part of the system now, but it is not doing anything to affect the trajectory of the object.


If sunlight had the option to stop the object in mid fall and consciously decided, "I will allow this to fall", then there is a good sense of the word "responsible" for which, "Sunlight is partially responsible for the object falling" that would be apt.
Forest Goose: A rare, but wily, form of goose; best known for dropping on unsuspecting hikers, from trees, to steal sweets.

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Sat Apr 11, 2015 3:02 am UTC

Forest Goose,
If I don't push the button, I am not responsible for the death of all those people. That doesn't mean I didn't act immorally. If you had cared enough to understand my position instead of trying to find fault in it, you would have remembered that.

At this point, I'm not even going to bother explaining it to you again.

Bolshevik,
Yes, morality could be independent of what people think. But as the debate demonstrated, there is no value in asserting that position because the natural question would be "What is this absolute morality?" Everybody has their own idea of what that morality is, but nobody can logically prove it is the absolute one.
It's like asserting time exists without motion or changes. Sure, it could be true (and in fact, I do believe in absolute time), but it has no physical value. That's why physics is happy with time being relative. That assumption lets you predict stuff.
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
Forest Goose
Posts: 377
Joined: Sat May 18, 2013 9:27 am UTC

Re: Trolly Problem

Postby Forest Goose » Sat Apr 11, 2015 3:14 am UTC

Cradarc wrote:Forest Goose,
If I don't push the button, I am not responsible for the death of all those people. That doesn't mean I didn't act immorally. If you had cared enough to understand my position instead of trying to find fault in it, you would have remembered that.

At this point, I'm not even going to bother explaining it to you again.


...Then your whole statement about "I'm not responsible" doesn't really matter; I'm not seeing how any of that precludes you from acting immorally, yet not being responsible, by not pulling the lever. In other words, if you are immoral but not responsible in the above case, that would seem to indicate that there is something more involved in the trolley case than what you just stated that makes it acceptable. (You're giving one reason, but that reason doesn't cover the very thing you're linking it with, since it doesn't preclude immorality in this case.) (And, for the record, nothing you've said establishes you aren't responsible in a meaningful moral sense.)

Cradarc wrote:Yes, morality could be independent of what people think. But as the debate demonstrated, there is no value in asserting that position because the natural question would be "What is this absolute morality?" Everybody has their own idea of what that morality is, but nobody can logically prove it is the absolute one.


Yes, quantum gravity could be independent of what people think. But as debates have demonstrated, there is no value in asserting that position because the natural question would be "What is this absolute quantum gravity"? Everybody has their own idea of what quantum gravity is, but nobody can logically prove it is the absolute one.

Shall I call the particle physicists and tell them the hunt is off, or shall you? (Are you aware that people make this argument, actually, about science for the same type of reasons...)
Forest Goose: A rare, but wily, form of goose; best known for dropping on unsuspecting hikers, from trees, to steal sweets.

Cradarc
Posts: 448
Joined: Fri Nov 28, 2014 11:30 pm UTC

Re: Trolly Problem

Postby Cradarc » Sat Apr 11, 2015 4:00 am UTC

Here's a question for everyone to consider:
Suppose a self-aware robot flips the lever very quickly and stops at a seemingly random time. Is the robot moral when it leaves the lever in the flipped position? Is it immoral when it leaves the lever in the unflipped position?

Forest Goose,
Okay, I concede that the decision in the trolley problem is not moral, in the sense that it does violate one of my moral principles. The trolley problem forces at least one of my moral principles to be broken, so I guess a better description is "less immoral". The less immoral of the two options is still technically not moral. You are correct.
However, it is the better course of action out of only two options to choose from. So in my world, an ideal, "moral" person would pick that option.

The truly moral thing to do would be to not kill anyone and save everyone that needs saving. That is simply not possible in any remotely realistic situation. A person always has to make sacrifices based on what he/she thinks is more important.
This is a block of text that can be added to posts you make. There is a 300 character limit.

User avatar
Qaanol
The Cheshirest Catamount
Posts: 3031
Joined: Sat May 09, 2009 11:55 pm UTC

Re: Trolly Problem

Postby Qaanol » Sat Apr 11, 2015 4:10 am UTC

Cradarc wrote:First let me put out my definition of morality:
Morality describes the consistency of a thought or action with respect to a set of principles and values furnished by a judge.
The "judge" is the entity that is evaluating the morality.

That is…not at all what most people mean by the word “morality”. A better word would be, well, “consistency”. To wit:

The set of principles I am given is the axioms of Peano arithmetic. My thought is “2 + 2 = 6”. Your definition would claim that I am immoral for thinking that.

But this mathematical situation is entirely outside the scope of what people mean by the word “moral”. That word is simply inapplicable to it.

And if I was given the principle “Humans are ruining the only planet Earth” and the values “Human = 0, Earth = 1” then your definition would call “moral” the action “Destroy humanity”. That is a serious zeroth-law violating problem. It does not reflect what people actually mean by “morality”, despite being entirely consistent.

It is important to use words to mean the things that those words actually mean, for the purpose of communicating meaning through words. When I stated what morality meant to me, I was essentially saying, “I have in my mind a concept of the meaning that is carried by the word ‘morality’ as it is generally used, and here is my best effort to describe that meaning clearly and cogently.”

Your definition is far too broad for what people mean by “morality”.
wee free kings

User avatar
ahammel
My Little Cabbage
Posts: 2131
Joined: Mon Jan 30, 2012 12:46 am UTC
Location: Vancouver BC
Contact:

Re: Trolly Problem

Postby ahammel » Sat Apr 11, 2015 4:31 am UTC

Cradarc wrote:Here's a question for everyone to consider:
Suppose a self-aware robot flips the lever very quickly and stops at a seemingly random time. Is the robot moral when it leaves the lever in the flipped position? Is it immoral when it leaves the lever in the unflipped position?

I was about to say that if the morally right thing is to pull the lever, then the morally right thing is not to decide at random whether or not to pull the lever. But then I noticed that the switch-flipping is only "seemingly" random. I suppose the answer would be different depending on whether the explanation for the robot's behaviour is "it was indecisive", "it wanted to freak out the people on the tracks", or "it was hoping to derail the trolley, thereby saving all six of them".
He/Him/His/Alex
God damn these electric sex pants!

morriswalters
Posts: 6553
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: Trolly Problem

Postby morriswalters » Sat Apr 11, 2015 11:10 am UTC

Qaanol wrote:That is…not at all what most people mean by the word “morality”.
Replace judge with "God".

elasto
Posts: 3052
Joined: Mon May 10, 2010 1:53 am UTC

Re: Trolly Problem

Postby elasto » Sat Apr 11, 2015 11:13 am UTC

Cradarc wrote:Elasto,
For #1:
Disclaimer: I'm not a supporter of self-driving cars that do not have manual override.
Unless there are people literally everywhere, there must be somewhere I can turn that would increase the chance of not killing anybody.
I would make the AI seek out the direction that will cause the least harm.

For #2:
This is similar to the car scenario. The choice is not binary. I will steer and try to land the plane to minimize the loss of life.

Cool. So if we re-describe the trolley scenario just very slightly:

- If you don't pull the lever there's an extremely high probability of five people dying
- If you pull the lever there's an extremely high probability of one person dying

Do you see how this is now an extremely close equivalent to the plane/car scenarios?

And yet do you see that performing an action with a ~99.9% probability of someone losing their life (steering towards the pedestrian on the pavement, steering the plane towards low-density housing - both of which you say you'd do) can't have a different morality to performing the same action with a 100% probability of someone losing their life (pulling the lever as originally posed)?

Or does that extra ~0.1% chance of no loss of life now somehow mean it's not 'you' responsible for the death but 'the universe' through 'bad luck' and so your action flips from 'immoral' to 'moral'?
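
For what it's worth, a back-of-the-envelope comparison of expected fatalities under that re-description, using the rough ~99.9% figure from above (my own arithmetic, not part of the original scenarios):

Code:
p_kill = 0.999                        # rough chance the people on the chosen track die
expected_if_do_nothing = 5 * p_kill   # ~4.995 expected deaths
expected_if_pull_lever = 1 * p_kill   # ~0.999 expected deaths
print(expected_if_do_nothing, expected_if_pull_lever)
# Essentially the same 5-versus-1 trade-off as the original, deterministic version.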

morriswalters
Posts: 6553
Joined: Thu Jun 03, 2010 12:21 am UTC

Re: Trolly Problem

Postby morriswalters » Sat Apr 11, 2015 1:08 pm UTC

TheGrammarBolshevik wrote:
Cradarc wrote:First let me put out my definition of morality:
Morality describes the consistency of a thought or action with respect to a set of principles and values furnished by a judge.
The "judge" is the entity that is evaluating the morality.

Why assume that morality is like this? Isn't it worth considering the possibility that moral claims are true or false independently of what people think about them?
Possibly true if you are considering a raison d'être for morals, but why for the moral claims themselves? The difference would appear to be analogous to the law of gravity and apples falling up. So not why are there moral claims at all, but rather, what moral claim is always true? The trolley problem is about how we choose, not whether either act is moral. All the answers are related to what we believe about what we know.

User avatar
slinches
Slinches get Stinches
Posts: 924
Joined: Tue Mar 26, 2013 4:23 am UTC

Re: Trolly Problem

Postby slinches » Sat Apr 11, 2015 4:33 pm UTC

elasto wrote:And yet do you see that performing an action with a ~99.9% probability of someone losing their life (steering towards the pedestrian on the pavement, steering the plane towards low-density housing - both of which you say you'd do) can't have a different morality to performing the same action with a 100% probability of someone losing their life (pulling the lever as originally posed)?

Or does that extra ~0.1% chance of no loss of life now somehow mean it's not 'you' responsible for the death but 'the universe' through 'bad luck' and so your action flips from 'immoral' to 'moral'?

I think from what I've seen so far, Cradarc's position (please correct me if I'm wrong) is that the difference between the scenarios is that in the plane and car scenarios the decision maker has already accepted the responsibility of deciding who lives and who dies. That isn't true in the Trolley Problem. By accepting that role, you become directly responsible for intentionally causing someone's death and that is an immoral act, no matter how many people are saved (ends don't justify the means). Therefore, the equivalent of the Trolley Problem is that you're a passenger in the crashing plane/car and the pilot/driver is somehow incapacitated. Is it immoral to not take the controls and try to minimize the damage?

My own opinion is not terribly different from that. I agree that recusing yourself from such decisions is morally neutral as long as someone else equally capable of performing the task is willing and available to step in.

elasto
Posts: 3052
Joined: Mon May 10, 2010 1:53 am UTC

Re: Trolly Problem

Postby elasto » Sat Apr 11, 2015 8:59 pm UTC

slinches wrote:I think from what I've seen so far, Cradarc's position (please correct me if I'm wrong) is that the difference between the scenarios is that in the plane and car scenarios the decision maker has already accepted the responsibility of deciding who lives and who dies. That isn't true in the Trolley Problem. By accepting that role, you become directly responsible for intentionally causing someone's death and that is an immoral act, no matter how many people are saved (ends don't justify the means). Therefore, the equivalent of the Trolley problem is that you're a passenger in the crashing plane/car and the pilot/driver is somehow incapacitated. Is it immoral to not take the controls and try to minize the damage?

Yes it is.

In all three scenarios you are the only person in a position to make a decision. Whether you 'accept responsibility of deciding who lives and who dies' is irrelevant: The responsibility has been forced upon you like it or not.

In all three cases it's an immoral choice to cause the death of innocents; however in all three cases it'd be more of an immoral choice to shrug your shoulders and permit the death of far more innocents.

The simple test is 'what would you prefer that someone else chose to do when forced into the same position?' Would you not agree that in all three cases you'd prefer that they took the action that results in the least loss of life?

Sometimes there is no right thing to do; there is only a least wrong thing.

