I was gonna just do the one but they do say it’s best to pay it forward when you can.
The loop continues until the entire human population is tied to the track and there’s nobody left to pass the switch to. Kill the scapegoat on round one and be done.
Oh, 100%. Fuck the next generation, I mean person.
What if I want to be the person down the line?
Welcome to climate policy.
That implies that if nobody tries to stop climate change, it’ll never destroy the world.
Double it. Then the other guy will double it, and so on. Infinite loop = no deaths.
And then there’s some psycho on round 34 who kills all 8 billion people alive on earth.
Eventually everyone is tied to the tracks and there’s no one left to change the trolley’s course.
Who said there was a limit?
That’s actually a really good dilemma if you think about it. Like, if everyone doubles it you basically don’t kill anyone. But you’ll always risk that there’s some psycho who likes killing, and then you’ll have caused even more deaths. And if these choices continue endlessly, you’ll eventually find someone like that. So killing immediately should be the right thing to do.
This is really the only answer. The only thing that makes it “hard” is having to face the brutality of moral calculus
Now, what if you’re not the first person in the chain? What if you’re the second one, or the nth one? What now? Would you kill two, or n, knowing that the person before you spared them?
The thing to do is kill now even if it’s thousands. Because it’s only going to get worse.
The best time to kill was the first trolley. The second best time to kill is now.
Yes, but it also kinda depends on what happens at and after junction 34, from which point on more than the entire population of earth is at stake.
If anything, this shows how ludicrously fast exponentials grow. At the start of the line it seems like there will be so many decisions to be made down the line, so there must be a psycho in there somewhere, right? But (assuming the game just ends after junction 34) you’re actually just one of 34 people, and the chances of getting a psycho are virtually zero.
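A quick back-of-the-envelope sketch of that point, assuming a world population of roughly 8 billion and that junction n has 2^(n-1) people tied to it (the numbers here are illustrative, not from the original post):

```python
# How many junctions can the game even contain before the track needs
# more people than exist? Assumes ~8 billion people and that junction n
# has 2**(n-1) victims tied to it.
WORLD_POPULATION = 8_000_000_000

junction, tied = 1, 1
while tied < WORLD_POPULATION:
    junction += 1
    tied *= 2

print(junction, tied)  # 34 8589934592 -- 2**33 already exceeds 8 billion
```

So there are only about 34 decision-makers in total, which is why the chain ends far sooner than intuition suggests.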
Very interesting one!
It’s not that interesting. If you rephrase the question as a choice between a good option and a less good one, it’s still barely even a choice.
“Would you rather have only one (or, say, trillions) die now, or would you like to allow *at a minimum* twice that many people to die the second we talk to a sadist?”
If you can’t choose the smaller number, all it means is that you lack moral strength - or the test proctor has put someone you know on the tracks, which is cheating. A highly principled person might struggle if choosing between their daughter and one other person. If it’s my kid versus a billion? That’s not a choice, that’s just needless torture. Any good person would sacrifice their kid to save a billion lives. I take that as an axiom, because anything else is patently insane.
Kill fewer people now is obviously the right answer, and not very interesting.
What is interesting is that the game breaks already at junction 34, which is unexpectedly low.
So a more interesting dilemma would have been: “would you kill n people now, or double it and pass it on, knowing the next person faces the same dilemma, but that once all humanity is at stake and the lever is not pulled, the game ends?” Because that would involve, first of all, figuring out that the game actually only involves 34 decisions, and then the dilemma becomes “do I trust the next 33−n people not to be psychos, or do I limit the damage now?” Even more interestingly, “limiting the damage now” makes you the “psycho” in that sense…
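For intuition on that trust question, here’s a throwaway Monte Carlo sketch of the reformulated game; the per-person psycho probabilities, the 34-junction cap, and the function name are all assumptions made up for illustration:

```python
import random

def average_deaths(p_psycho, junctions=34, trials=100_000):
    """Average death toll if everyone doubles unless they happen to be a
    'psycho' who pulls the lever. Junction n has 2**(n-1) people tied to
    it; if nobody ever pulls, the game ends after the last junction with
    zero deaths."""
    total = 0
    for _ in range(trials):
        for n in range(1, junctions + 1):
            if random.random() < p_psycho:
                total += 2 ** (n - 1)
                break
    return total / trials

print(average_deaths(0.01))   # even a 1% psycho rate gives an enormous toll
print(average_deaths(1e-9))   # a one-in-a-billion rate makes doubling look safe
```

The whole dilemma collapses onto a single question: how small do you believe p really is?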
Someday it reaches a person who thinks…
Well, 4 billion fewer people is better than someone being able to wipe out humanity…
(it would also solve many problems lol)
(and that point would be after 32 people had the choice…)
Meanwhile Thanos is on the third switch and very frustrated. (He would double it and pass it to the next person - there’s no point in killing four people when there’s a chance that the second-to-last guy might kill half of humanity.)
Thanos waiting patiently in line 💀
Eventually there might also be a track with no people on it so postponing the dilemma becomes much better than at least 1 death. But there is no way of knowing what the future dilemma might be.
At some point you will run out of people to tie to the tracks.
How many branches is that going to take? Just out of interest.
Only 33 or so (2^33 is already about 8.6 billion, more than the world population).
Continuously double it so that the trolley has as much room as it needs to brake to a complete halt, therefore killing 0 people.
But it only takes 1 idiot to ruin the whole thing.
The real questions are, “Who is fueling and piloting the trolley, and can we kill them?”
You gotta double it until it overflows to negatives, then you end up reviving billions of people!
Year 2k38, right?
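For what it’s worth, the overflow joke checks out if the death counter were a signed 32-bit integer (the same kind of wraparound behind the year-2038 problem with 32-bit time_t). A toy sketch, emulating the wrap since Python’s own ints never overflow:

```python
# Emulate a signed 32-bit counter the way a C `int` would wrap around.
def to_int32(x):
    x &= 0xFFFFFFFF                        # keep the low 32 bits
    return x - 2**32 if x >= 2**31 else x  # sign-extend

victims, doublings = 1, 0
while victims > 0:
    victims = to_int32(victims * 2)
    doublings += 1

print(doublings, victims)  # 31 -2147483648 -- the count wraps negative
```

So after 31 doublings the counter flips to about minus 2.1 billion: billions “revived”, at least on paper.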
You would need a crazy low probability of a lunatic or a mass murderer being down the line to justify not killing the one person.
Edit: Sum(2^n (1−p)^(n−1) p) ≈ Sum(2^n p) for small p. So you’d need p = 1/(2×2^32 − 2) ≈ 1/(8 billion) as the chance of any given person being a psycho for the expected values to be equal. I.e. there could only be a single person on earth, tops, who would decide to kill everyone.
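As a sanity check on that approximation (keeping the comment’s assumption of roughly 32 independent steps after the first junction; the function name is just illustrative):

```python
# Expected deaths if you pass the choice on, assuming each of the next
# n_steps people is independently a 'psycho' with probability p, and that
# the n-th step after you has 2**n people tied to its track.
def expected_deaths_if_passed(p, n_steps=32):
    return sum((2 ** n) * ((1 - p) ** (n - 1)) * p for n in range(1, n_steps + 1))

# Breakeven with killing the single person on the first track:
p = 1 / (2 * 2**32 - 2)              # the small-p estimate from the comment above
print(expected_deaths_if_passed(p))  # ~1.0, i.e. about one expected death either way
```

Any belief that psychos are more common than about one in eight billion tips the expected-value math toward pulling the lever immediately.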
You don’t even need a lunatic or mass murderer. As you say, the logical choice is to kill one person. For the next person, the logical choice is to kill two people, and so on.
It does create the funny paradox where, up to a certain point, a rational utilitarian would choose to kill and a rational mass murderer trying to maximise deaths would choose to double it.
It’s always “double it.” If someone at junction 34 flips the kill-all-humans switch, that’s their fault, not yours.
Why do you care whose fault it is? You’d want to minimise human deaths, not win a blame game.
Doubling it forever minimizes human deaths.
Unless someone decides to hit kill. In that case, it’s them doing it. I’m invalidating the argument that pre-empting imaginary future mass murders justifies killing one person today.
Idk which moral system you operate under, but I’m concerned with minimising human suffering. That implies hitting kill because chances of a mass murderer are too high not to. You also don’t follow traffic laws to a t, but exercise caution because you don’t really care whose fault it ends up being, you want to avoid bad outcomes (in this case the extinction of humankind).
My moral system somehow does not choose to kill people through action against an imagined threat, and is therefore objectively superior, as it is not susceptible to hostile memetic manipulation (Moloch, Pascal’s wager, Pascal’s mugging, basilisks, social hysteria, etc.) and is capable of escaping false choices and other contrived scenarios, breaking the premise and the rules of the game as needed to obtain the desired outcome.
Well, what about the fact that after 34 people the entire population is tied to the tracks? What are the chances that one person out of 35 wants to destroy humanity?
Also, tying the entire human population to the tracks is going to cause some major logistical problems; how are you going to feed them all?
Oh come on. A trolley is not going to have the momentum to kill that many people nor would the machinery make it through. The gears and whatnot would be totally gummed up after like 20 or so people.
Put the lever halfway and it crashes.
Attempting to subvert the thought experiment only makes things worse. The trolley is full of child prodigies, all future geniuses that will cure cancer and solve the world’s problems. By sticking the lever halfway you kill all of them. The only way to save the child prodigies is to choose, left or right.
You couldn’t even bother putting in adult scientists who have already helped the world. It’s a hypothetical scenario, you know, you can put in anyone you want. So I’m putting the child prodigies to a test by having them save themselves from the half-lever. Should be relatively easy for them.
Might hit the 2nd guy with a lever and the peeps behind him depending on speed.
It might. Still better odds.
Also interesting: What would you choose here if you were an evil psychopath? (Asking for an acquaintance.)
Switch the track from the bottom to the top as the train is half way over the switch, causing the train to drift across both rails hitting all three tied up people and the second switch operator.
This is not a purely theoretical question. In practice, autonomous vehicles face exactly this dilemma, or rather the manufacturers of the vehicles, who have to set the specifications.
I forget where it was from, but years ago I found an online survey from a university on autonomous cars and their decision making. It was all about deciding whether or not to swerve in a collision. All kinds of difficult encounters, like: do you hit the barrier and kill the passenger, or swerve and kill the old lady? Do you hit the thin person, or swerve and hit the heavier person?
I’ve never seen a survey drill down into biases quite so deeply.
I did this as a part of our ethics discussion.
My eventual answer was that you always kill the non-driver, as no one would ever buy a car that would kill them over someone else.
Easy. Prioritize who is saved based on social credit score.
What does “double it” mean? Double what?
Can I move the rails to kill them all and then circle around and hit me?