Question 1: If forced to choose, which of these nightmare scenarios would you prefer?
Scenario A: An evil alien flips a coin. If it comes up heads, he destroys all human life; otherwise he goes home.
Scenario B: The same evil alien flips 7 billion coins, one for each person on earth. He destroys anyone whose coin comes up heads.
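One way to see why the choice is genuinely hard: the two scenarios kill the same number of people on average, and differ only in how the risk is spread. A toy simulation makes this concrete (the population size, seed, and trial count below are arbitrary stand-ins; 7 billion flips would just be slower):

```python
import random
import statistics

random.seed(42)
N = 1000        # toy population, standing in for 7 billion
TRIALS = 2000   # number of times we replay each scenario

def scenario_a():
    # One coin decides everyone's fate: all die or none do.
    return N if random.random() < 0.5 else 0

def scenario_b():
    # One coin per person: each dies independently with probability 1/2.
    return sum(random.random() < 0.5 for _ in range(N))

a = [scenario_a() for _ in range(TRIALS)]
b = [scenario_b() for _ in range(TRIALS)]

print(statistics.mean(a), statistics.mean(b))    # both near N/2 = 500
print(statistics.stdev(a), statistics.stdev(b))  # A swings wildly; B barely moves
```

Expected deaths are identical (half the population either way), but Scenario A is all-or-nothing while Scenario B reliably kills almost exactly half. Whether you prefer A or B is therefore a pure question of how you weigh extinction risk against distributed risk, which is exactly the lever Landsburg is pulling.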
…It is pretty much impossible to take a coherent stand on issues ranging from Social Security reform to environmental conservation without first deciding how much we are obligated to care about future generations.
What the economics professor, all-around smart person and serial polemicist Steven Landsburg means is that it is “impossible to take a coherent stand” without first having answered the above question about the alien. Of course, Landsburg is correct, but (like a lot of things he is correct about) it makes me crazy to think so. And I can’t get more than a few minutes into thinking about his [expletive] alien question without my head starting to hurt. If I were, in fact, actually standing before an all-powerful alien and answering a question like that, it would be even worse (i.e., pants starting to be soiled, etc.), so how can my answer now provide any insight into my views on public policy?
Vanity Fair ran a poll question not long ago, asking whether you would kill a beloved pet for a payment of $1 million. Economists — and I am assuming/including Landsburg — believe that if you answer, “No, I wouldn’t,” you are (absurdly) revealing that you value your pet at more than $1 million.
It is one thing to boil a decision down to the lean, hard numbers and cruelly assess the math. But to me it is another pair of ruined trousers.
What is the basis for our assumptions that people’s characters are unified, and that their behavior in one context will resemble their behavior in other contexts?