You’d already read about Philip Tetlock’s experiments on people asked to trade off a sacred value against a secular one, like a hospital administrator who has to choose between spending a million dollars on a liver to save a five-year-old, and spending the million dollars to buy other hospital equipment or pay physician salaries. And the subjects in the experiment became indignant and wanted to punish the hospital administrator for even thinking about the choice. Do you remember reading about that, Harry Potter? Do you remember thinking how very stupid that was, since if hospital equipment and doctor salaries didn’t also save lives, there would be no point in having hospitals or doctors? Should the hospital administrator have paid a billion pounds for that liver, even if it meant the hospital going bankrupt the next day?
“Shut up!” the boy whispered.
Every time you spend money in order to save a life with some probability, you establish a lower bound on the monetary value of a life. Every time you refuse to spend money to save a life with some probability, you establish an upper bound on the monetary value of a life. If your upper bounds and lower bounds are inconsistent, it means you could move money from one place to another, and save more lives at the same cost. So if you want to use a bounded amount of money to save as many lives as possible, your choices must be consistent with some monetary value assigned to a human life; if not, then you could reshuffle the same money and do better. How very sad, how very hollow the indignation, of those who refuse to say that money and life can ever be compared, when all they’re doing is forbidding the strategy that saves the most people, for the sake of pretentious moral grandstanding…
You knew that, and you still said what you did to Dumbledore.
You deliberately tried to hurt Dumbledore’s feelings.
He’s never tried to hurt you, Harry Potter, not once.
Harry’s head dropped into his hands.
Why had Harry said what he’d said, to a sad old ancient wizard who’d fought hard and endured more than anyone should ever have to endure? Even if the old wizard was wrong, did he deserve to be hurt for it, after all that had happened to him? Why was there a part of him that seemed to get angry at the old wizard beyond reason, lashing out at him harder than Harry had ever hit anyone, without thought of moderation once the rage had been raised, only to quiet as soon as Harry left his presence?
Is it because you know Dumbledore won’t fight back? That no matter what you say to him, however unfair, he’ll never use his own power against you, he’ll never treat you the way you treat him? Is this the way you treat people when you know they won’t hit back? James Potter’s bullying genes, manifesting at last?
Harry closed his eyes.
Like the Sorting Hat speaking inside his head –
What is the real reason for your anger?
What do you fear?
A whirlwind of images seemed to flash through Harry’s mind, then, the past Dumbledore weeping into his hands; the present form of the old wizard, standing tall and terrible; a vision of Hermione screaming in her chains, in the metal chair, as Harry abandoned her to the Dementors; and an imagination of a woman with long white hair (had she looked like her husband?) falling amid the flames of her bedroom, as a wand was held upon her and orange light reflected from half-moon glasses.
Albus Dumbledore had seemed to think that Harry would be better at that sort of thing than him.
And Harry knew that he probably would be. He knew the math, after all.
But it was understood, somehow it was understood, that utilitarian ethicists didn’t actually rob banks so they could give the money to the poor. The end result of throwing away all ethical constraint wouldn’t actually be sunshine and roses and happiness for all. The prescription of consequentialism was to take the action that led to the best net consequences, not actions that had one positive consequence and wrecked everything else along the way. Expected utility maximizers were allowed to take common sense into account, when they were calculating their expectations.
Somehow Harry had understood that, even before anyone else had warned him. Before he’d read about Vladimir Lenin or the history of the French Revolution, he’d known. It might have been his earliest science fiction books warning him about people with good intentions, or maybe Harry had just seen the logic for himself. Somehow he’d known from the very beginning, that if he stepped outside his ethics whenever there was a reason, the end result wouldn’t be good.
A final image came to him, then: Lily Potter standing in front of her baby’s crib and weighing the possible outcomes: the final outcome if she stayed and tried to curse her enemy (dead Lily, dead Harry), the final outcome if she walked away (live Lily, dead Harry), computing the expected utilities, and making the only sensible choice.
She wouldn’t have been Harry’s mother if she had.
“But human beings can’t live like that,” the boy’s lips whispered to the empty classroom. “Human bei