Wonkmetrics, 3-31 to 4-6

Ezra Klein lists top op-eds in each morning’s Wonkbook. I’m collecting the authors and sources. I’ll be doing this weekly and keeping running totals. Here’s this week’s, now with prettier formatting!

Click the first chart to enlarge.




Also, here’s a link to running totals with authors and sources in different sheets, which are too big to display conveniently.
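For anyone who wants to keep a similar tally, here's a minimal sketch of the bookkeeping using Python's `collections.Counter`. The names in the sample are made up for illustration, not this week's actual data:

```python
from collections import Counter

# Hypothetical sample of (author, source) pairs pulled from a week of
# Wonkbook op-ed links; names here are illustrative, not real counts.
mentions = [
    ("Paul Krugman", "NYT"),
    ("David Brooks", "NYT"),
    ("Evan Soltas", "Bloomberg"),
    ("Evan Soltas", "Bloomberg"),
    ("Paul Krugman", "NYT"),
]

def tally(pairs):
    """Return per-author and per-source counts from (author, source) pairs."""
    authors = Counter(author for author, _ in pairs)
    sources = Counter(source for _, source in pairs)
    return authors, sources

authors, sources = tally(mentions)
print(authors.most_common())
print(sources.most_common())
```

Running totals then fall out for free: `Counter` objects add, so each week's tally can just be summed onto the previous total with `+`.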

No notes this week because the temptation to generalize after a small sample is too overwhelming. TOO OVERWHELMING


Wonkmetrics, 3-23 to 3-29

Ezra Klein lists top op-eds in each morning’s Wonkbook. I’m collecting the authors and sources. I’ll be doing this weekly and keeping running totals. Here’s this week’s:

(Click the second chart to enlarge)




Krugman and Brooks are probably the best-known liberal and conservative bloggers out there, so it may be interesting to see going forward whether they’re linked to more often than other bloggers/opinionators. Soltas has the only other two, and he helps assemble Wonkbook, so that’s interesting. Also, apparently if you just read NYT and Bloomberg op-eds each morning, you’ll have a pretty good grasp of the best op-eds of the week, or at least of what you’ll say were the best op-eds of the week when you want to impress Ezra Klein at parties.

Next week’s will add running totals. Maybe after a few weeks I’ll make a publicly visible gdoc so people can access this very important project.

PS – Dibs, dudes.


Three Laws Safe

This is an Economist article about the ethical obligations robotics programmers have.

There’s a lot embedded in it.

Starting at the beginning, the article refers to HAL’s “problem” in 2001: A Space Odyssey; to fulfill his obligations to keep the mission secret from the humans onboard and to fulfill the mission itself, he decides to kill them. The blogger claims this shows that “Society needs to find ways to ensure that they are better equipped to make moral judgments than HAL was.” The HAL comparison, and the later argument that robots may need to act on more than pre-defined rules, argue in favor of a heuristic approach, but there’s a good reason to doubt the value of heuristics: we’ve been trying them (kind of, at least if you believe that even the pure-reason Enlightenment philosophers still took The World as an experiential input before they started reasoning) for centuries, and we still don’t know what ethics to teach our robots.

Referring to HAL’s decision as a “problem” in his programming is unfair, though (and makes any balance the writer shows later in the article moot). It’s clear that the writer believes robots should have the same implicit standards — killing is bad, theft is bad, willful deception is bad — that we do, while one of the biggest advantages of machine intelligence could be its ability to think either without any intuitive biases or with a different set of them. Oh, critics of philosophical thinking can never stand outside the philosophical system they hope to examine? That wouldn’t need to be true if we weren’t so obsessive about making sure robots thought like we do. The issue, of course, is that we’d teach robots sets of biases that, even if different, would take our own as reference points, so their alternative biases would be heavily filtered through our metabias (which I guess already means something in statistics). We could mitigate that problem with programming groups diverse in occupation and ethical background, and with combinatorial ethical programming of thousands of robots.

The above assumes that we even ought to teach robots ethics. While teaching a robot that, to use another example, there are ethical complexities involved in destroying a house with three terrorists and one civilian inside (or any non-zero quantities of terrorists and civilians) may be difficult, it’s less unreasonable to expect a robot to understand political complexities. A robot may be able to answer the question “Should we stop using robots that randomly kill children?” with something better than a snarky “Accidentally bombing children with our super army of automated missile-firing bots may have worked at the start of the war, but conditions on the ground have changed.” With enough complexity programmed into its rules — a reasonable goal, however amorphous “enough” may be, given the improvements in processing power, memory density, etc. since 2001: ASO — drones and other combat robots could just as easily determine that particular attacks aren’t necessary.

Let’s go back to the three terrorists and a civilian inside a house example. If, for example, the long-run security or strategic backlash of killing the civilian outweighed the risk of attack from those three terrorists, the robot would not attack. Better still, robots don’t condition their risk estimations in an environment of fear; with realistic probabilities programmed in, instead of the sweeping fear that every terrorist will blow up the White House tomorrow if we don’t kill him today, the drones would decide that most attacks just aren’t worth it.
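The decision rule described above can be made concrete with a toy expected-cost comparison: strike only when the expected benefit of removing the threat clearly outweighs the expected civilian and backlash costs. Every number, name, and threshold below is invented for illustration; this is a sketch of the idea, not anyone's actual targeting logic:

```python
# Toy sketch: don't strike unless expected benefit beats expected cost
# by a safety margin. All values are invented, unitless illustrations.

def should_strike(threat_value, p_attack, civilian_cost, p_civilian_death,
                  backlash_cost, margin=1.5):
    """Return True only if expected benefit exceeds expected cost * margin."""
    expected_benefit = threat_value * p_attack
    # Civilian harm and long-run backlash both scale with the chance
    # a civilian dies in the strike.
    expected_cost = (civilian_cost + backlash_cost) * p_civilian_death
    return expected_benefit > margin * expected_cost

# Three terrorists, one civilian: with a realistic (low) attack
# probability, the rule passes on the shot.
print(should_strike(threat_value=100, p_attack=0.05,
                    civilian_cost=80, p_civilian_death=0.9,
                    backlash_cost=60))
```

The point of the sketch is the one the post makes: with realistic probabilities plugged in rather than fear-inflated ones, `p_attack` is small and the rule declines most strikes.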

This advantage would show up in two places. First, drones with clear decision rules would be better able to react to an enemy’s being away from civilians. Rather than “minimizing” civilian casualties, an intelligent robot could eliminate them. If it takes a certain consistent amount of time for a high-ranking enemy’s vulnerability to be communicated to those who can authorize an attack, enemies can plan travel between inhabited areas that takes no longer than that window. That strategy would leave shot-callers with a continuous choice between killing civilians and passing on the shot, and considering the surge in drone strikes during the Obama administration (it looks like we got the tough-on-terror president after all), passing on the shot is unlikely. It’s easy, then, with the human element, for enemies to erode whatever ethical mandate the US may have had at the beginning of an operation.

The second place the advantage shows up is the avoidance of stupid battles. Never mind who fired first in Fallujah; the claim that drones/robot combatants “would not commit rape, burn down a village in anger or become erratic decision-makers amid the stress of combat” is important. A drone with decision rules would not have opened fire on a crowd of civilians in Fallujah because, as far as I can tell, there was nothing important in Fallujah. Then, with nothing important to defend, that same drone would not have entrenched and fought an eight-month battle. During the eight-month battle that the drone wouldn’t have fought, it also wouldn’t have tied up 21 Iraqis, blindfolded them, cut off their legs, executed them, and thrown them in a mass grave (again, if that is what occurred, but even if it is not, think of it as a stand-in for other atrocities).

I’m not thrilled that we’ve killed an important someone else by drone strike, especially because I feel like we kill Al Qaeda’s number two in Afghanistan once every three months. But if we’re going to continue to “fight terror” in Afghanistan, I guess I prefer doing so with five-foot Hellfire missiles to stationing tens of thousands of trained killers from a different culture on the ground and asking them to please only shoot at bad guys.

The flip side of all of this, of course, is that robots that keep risk calculations for everyone in the world and kill anyone whose risk to some stable order or particular security interest exceeds acceptable levels would be the most terrifying thing that has ever been. We’d basically set ourselves up for this:

But that doesn’t mean we can’t hope for this:

The blogger makes two (boring) recommendations for improving robot ethics. First, laws about accountability, programmers, etc., etc. Ok, we get it, accountability is big right now. Why we would introduce accountability in this ethical grey area when it’s noticeably absent in so many others (private equity, for instance, or US fopo) remains a mystery, but whatever, accountability is fine. Second, “where ethical systems are embedded into robots, the judgments they make need to be ones that seem right to most people,” you know, because usually when people agree on a thing it’s ethically justified instead of just democratically justified. For apparently being into high-minded ethical rules, the blogger took the easy way out on that one.

What’s exciting about the blog post isn’t particular points though as much as the reality that we live in a world in which it’s important to think about what kind of ethics we teach our robots. Imagine an autonomous one of these following Aristotle around. Admittedly, it would probably have to follow around Žižek or Sandel today and would thus only learn how to be exceptionally angry or impressed with its own idea of distributive justice, but that would just be the beginning. Teach two autonomous robots Rawls and Sandel, two Badiou and Žižek, and two Dworkin and Nozick, leave them in a room, write down the synthesis they produce, and (go away, Gödel) suddenly, the question of what the result of such an attempted reconciliation of beliefs would be becomes answerable. I won’t say “knowable” because there are some obvious biases towards particular types of knowledge present, but even answerable would be a great leap forward.

Sober Second Thought

I was shelving books (kind of) and started at the beginning of the alphabet. It’s not really a fair trick to play. I started with Albee and Atwood and Austen and Bronte and Dickens and Dostoevsky and Faulkner and Hemingway and Hesse and Huxley and thought man, writing a novel, right? That is the way to make yourself known as a thinker. I should write a novel. It is, in fact, embarrassing that I haven’t written a novel, or even anything that I can later in life turn into a modern-day Of Human Bondage. Then I reached the end of the alphabet. Jennifer Weiner is at the end of the alphabet.

I’d wanted to participate in a tradition in which the author of such gems as Good in Bed and Then Came You is a modern exemplar.

There are, what, two kinds of novels? “Serious” and “pop?” Forget it, I’m not interested anymore. If I write a pop novel, I have to convince 17 million 14 year-old girls that it’s good so that they’ll convince their older sisters it’s good so that they’ll convince their boyfriends to see it in theatres when Paramount decides it simply must have the rights so I can release a new version of the book within two years with no significant changes except that the reader, previously burdened with the task of imagining the images described in the text, can now rely on the dramatically lit actors/actresses on the front cover. I must seduce millions of tweens with vapid language and transparent re-imaginings of old stories disguised as new stories. The alternative is the seduction of millions of middle-aged women by writing bondage into tired romance narratives and throwing a pair of handcuffs in monochrome on the cover. Or vampires. There are always vampires. There’s a thriving category of lit right now that could be summed up “holy hell, women like sex and have personalities?”

Serious novels are written to be hard. Jonathan Franzen nails the “serious novels” crowd’s rationale:

difficulty tends to signal excellence; it suggests that the novel’s author has disdained cheap compromise and stayed true to an artistic vision. Easy fiction has little value, the argument goes. Pleasure that demands hard work, the slow penetration of mystery, the outlasting of lesser readers, is the pleasure most worth having.

But then, having read the serious novel, the crowd arrives at an Idea, or perhaps at a Non-Idea, or perhaps at an antithesis through which they’re supposed to work to become Better People (or at least Better Read People) who can express the Idea/Non-Idea in a few sentences that will Change Their (Well Read) Friends’ Lives. The obvious question is why not just write that idea?

Here’s one idea: hard work and the pursuit of excellence do not guarantee fulfillment. Here’s another: personal doubts can undermine satisfaction with publicly appreciated work. Here’s a third: ethics is malleable, and quickly pronouncing judgment on a system too complicated to understand quickly will often result in error. You want 500 plus pages of these? Here’s a method: let’s start with a child so the main character can lose, regain, and lose again his innocence, find himself, and Learn an Important Lesson (looking at you, Philip Carey). Please.

Poets know what they’re doing. I don’t mean like T.S. Eliot. I don’t mean people who need forty pages of notes and six languages to explain that they’re smarter than you are. I don’t really know who I mean. Maybe Keats. Maybe Renee Gladman.

I’m not sure where this leaves me. I still prefer harder books. Life of Pi and its ilk aren’t going to become satisfying suddenly just because I’ve survived an hour of believing that long fiction may not be worth aspiring to write, but it’s not like I can deliberately switch to being open-minded about pop books. While I’m not convinced that long, hard fiction is inherently good for lit, I’m still less convinced that pop/deliberately easy fiction isn’t inherently bad for it.

My new opinion about books is that I’d like to open a bar called “Artless Bastards.” It will be a lending library, except instead of filling out a form or keeping up with records in a computer, the condition for borrowing a book will be leaving one in its place. That rule should solve (mitigate) assurance of quality problems and art oligopoly problems.


I resumed reading Roberto Bolaño’s The Secret of Evil, a collection of his unfinished work (unfinished work is the way to go), and just found this:

When it comes to [Osvaldo] Soriano, you have to have a brain full of fecal matter to see him as someone around whom a literary movement can be built. I don’t mean he’s bad. As I’ve said: he’s good, he’s fun, he’s essentially an author of crime novels or something vaguely like crime novels, whose main virtue — praised at length by the always perceptive Spanish critical establishment — is his sparing use of adjectives, a restraint lost, in any case, after his fourth or fifth book. Hardly the basis for a school. Apart from Soriano’s kindness and generosity, which are said to be great, I suspect that his sway is due to sales, to his accessibility, his mass readership, although to speak of a mass readership when we’re really talking about twenty thousand people is clearly an exaggeration. What Argentine writers have learned from Soriano is that they, too, can make money. No need to write original books, like Cortázar or Bioy, or total novels, like Cortázar or Marechal, or perfect stories, like Cortázar or Bioy, and no need, especially, to squander your time and health in a lousy library when you’re never going to win a Nobel Prize anyway. All you have to do is write like Soriano. A little bit of humor, lots of Buenos Aires solidarity and camaraderie, a dash of tango, a worn-out boxer or two, an old but solid Marlowe. But, sobbing, I ask myself on my knees, solid where? Solid in heaven, solid in the toilet of your literary agent? What kind of nobody are you, anyway? You have an agent? And an Argentine agent, no less? (69-70)

At least, apparently, if I have doubts about literature and its production, I have company in Bolaño. That’s comforting, in a way.

Shock me, shock me, shock me with that deviant behavior

I’m officially joining the stand against Elsevier by vowing not to contribute any math essays to their journals. I’m not qualified to contribute math essays to any journals, but my unqualified abstention will start with them.

Tim Gowers, a mathematician at Cambridge, started something by accident which turned into general internet outrage (see the comments on the previous link) which turned into a petition (because those work) which turned into a response letter from Elsevier which turned, I don’t know, back into a pumpkin at midnight.

Gowers is frustrated that Elsevier charges a lot of money for access to journals, specifically journals like Chaos, Solitons and Fractals that he and other mathematicians laugh about. While the usual answer to someone complaining about expensive things they don’t want would be “don’t buy it, stupid,” Gowers alleges that, due to Elsevier’s all or nothing journal bundles, libraries and institutions don’t have a choice. That sounds pretty awful.

Elsevier contends that they have “actively and progressively promoted a wide range of access options, which are important since no one model will ever be the only solution for every type of journal” and that when institutions do bundle, “they get substantial volume discounts that offer more titles at a lower cost. And the additional titles they subscribe to are used by their researchers. In fact, on average approximately 40% of researchers’ usage is of journal titles that the library previously had not subscribed to.”

So, you know, this is nice. The little guy, this poor Cambridge mathematician, is miffed that someone would use institutional power and prestige to restrict access to academia. They price out other publishers and decrease options for libraries with their bundling, while he prefers a more open, broader discourse where libraries don’t have to purchase Chaos, Solitons, and Fractals if they don’t want to.

I was ready to feel righteously indignant when I started reading Gowers’s post, but at this point I feel like a lot of his rage is rooted in libraries having to purchase journals he thinks are stupid. C, S, and F did have a problem where one of its editors was using the journal to publish hundreds of his own papers without proper review, but that point, which is completely fair as a reason not to respect the journal, somehow evades mention in Gowers’s comment that the journal “is regarded as a joke.”

The big question here years ago would have been whether research should be anarchic or should have standards. The anarchic model, at the time Feyerabend wrote, would have been nearly impossible. Journals were an important way for academics to keep up with the general trends of their fields without having to receive forty-three heavy-duty envelopes every month from far-flung places and incur thousands in postage costs. Now, though, we have the internet and blogs. We can and should move as much of academic discourse as possible to openly accessible fora.

We already, by the way, have a peer review process in the blog/comment/counter-blog discourse model. When Paul Krugman blogs, that anyone who reads his posts can comment on them doesn’t cheapen the dialogue because Krugman isn’t obligated to reply to every comment. Instead, when one particularly piques his interest (or, more likely, presents an opportunity for him to tell the rest of the world how poor was the commenter’s understanding of economics), he writes a new blog post taking on the assertions made in the comments.

Krugman also dives into debates with other bloggers; this CNBC piece covers the MMT Blog War pretty well. The open framework of the internet didn’t lead to Paul Krugman’s being so overwhelmed with nutjobs that he couldn’t have a serious academic discussion. Instead, the two sides shot back and forth with an immediacy that is impossible in refereed journals.

Feyerabend suggests that “Confusionists and superficial intellectuals move ahead while the ‘deep’ thinkers descend into the darker regions of the status quo or, to express it in a different way, they remain stuck in the mud.” Truly open discourse in which “confusionists and superficial intellectuals” have the opportunity to approach mainstream academic problems from odd angles and expect replies when they get something… not “right,” but interesting should keep the deep thinkers from remaining stuck in the mud; the deep thinkers’ replies, meanwhile, should alleviate some confusionism and superficiality.

We should reimagine accessibility in academia. At least eighty percent of the world can read; let’s give them access to something better than The Hunger Games and The Kite Runner.

In Which J. Edgar Hoover Hosts a Dinner Party, Scene I

[A knock at the door. J. Edgar Hoover, played by Stephen Fry or at least someone who looks and acts
like Stephen Fry, is setting the table. It’s also allowable that someone whose mannerisms and
appearance are in no way similar to Stephen Fry’s could play this role. Whatever he looks like, J. Edgar
is dressed in a green jumpsuit with the hammer and sickle on the left breast. Everyone else, with the
exception of Jackie Kennedy and the chef, will be dressed similarly. Jackie gets to wear a dress. The
chef should meet the description he is given on his entrance. The table setting is ORNATE, of course.
J. Edgar keeps switching two place settings back and forth until Dwight D. Eisenhower runs around the
table. The two he is switching are his own and Joseph McCarthy’s. The place settings for Jackie and
John Kennedy face away from the audience.]

J. Edgar: Come in!

Dwight D. Eisenhower [from behind the door]: How do you know who it is?

J. Edgar: The same way I know everything, just come in!

[Eisenhower enters]

Eisenhower: Sorry I’m early, you know how I am about these things.

J. Edgar [looking frustrated with the two place settings]: Preemptive attendance, first strike capabilities,
yes, I’ve heard all about it.

Eisenhower: What’s the matter with that plate?

J. Edgar: The plate?

Eisenhower: Yes, the plate.

J. Edgar: Which?

Eisenhower: The one you keep fiddling with!

J. Edgar: I just, oh it’s nothing, don’t worry about it.

Eisenhower: Whom did you invite this evening?

J. Edgar: Sacco and Vanzetti.

Eisenhower: [laughs] No, but really.

J. Edgar: Sacco and Vanzetti and John Wilkes Booth.

Eisenhower: Cut it out, you know I’m sensitive about that kind of thing.

J. Edgar: [laughs] Not at all. You can read though, the names are all on the place cards, can’t you tell?

Eisenhower: And if I switch these two?

J. Edgar: Don’t.

[Eisenhower picks them up and starts prancing around the table like a drum major]

J. Edgar: Dwight please, you’ll shake the microphones right out!

Eisenhower: Of course. I’m terribly sorry.

J. Edgar: It’s alright, just put them back where they were.

[Eisenhower replaces the place cards]

Eisenhower: John coming tonight?

J. Edgar: Yes, tragically.

Eisenhower: You ever worry he’s a commie?

J. Edgar: Dwight, I worry everyone is a commie.

[The phone rings. J. Edgar picks it up]

J. Edgar: Hello this is J. Edgar, may I say who’s calling?

Eisenhower: I’ll just run to the little boys’ room. [Exits opposite where he entered]

J. Edgar: Oh Jackie it’s just too droll that you should call right now.

Jackie: [from offstage] Oh really? Why’s that?

J. Edgar: Well wouldn’t you know, I just got the surveillance set up on your house moments ago, and
the feed’s coming through great.

Jackie: Oh J. Edgar, you didn’t!

J. Edgar: I did!

Jackie: Well, J. Edgar, you’re always up to those tricks of yours. Anyway John and I are on our way.
He’s out in the limousine now. Says we should leave early so that if anyone knew where we
were going they’d get there too late. I just don’t know where he gets these ideas about bad things
happening to people in limos!

J. Edgar: I know, dear. He really is just so concerned all the time about these ideas of his, and civil
rights, and I just don’t know how you put up with it.

Jackie: Well he has a kind soul.

[Eisenhower returns, again from the opposite side]

Eisenhower: Would you believe, I went to the little boys’ room and there was actually a little boy in

Jackie: What was that?

J. Edgar: Nothing dear.

Jackie: Was that Dwight D.? You didn’t tell us you invited him! He’s such a dear. We’ll be right there.

[She hangs up]

J. Edgar: What’s that you said? A little boy in the bathroom?

Eisenhower: Well yes. It was very odd. What’s he doing there?

J. Edgar: I certainly have no idea. You know if it’s not the communists, it’s the queers. I’m sure one of
those groups must be to blame.

[A knock on the door again]

J. Edgar: Come in!

McCarthy: Well hello, J. Edgar. Nice of you to invite me.

J. Edgar: Joseph, Joseph, you always sound so surprised to have been invited to social events.

McCarthy: [ignoring him] Dwight, good to see you.

Eisenhower: Joseph.

J. Edgar: Would either of you like to hear the schedule for the evening?

Eisenhower: Say that’d be swell.

McCarthy: Sure, J.

J. Edgar: Well, we’re still waiting on the Kennedys, but once they arrive I thought we’d start with
appetizers. I made sure that all of the food was from non-Communist countries, Joseph, so we
won’t have another incident like that other time.

McCarthy: Thanks, J. That was thoughtful of you.

J. Edgar: Anyway then I thought to fill the time between appetizers and dinner we could interrogate
another suspected communist I found.

McCarthy: Who’s this one?

J. Edgar: He was a grocery boy with whom Julius and Ethel Rosenberg occasionally communicated. I
think some of their notes may have been a code.

McCarthy: Can I see ’em?

J. Edgar: Yeah, sure. [He pulls some sheets of paper out of his pocket and hands them to McCarthy]

McCarthy: These are dated even! Look, several weeks in a row: “Leeks, onions, celery” in all of ’em.
You think that means something?

Eisenhower: It is not by deciding whether it means something that we stay prepared, but by staying
prepared that we decide whether it means something.

McCarthy: By Jingo, Dwight, that’s a mouthful.

Eisenhower: And we, Joseph, are prepared.

McCarthy: Ok, so we’re prepared, so does it mean something?

Eisenhower: I don’t honestly know.

J. Edgar: I haven’t told you the best part.

McCarthy and Eisenhower, more or less in unison: Yeah? What is it? [and improvise as many other
enthusiastic or otherwise intrigued expressions they can invent on the spot]

J. Edgar: Would you like to guess how long I’ve held him here?

McCarthy: A year.

Eisenhower: Three years.

J. Edgar: He was born March 22, 1939.

McCarthy: No.

Eisenhower: I can’t believe it.

J. Edgar: and I detained him on March 22, 1938.

McCarthy: No but I really can’t believe that.

Eisenhower: That has to break some kind of physical law.

J. Edgar: Joseph can tell you all about how laws change with interpretation.

McCarthy: Or, really, if you haven’t read them.

[J. Edgar cracks up]

McCarthy: What?

J. Edgar: I just had the funniest image of a coyote running through the air over a canyon, held up only
by his ignorance of gravity.

Eisenhower: What could he have done before he was born though?

J. Edgar: The coyote? Probably nothing.

Eisenhower: The detainee.

J. Edgar: Right. Well I had it on good authority that he would have been a Communist, had he been
born into freedom, so I detained him to celebrate my fourth anniversary in office.

Eisenhower: But you detained him on your third –

McCarthy: He wasn’t even born on your third anniversary in office.

J. Edgar: Gentlemen, I do not take the duties and privileges of my position lightly.

McCarthy: I’ve never been so impressed with you.

Eisenhower: What’s for dinner though?

J. Edgar: I’m sorry?

Eisenhower: You said we’d interrogate him between the appetizers and dinner. What’s for dinner?

J. Edgar: Oh right. Well my chef [he gestures toward the door into which Eisenhower disappeared to
use the restroom and a shirtless, bronzed, and ripped man emerges and walks across the stage
smiling and waving, exiting on the opposite side. While he is still on the stage, no one else
moves] is making his specialty.

Eisenhower: Which is?

J. Edgar: It’s a state secret! [All laugh] No really it’s some Chinese chicken dish.

Eisenhower: Oh good, I simply cannot stomach beef after a good interrogation.

J. Edgar: From before all of this Mao Zedong bullshit, too.

McCarthy: I was worried.

J. Edgar: I assure you, the recipe dates from the opium days.

McCarthy: I loved the opium days.

[The Kennedys enter without knocking. John takes a moment to beam at the audience before following
Jackie upstage]

Jackie: [teasingly] Did I just hear Joseph McCarthy say he loves opium?

McCarthy: Now Jackie, I know you won’t let that leave this room!

Jackie: Because you scare the shit out of me! Really Joe, how are you?

McCarthy: Good, good. John? How are you?

Eisenhower: Rested I hope. As long as you two took to get here, we thought someone had been

Jackie: You would not believe the traffic on the highway. Whose idea was this highway system if we
can’t get anywhere on it anyway?

John: Jackie, come on now, shouldn’t we be more diplomatic? He’s the president, after all, he has
to make a lot of hard choices.

Eisenhower: It’s true, John. You know, if you play your cards right, you could be president someday

John: You sound like my father.

McCarthy: Anyway, now that we’re all here, should we start with the appetizers?

J. Edgar: I should tell you all, one of the plates has a camera in it, but I won’t tell you which one.

McCarthy: Only one?

Eisenhower: J. Edgar, how are we supposed to feel safe if you’re only monitoring the room with one

J. Edgar: Oh it’s not so bad as all that. There are also the other cameras there, there, and there [he points
to each as he says them. He doesn’t need to point anywhere specific.] I also have an armed
security detail in the audience. There is one sitting in every seat whose number can be expressed
as the product of three and a prime number.

Jackie: Oh well that’s alright then.

Eisenhower: In every seat like that?

J. Edgar: Well, no, not every seat, just some of them, so [to the audience now] there’s no reason to be too
suspicious of your neighbor.

[They all grab seats at random, except John. No one ends up in front of his or her place card. The
empty seat is directly across from J. Edgar Hoover]

Jackie: What’s the matter John?

John: Well, J. Edgar didn’t take the only remaining seat.

J. Edgar: That’s true, I took the one I’m in.

John: So I know he’s not concerned about his seat having the camera.

Eisenhower: Right, so?

John: So doesn’t that make it more likely that the remaining seat has the camera?

Jackie: J. Edgar probably didn’t choose his seat sinister…ly. He probably just grabbed the closest seat.

John: But why is the open seat the furthest from him?

McCarthy: Come on, John, I’ll trade with you if you like.

J. Edgar: No!

[Pause, confusion]

J. Edgar: I just mean that we can’t let these paranoias get the best of us. John, let’s be realistic. Even if
that plate is the one with the camera, it’s not like you’ll be much more under surveillance than
you already are. It’s important, I think, when overreacting to stupid civil liberties concerns, to
remember that if it’s a battle you think you can fight, you’ve probably already lost.

McCarthy: That’s true, John. Surveillance is the price we pay for safety from the Soviet machine.

Jackie: Yeah, John! Even if that is the plate with the camera in it, it’s not like you’re being duped. You’re
being a patriot!

McCarthy: For America and dinner then, won’t you sit down?

John: It’s just so suspicious that all of you would dash instinctively for the chairs not directly opposite
J. Edgar.

Jackie: Nobody dashed, John. Sometimes people end up places, and that’s what we did.

John: Oh I suppose that’s alright, I just wish I weren’t so conscious of being monitored.

J. Edgar: John, that’s why I hid the camera in a plate.

[John sits down]

Eisenhower: Now that that’s settled, you two will not believe the surprise J. Edgar has for us as an
amuse bouche.