05/02/2011 - 8:08am

Commentators immediately began asking whether Bin Laden would be made into a martyr by Al Qaeda. I suspected that they were using the word rather loosely (the English language suffers considerably when news anchors must speak extemporaneously) and so pulled out my Collins dictionary, which has served me well over the years. It gives five definitions (the first three as a noun, the last two as a verb) of martyr, which, if you’ll indulge me, I’ll list in order: 1) a person who suffers death rather than renounce his religious beliefs; 2) a person who suffers greatly or dies for a cause, belief, etc; 3) a person who suffers from poor health, misfortune, etc; 4) to kill as a martyr; 5) to make a martyr of.

The first three definitions do not apply to Bin Laden. He fought to the end; he was not defending his faith, but a willful misinterpretation of it. He was, so far as I know, not a martyr to any particular disease. We look, then, to martyr’s use as a verb. Will Bin Laden’s followers make him a martyr? Is it even open to them to do so when Bin Laden does not meet the basic criteria of martyrdom? One either is or is not at the moment of death a martyr. It’s only when, as in Bin Laden’s case, one’s martyrdom cannot immediately be proven that it becomes necessary to “make a martyr” of someone. During the Inquisition, heretics who died for their faith became martyrs the moment they died; they did not need those they left behind to confer martyrdom on them. They conferred it, as it were, on themselves.

Now, Bin Laden’s followers probably think their leader achieved martyrdom at the moment of his death, and that their conferring of martyrdom on him is therefore merely a formality. Thus, for them, there is no inconsistency. This, of course, is as it should be. The word is pliant enough to admit of its own violation.

04/29/2011 - 10:53am

One of the major works in the Tea Party canon, Ayn Rand’s Atlas Shrugged, is now a movie. Rand’s work has enjoyed a resurgence over the past few years, especially among libertarians, who see her as a visionary. Rand first came to my notice as a teenager when I read The Fountainhead over the course of two days. Her ideas are naturally appealing to anyone resentful of authority or convention, and what teenager—or for that matter adult—is not, at some point, resentful of both? I was, at fourteen, an easy convert to her ideas, and readily identified with Howard Roark, the novel’s hero, while I verily believed I hated his principal enemies, Peter Keating and Ellsworth Toohey.

There is, in Rand, no ambiguity. She is not an artist in any sense of the word. I did not learn this until later, of course—at fourteen, I had not read much, and so had little to compare her work to. The only characters who have misgivings in her work are the ones on the wrong side, that is, the defenders of tradition and convention. The characters on the right side—the trailblazers such as Roark—know they are on the right side, and therefore introspection becomes, in their case, superfluous. The result of her methodology is that only the antagonists of her novels have any depth; the protagonists are offensively self-righteous.

I’m not in the habit of making book recommendations, but I have no qualms about telling people what not to read. Don’t read Rand. All you are left with at the end of her novels is that selfishness is good, altruism bad. Her characters, like Dickens’, fit into molds, but whereas Dickens has thousands of molds, she has but two: one is either for capitalism, or against it. Interestingly, I don’t think there has ever been a brilliant novel written in praise of capitalism; capitalism is a fine thing, but great novelists invariably attack it. Think of Dickens, Zola, Tolstoy, Balzac, Dreiser—the list goes on of great novelists unsympathetic to capitalism.

04/28/2011 - 8:48am

It’s an increasingly common sight: wealthy women carrying their dogs about America’s malls. As with so much that is wrong with America, I blame Paris Hilton. The dogs they carry are, of course, always small, small enough to fit into their Coach and Burberry bags. I have noticed that their owners usually wear oversized sunglasses, though I have seen a few who have courageously dispensed with this part of the ensemble.

These pooches, being in such close proximity to their masters, are becoming, an article in the New York Times tells us, as insufferable as they are. They bite, and indeed kill, with more frequency than their poorer brethren. (Well, I don’t know if that is true, but I want to believe it is.) It makes sense anyway: if I were a dog carried about all day by a fifty-year-old woman pretending, through Botox injections, to be thirty, I would want, when released from her dubious protection, to make my presence felt. Certainly, killing another dog would qualify, whereas making myself amenable would be altogether too expected and predictable. No one, after all, not even a tiny dog, wants to be treated as an appendage, but this is precisely how many socialites treat them. (Notably, I’ve yet to see a truly striking woman burdened by this modern-day cargo.)

I don’t mean to excuse the bad behavior of all wealthy dogs; I want, however, to offer a special dispensation to those who commit crimes after spending an inordinate amount of time nuzzled against their owner’s breast. While the rest of the animal kingdom remains on terra firma or in the ocean, they must survey the world from a perch, cut off from communion with their peers. And what, we must ask, do they have a view of? Not much. They survey the outfits their owner may or may not buy. They go into dressing rooms with their owners, where presumably they see them naked (which may or may not be a tantalizing sight). It’s a sad, dreadful, wearisome life. Dogs may be man’s best friend, but I wonder if the bored rich—a horrible combination—don’t insist too much on this relationship.

04/27/2011 - 8:27am

With Donald Trump questioning the President’s intelligence, I was not surprised to find Dana Milbank of “The Washington Post” dutifully extolling it in an op-ed piece this morning.  During Bush’s administration, the right more or less agreed with the left that George Bush was not one of our leading lights. The left, unwilling to make a similar concession regarding Obama, overcompensates by repeatedly calling him brilliant.

Wherein lies his brilliance? Well, no one really knows. Brilliance is not like beauty; it is not self-explanatory—one requires at least occasional demonstrations of it. These are notoriously wanting. His transcripts, as Mr. Trump reminds us, remain sealed. There are two books in the Obama canon (neither of which, I admit, I’ve read). There are also hundreds of speeches. Presumably Mr. Milbank could appeal to this body of evidence, contradictory as it may be. Does he? No. He says, rather vaguely, that there is a lot going on in the President’s head. We are, throughout the piece, treated to the opinions of academics who confirm this insight of Milbank’s. One of the professors Milbank consulted hailed Obama’s “integrative complexity,” which he had the goodness to define as the ability to balance competing claims. Another professor provisionally called Obama a “complex thinker,” though he had yet to apply his model to Obama.  A professor at Cornell said he was “rational,” no doubt a jibe at his “irrational” predecessor.  

Mr. Milbank, anticipating the skeptic who wonders what results these wonderful qualities of the President’s have yielded, revives the old saw about how intellectuals are unfit for politics. He wistfully writes, “In an ideal world, complex and rational thought would be virtues. But in politics, these attributes can make Obama seem ambiguous, without toughness or principles.” Obama, in other words, is too good for us; he is beyond our comprehension. If he condescends to us, it is because he has no choice, looking down, as he does, from such a height.

04/26/2011 - 10:16am

The recent tiff between Donald Trump and Robert De Niro was hardly as entertaining as the one Trump had with Rosie O’Donnell a few years ago. It was, moreover, entirely predictable, for whenever a person questions Trump, as De Niro did regarding the “Birther” issue, Trump reflexively questions his opponent’s intelligence, but in the most prosaic way. He said De Niro was not the brightest bulb on the planet, and then resorted to another cliché of the repertoire by saying Mr. De Niro was not Albert Einstein. (I’ve always preferred “Well, he’s not the coldest beer in the fridge,” or, “He’s a few cards short of a full deck.” I rarely make the mistake of telling someone he’s not Albert Einstein, knowing full well I’m not either).

Mr. Trump reached the determination that Mr. De Niro is stupid through watching interviews of the actor over the years. I have not seen the interviews Mr. Trump refers to, and therefore am not in a position to contest his assertion that they establish, beyond a doubt, Mr. De Niro’s stupidity. Given that most interviews of celebrities do tend to establish their stupidity, I’m even inclined to accept Mr. Trump’s estimation of Mr. De Niro’s intelligence. However, Mr. De Niro is not an ordinary actor. He is, indeed, a fine actor, one of the best we have.

Mr. Trump has inadvertently raised, then, one of the perennial questions about acting: do great actors need to be intelligent? (We know Mr. Trump’s answer.) The theory that acting is an instinctual art was, however, debunked long ago; it is debunked every time a novice learns how difficult it is to perform the normal acts of life while being observed. Suddenly, smoking a cigarette becomes difficult. The great actor, moreover, must possess empathy in an even greater degree than the great writer, for he must bring to life a character made out of words. It is, when done well, one of the boldest and greatest acts of creation. One could hardly think a stupid person capable of such a feat.

04/25/2011 - 8:52am

I’ve heard a lot about James Franco, though I’ve yet to see any of his movies. I like people who have their hands in many things, and who has his hands in more things than Franco? He acts, he directs, he writes. I don’t know if he paints, but I suspect, just to make himself more incomprehensible, he probably does. The Hollywood Reporter says that Franco will now pursue a second PhD, in creative writing, at the University of Houston. There are generally two responses to Franco’s artistic restlessness: some say he is incurring too many obligations, while others laud him as a Renaissance man, who most audaciously (or heedlessly) takes all of art as his province.

I fall into the latter category, but with a qualification. Genius, in general, does not range freely over the whole human experience; it illuminates, very brightly, one or two aspects of it. There are, surely, examples of universal geniuses such as Voltaire, but somehow I cannot imagine a man who hosted the Oscars as the next great dispenser of wisdom and knowledge. And yet I do not want to be one of those who try to pigeonhole Franco, and demand that he focus his energy only where his talent is (my brief perusal of his short story collection tells me it’s not in writing, but then I’m generally intolerant of celebrities who take up their pens).

I wonder, too, if Franco is not rather too concerned with being considered an “artist” before he actually becomes one. Art has been known to make people awfully affected. His commitment (of course nobody but Franco knows the full extent of it) to art is clear, and it certainly distinguishes him from most of the ignoramuses that populate Hollywood. But art has millions of unsuccessful votaries. I sincerely hope Franco is not one of them.

04/23/2011 - 9:15am

I find it embarrassing to watch commercials; it’s one thing to waste one’s time watching television, it’s entirely another to waste it watching asinine Geico spots with the lizard or Progressive ones with “Flo.” (I’m embarrassed I can remember her name). DVR and TiVo were supposed to liberate us from unwanted advertisements, but, as a recent article in “The Atlantic” suggests, such liberation will be long in coming, for advertisers, knowing the myriad of ways in which we circumvent them, are intent on reaching us through a medium where we cannot give them the dodge, that is through content itself.

Like most people, I think myself immune to advertising. Of course, the fact that I prefer Ralph Lauren to some generic brand proves I’m not. Imagine, then, the power advertisers will have (and already have) when they no longer advertise explicitly, and thus take away my power to reject their message out of hand. They will take away, in other words, my ability to consent. It is easy to flip the channel when the commercials come on, but the day is soon coming when the media will address us seamlessly as consumers, with the old distinction between content and advertising obliterated.

What, in such a world, happens to popular art? Most movies and television shows long ago gave up the pretence that they are art, but a few still aspire to something above drivel. One can easily imagine a brilliant director having to humor producers who want certain products placed at opportune moments of his film. Of course, the conflict between a director’s vision and a producer’s is nothing new, but certainly the necessity of “branding” will only put the director’s “artistic” vision and the producer’s “practical” one further out of alignment. It’s heartbreaking to consider what concessions upcoming directors will have to make. Consider also the talented singer, who will, at the behest of corporate executives, cynically plug some product in one of his songs. It will be very, very difficult for artists to escape the taint of branding, and those who do will probably be your next generation of starving artists.

04/22/2011 - 8:06am

I mentioned in yesterday’s blog that polls have now become stories in themselves. I did not have to wait long for confirmation of my theory: the Times again made the results of a poll one of its lead stories. The poll surveyed Americans’ outlook on their country’s future, and found that we are quite a pessimistic bunch, concerned about gas prices, the debt, and unemployment. There was, amidst these revelations, one nugget: the number of Americans who think the economy is not improving increased by thirteen percent. Perhaps the Times will conduct another poll to determine what to attribute this sudden change to.

Other things the poll established: we hate Congress. Well, 75 percent of us do. By the way, I think if we were living in Athens during the Golden Age, we would still object to our leaders. We take pleasure, let us admit it, in despising our politicians, and as much as we clamor for virtuous leaders, we get rather bored by them. Given the opportunity, as respondents to polls are, to “approve” or “disapprove” of a particular politician or policy, most of us take Groucho Marx’s stance: “Whatever it is, I’m against it.”

It’s only natural that this gut reaction should determine our attitude towards politics, for poll after poll has established our ignorance of all things political. Consider how many people have yet to establish an opinion of Mr. Boehner (27 percent). In some respects, I find the unwillingness of someone to have an opinion about Mr. Boehner commendable. Who, after all, is Mr. Boehner, and why should I have an opinion about him, when he has never taken the trouble to form one about me? Simply because a person is brought to my attention does not mean I have to notice him. It’s not for nothing that many intellectuals of the 19th century looked on the newspaper as a vile rag, good only for propagating ignorance and trivia one can safely do without. Enlightened citizens do not digest newspapers whole; they read, especially in this Internet age, as discriminatingly as possible.

04/21/2011 - 1:53pm

Political polling used to be—or perhaps I’m just imagining things—confined to election years. Once it was discovered, however, that the results of a poll could themselves make a story rather than simply support one, pollsters found themselves employed during the years between elections. And so this morning we have an article from the Times titled "Poll Finds Lack of Passion for Republican Candidates."

These are, for the journalist who must write them, easy articles to whip up on a moment’s notice, following, as they do, an entirely predictable form. Each starts with the obligatory pronouncement about how the Iowa caucuses are a year away. This accomplished, there is the repetition of the conventional wisdom about why no Republican candidate has endeared him or herself to primary voters: the culprit, epiphany of epiphanies, is the late primary season. Next, Mr. Rutenberg, the article’s author, gauges the prospects of candidates based on their favorability ratings. We learn once again that American voters do not know Mr. Pawlenty. Nor do they know Mr. Barbour, Ms. Bachmann, Mr. Daniels, or Mr. Santorum. The moral of the story, I suppose, is that as long as they remain unknown we are unlikely to vote for them. I concur.

There are four potential candidates the poll determined the American public did know: Mr. Romney, Ms. Palin, Mr. Huckabee, and Mr. Trump. We learn—once again—of Sarah Palin’s very high “negatives”: 55 percent of respondents had an unfavorable view of the former Governor. One can only imagine, once Ms. Bachmann makes her debut—I mean one not on Fox—the high unfavorability ratings awaiting her.

After the results of the poll are done with, Mr. Rutenberg takes care to issue the disclaimer that the unknown candidates might become known during the course of the race, though he offers us no solace by suggesting that some of the known candidates might return to being unknown.

04/20/2011 - 8:58am

If you have visited New York recently, you have probably seen letter grades (A through C) posted on the windows of the city's restaurants. These grades, given by the city's Health Department, inform prospective diners—and who in New York City, with all the aromas perpetually in the air, is not a prospective diner?—of a restaurant’s cleanliness as determined by people who have probably never worked in one. (Few bureaucrats are more dreaded than the health inspector, with his clipboard and flashlight trained on unreachable crevices.) A restaurant is clean, of course, to the extent that roaches and mice—to name just two freeloaders—do not regularly patronize it.

I confess I do not see the point of grading a restaurant's cleanliness. The only thing I want to know about a restaurant is whether or not the food is good, and to answer this question, there is, between Yelp and Zagat, no shortage of opinion, if not consensus. The fact that a restaurant is open is enough for me; if it weren't sanitary, the New York City Health Department would presumably shut it down.

And yet now whenever I go to a restaurant in Boston, I find myself wanting assurances that the back of the house is as spotless as the front, a miracle that happens only at very slow restaurants where hours go by between orders. (But who wants to go to a slow restaurant? It is slow, as we all know, for a reason.) Without a grade to satisfy me on this point, I'm left to my own conjectures, but what, with my view confined to the front of the house, can I conclude? Am I to ask the host for a tour of the kitchen before I decide whether I want any of the food that comes out of it?

My point, simply, is this: we, as diners (and consumers generally), have become insufferable. We want to know if the beef we are eating is grass fed. (The refusal to put this question in the past tense always amuses me.) We also are curious about salmon: is it farm-raised, or wild? Are we eating sustainably? Are the greens on our plate organic, and if not, why not? It’s now all the rage to have a gluten allergy thanks to Elisabeth Hasselbeck and other celebrities. And so it goes. The foodie culture—I despise that word—has made critics of all of us, when, in truth, only a few are qualified to play that role. (For evidence, read a restaurant review from the Times and compare it to the average offering on Yelp, which, I admit, I rely on in a pinch.) This is not to say diners do not have rights; indeed they do, but only up to a point. The more expensive the restaurant, the more rights one has; at McDonald’s one practically forfeits them. Is this anti-democratic? Indeed, it is.
