November 28, 2020 | Rome, Italy

August 12, 2012 | Area 51
Muhammad Ali: Purveyor of a brazen "I," which stuck.
In a recent book review published by the New Yorker magazine, the critic James Wood cites an essay by Walter Benjamin on the significance of storytelling, quotes from it, and finally uses a three-word gambit — “I sometimes wonder…” — to introduce his own essay.

American journalism has a long and intricate relationship with the first person, the assertive norm for most of the 19th century and well into the 20th. Journalists, essayists and critics were presenters and narrators, first-person bias adding verve and tint. The first person was the video of its time, portraying situations the mind’s eye was encouraged to embellish. Brilliant Baltimore Sun muckraker H.L. Mencken was the celebrity in his own stories: “On a winter day some years ago, coming out of Pittsburgh on one of the expresses of the Pennsylvania Railroad, I rolled eastward for an hour through the coal and steel towns of Westmoreland county…” — with the reader along for his ride.

Time Magazine, founded in 1923, took a different route. While its early prose remained purple, the first person was nixed, replaced by unsigned reports intended to suggest authority and omniscience. For publisher Henry Luce, the magazine’s overall position (a conservative one) mattered more than celebrity presentation, namely authorship. Time‘s editors associated bylines with a kind of personal, and therefore emotional, ownership that risked undermining a disengaged view of unfolding events. This, ironically, was the beginning of objectivity, labeled by some as collective bias, since extinguishing the “I” and eliminating authorship hardly guaranteed disinterest.

After World War II, whose details (death camps and atomic bombs) overwhelmed available adjectives, the mood changed. Newspapers and magazines entered a sober phase in which first-person narrators were openly discouraged. The “I” was allowed in reports of carnage actually witnessed by reporters, but only if balanced with rival versions of what had occurred, no matter how obvious the good-bad moral divide. Journalists and even critics were discouraged from indulging sentimentalism and moral arbitration. Editors believed available facts should govern outrage or empathy. Even cultural critics were weaned away from the “I sometimes think…” approach, told to put the vagaries of personality on hold and use well-enunciated views, not first-person intervention, to establish mood.

It was 1960s color television, whose true-to-life images produced emotional lurches, if not hysterics, that revived the personal stage. At bay for several “objective” decades, the first person began making a comeback. Time (and its competitor Newsweek) abandoned its longstanding authorial anonymity and began publishing bylines, explaining the decision in terms of public accountability (it was instead based on commerce, to give specific writers celebrity and notoriety, and, publishers hoped, a following). Reality and celebrity television — which “I am the greatest” boxer Muhammad Ali had presaged with his post-bout boasts, considered offensive at the time — ramped up the process with howl-talk programs that made the first person the only active voice. The web, an offspring of such programming, has since codified the “I,” since self-transmission is contingent on me-first priorities.

A generation of news followers has come to see first-person intervention as the centerpiece in the dissemination of information. The passion the objective period worked to lessen, attempting instead to create and police a “cool” zone, has yielded to a fervor-first approach. Emphasis, not context, is treated as evidence, coaxing and promoting an I-for-Immediacy approach. The funds available to periodicals for investigative journalism, often a lengthy, tedious, and anonymous process, are dwindling fast.

More importantly, the once-valued cultural affinity between disinterest and objectivity (or an attempt at objectivity) plays a dwindling role in the day-to-day communication of current events, whether the broadcaster is a major network or a Tweet-maker. Olympic coverage is drenched in nationalist vitriol and spiked with uplifting morality tales. The passionate seems more truthful because it’s felt, a parlous equivalency.

The potential for human indecency, from which a shocked post-World War II public recoiled, pondering the risks of emotions run rampant, lacks relevance in a society that prizes layers of loudness. The tone of bipartisan political advertising ahead of the November election is not so much negative as vile, a difference few any longer know or care to acknowledge.

The existing verbal mood, with the mongrel “I” present in all presentations, is eerily similar to that of the early 20th century, when tycoon-owned newspapers first fathomed and parlayed the power of “read all about it” headlines.

Red-baiting demagogue Joe McCarthy revived that frenzy in the 1950s, hijacking television until a small cadre of broadcasters fought back by disrupting assumptions a cowed and jaded public had mostly embraced. No such cadre exists today.

In director James Whale’s 1931 version of “Frankenstein,” an uneasy Dr. Waldman famously tells Dr. Henry Frankenstein that he has created a monster. A less quoted line is Frankenstein’s self-assured response: “Patience, patience. I believe in this Monster, as you call it. And if you don’t, well, you must leave me alone.”

About the Author:

Christopher P. Winner
Christopher P. Winner is a veteran American journalist and essayist who was born in Paris and has lived in Europe for more than 30 years.
