"So long, farewell, auf wiedersehen, goodbye," goes the lyric from that song from "The Sound of Music," a tune lately discovering new life in a commercial for Kia automobiles.
Less musically than the Trapp Family Singers or the Kia crew, we're also saying some farewells, particularly to a year that, to twist the Latin phrase Queen Elizabeth II made famous some years back, was sure one horrible anus.
In no particular order of derision, then, joyful goodbyes to:
Donald Rumsfeld, the father of all ideologically driven secretaries of defense. Were you one of the biggest obstacles to a clear-eyed foreign policy concerning Iraq? Were you needlessly divisive and polarizing? Are we glad as hell you’re gone? You bet.
Saddam Hussein: You found out the hard way that there is no god but God, and that don’t include you. Time to go lord it over the minions of the underworld. You're all done; don't let the gates of hell hit you in the ass on your way in.
Slobodan Milosevic: You earned your nickname, the Butcher of the Balkans. More than 200,000 people died as you cultivated old Serbian animosities into a decade-long civil war. And then you had nerve enough to die during your own war-crimes trial in The Hague – the ultimate escape from justice. In this world, anyway.
Augusto Pinochet: More than 3,000 human beings disappeared under your regime in Chile, after you overthrew a democratically elected government. Thank you, at long last, for doing the same.
Tom DeLay: You’re now free to pursue other meaningful employment in the private sector. Perhaps a return to your former line of work in the pest control business is in order.
George Allen: You at least had the grace not to contest the election that bounced you out of office. Relax, take the Trent Lott road to contrition and maybe we’ll hear from you in a year or three, you macaca, you.
Karl Rove: There’s your math and there’s The Math. Your class in Remedial Political Arithmetic 101 starts when the new Congress does. Don’t cut class.
Bode Miller: The poster boy for Olympic athletic excess. America’s presumptive golden boy at the February Games in Torino, he entered five events and didn’t medal in a single one. In an interview with Newsweek’s Devin Gordon, Miller suggested he didn’t really want to compete at the Games at all. Note to Bode: Next time you don’t give a damn, don’t show up.
Michael Richards: Anonymity – and the millions you’ll earn forever from your share of the “Seinfeld” fortune – has to be better than a standup career that needs the N-word. We’ve heard Lenny Bruce, and you’re no Lenny Bruce.
Kenneth Lay: We were surprised to hear you died in Aspen of the effects of a bad heart; thousands of Enron employees doubted you even had one.
James Frey: Let’s get it straight. There’s a fiction shelf for books and a nonfiction shelf for books. It’s that way for a reason. Failure to understand that has a way of shattering a reputation into, well, a million little pieces.
Mike Nifong: The Duke lacrosse-team rape case was a sorry way to build your career as a prosecutor. But cheer up -- with the charges of ethics violations you’re facing, you’ve got a promising career ahead of you as a K Street lobbyist, evangelical minister or a Republican congressman. What a great country.
Bill Frist: The Tennessee senator and stock-trading enthusiast wisely stepped out of politics, ending speculation about a run for the big chair in 2008. We’re guessing that maybe he's gone back into private medical practice. Good move -- you can’t heal the patient when you’re part of the disease.
Others aren’t leaving the scene but we wish to God they were:
Paris Hilton: Shut up. Just shut up.
Britney Spears: Shut up. Please shut up. And buy some underwear.
Bill O’Reilly and Ann Coulter: Shut the fuck up.
O.J. Simpson, Pamela Anderson and K-Fed: Jesus, are you still here?
The word “truthiness”: All respect to Stephen Colbert, but this isn’t the contribution to the American lexicon he’s been led to believe it is. If truthiness means the politic blurring of fact and fiction, there must be lots of other suitable words that already exist. Hell, you’ll find most of ‘em in James Frey’s book.
There are other goodbyes we’d rather not be saying at all. With tears, all props & much love, we kiss goodbye to:
Robert Altman: Our favorite maverick of the movies. In a body of work that stretched from the ’50s up to the project he was still developing when he died, Altman insisted on doing it his way, raising hell every step of the way.
Syd Barrett: Singer-songwriter, guitarist, cracked founding genius of Pink Floyd: Remember when you were young, you shone like the sun. Shine on you crazy diamond. Now there's a look in your eyes, like black holes in the sky. Shine on you crazy diamond. You were caught on the crossfire of childhood and stardom, blown on the steel breeze. Come on you target for faraway laughter, come on you stranger, you legend, you martyr, and shine!
Ed Bradley: He was tough, he was smart, he was cool, he was elegant, he was streetwise, and no man in the history of television ever looked as good with an earring and a beard. Bradley broke racial barriers at CBS News in particular and in journalism in general, creating a powerful body of work in his quarter-century on “60 Minutes.” Shortly before he died on Nov. 9 of leukemia at Mount Sinai Hospital in New York City, at the age of 65, his friend Jimmy Buffett told him that the New York Knicks and the Democrats had both won. Bradley smiled, then let go of this life -- leaving a world that won't be the same without him.
Gerald Boyd: Another trailblazer for journalists of darker hue, Gerry Boyd, a former managing editor of The New York Times, died on Nov. 23 of lung cancer. Boyd had the experience of too many black and brown journalists in this country: being The First. He was the first black metropolitan editor and managing editor at The Times. At 28, he was also the youngest journalist picked for a Nieman Fellowship at Harvard. But his firstness helped a lot of people become one of many in a business that still has too few.
Ahmet Ertegun: The Turk who favored bespoke suits and rock and roll, a founder of both Atlantic Records and the Rock and Roll Hall of Fame, died Dec. 14 of head injuries from a fall he took backstage at the Rolling Stones concert for President Bill Clinton's 60th birthday at New York's Beacon Theatre in October. He had the ears for music we wish we had, seeking out talents from Clyde McPhatter, the Coasters, Solomon Burke and Aretha Franklin to Ray Charles, Led Zeppelin and the Rolling Stones. Our friend and former colleague Jon Pareles of The New York Times said it best: “[H]e was an outsider who had become something more than an insider, an American phenomenon who proved the best way to cross boundaries was with the promise of a good time.”
Buck O’Neil: He was a two-time Negro Leagues batting champion, a baseball scout, a war veteran, manager of the Kansas City Monarchs and the first black coach in the major leagues. But most Americans alive today remember John Jordan O'Neil as the gentle, folksy paterfamilias of Ken Burns' 1994 documentary "Baseball." In February 2006, it was thought O’Neil was headed for the Baseball Hall of Fame. But the game he loved -- or the sportswriters who only wrote about the game he loved -- stiffed him. A special 12-person committee commissioned to render final judgments on Negro Leagues and pre-Negro Leagues figures did not vote him into Cooperstown. Buck was class to the end: “Shed no tears for Buck,” he told friends that day. “Not going into the Hall of Fame, that ain’t going to hurt me that much, no. Before, I wouldn’t even have a chance. But this time I had that chance.” Was there a classier act in the game?
Samuel James Archibald: He was just Sam to hundreds of journalism students (including your humble narrator), a former political reporter and congressional aide who helped craft the landmark Freedom of Information Act of 1966. After 10 years of reporting on politics for The Sacramento Bee, Archibald joined the staff of newly elected Democratic Rep. John Moss of California. Angered by federal bureaucrats' refusals to release names of citizens fired for allegedly being communists, Moss formed a subcommittee that looked into government secrecy and made Archibald staff director. Archibald is credited with writing the original one-paragraph draft that stated -- with a simple elegance that has terrified bureaucrats ever since -- that all government information must be free and available to the public. After numerous exceptions and qualifications were added during an 11-year battle for enactment, during which Archibald served as ramrod, the measure became law in 1966. If you've ever read a government report since then, you can thank Sam Archibald for trying to make sure what's in it isn't truthiness, but truth.
Mal Deans: Malcolm "Mal" Deans, the only senior instructor emeritus at the University of Colorado School of Journalism, brought a scrappy, wise demeanor to the practice of teaching journalism. The School's student-produced newspaper, the Campus Press, dates back to when Deans created it so that CU journalism students could get hands-on experience. He also established a program that allowed seniors to work at local newspapers. It was Mal Deans who put in a good word for me with the brain trust at the Boulder Daily Camera in 1979. Next thing I knew, I had my first job in journalism waiting for me before I even graduated. You hear talk of guardian angels; mine had a fondness for scrapple and small talk, and faith in skinny undergrads who loved the written word -- and who, thanks to him, still do.
Michael Britten: Friend, fellow writer and tireless fan of the blues and other music, Michael wrote reviews for Salon and other publications, investing his pieces with flair and insight. He gifted me with music I'd never heard before, and seemed to have an almost encyclopedic knowledge of that music, knowledge he shared generously and with a good humor that belied the relentless progression of the multiple sclerosis that ultimately took him.
There were others, of course. Gordon Parks. Ann Richards. Wilson Pickett. Bebe Moore Campbell. … and of course a celestial shout-out is due for Moms, Wilhelmina Sherman Ross, mother of all mothers, who crossed over the wide water this year and whom we love like our next breath, and miss terribly. There’s no pain like that loss, no emptiness you can imagine that matches the vacancy of self that emerges when your mother dies. It is another kind of darkness visible.
“It is easy to see the beginnings of things, and harder to see the ends,” wrote Joan Didion in her essay “Goodbye to All That,” chronicling another kind of departure -- one of passion and spirit for a place. It is no stretch to suggest, with the expiring year as evidence, that it is easier to embrace a year's beginnings, its intrinsic possibilities, than to cope with its tired, slouching ends. It's a matter of dealing with disillusionment arriving in unexpected ways.
Now, bidding farewell to a year of anything but magical thinking (Didion’s memoir a notable exception), we feel much the same. There are things we’ll miss, situations we wish we’d handled better, faces and smiles we wish we’d acknowledged when we had the chance. But mostly, ours is a wistfulness best savored not in front of us but looking through the rear-view mirror, everything we’d like to forget receding fast in the dusty distance, objects not closer than they actually appear --
Eyes on the road ahead, now, hands on the wheel, the fireworks, new faces and noisemakers dead ahead.
They’re dropping the ball at midnight, and when it falls, it’s in our court.
Let's get it started.
-----
Image credits: Tom DeLay: U.S. Congress (public domain); Syd Barrett: Flupe.com; Ed Bradley: CBS News; Gerald Boyd: photographer unknown
Sunday, December 31, 2006
3,000+
There's no escaping the brutal geographic poetic justice in the news: On the last day of 2006, at the end of the deadliest month of the preceding twelve for the American military in Iraq, the Pentagon announced the death of the 3,000th American soldier there.
Spc. Dustin R. Donica, 22, of Spring, Texas, was killed Thursday by small arms fire in Baghdad, the Defense Department said.
Donica was one of at least 111 U.S. service members reported to have died in December.
The poetic justice? Donica hailed from Texas, the home state of President Bush. One can't help but consider a possible scenario, something that might happen years from now: Bush, oblivious to the connection, passes through Spring, Texas, and stops to press the flesh of his kindred Texans.
The president will be received warmly, with due respect for the office, if not the former officeholder. But some will question the ex-president, with their eyes if not their actual voices. "Why, Mr. President? Why, sir? Why isn't Dustin Donica here among us today?" And no answer will be sufficient. No answer will be answer enough.
We should note that, the chaos of war being what it is, there is at least one other census of American dead, an accounting that makes another soldier the 3,000th casualty of an unnecessary war.
According to a survey by CNN, the death of Sgt. Edward Shaffer, of Mont Alto, Pa., was the 3,000th American military fatality reported since the invasion began in March 2003.
Shaffer, all of 23, was wounded on Nov. 13 by a roadside bomb in Ramadi, the restive western Iraqi city where U.S. troops and insurgents trade fire on a near-daily basis, CNN reported.
But even with Shaffer's hometown, the poetic justice still obtains: Shaffer died at Fort Sam Houston ... Texas.
The deaths of Donica and Shaffer are the latest in a grim accounting: The American death toll was at 1,000 in September of 2004 and 2,000 by October 2005.
And it doesn't matter which of these tragically noble patriots was No. 3,000 -- whether the two stone-faced officers of the official notification detail ring a doorbell in Texas or Pennsylvania, in any of the other forty-eight states, or in the nation's territories. Our needless national agony persists, from one year to the next. We are weaker as a nation, one by one by one.
----
Image credit: Department of Defense (public domain)
Dashikis for George
We’ve been a reliable critic of President Bush and his administration in a wide range of areas, and count on it, we will be again. But the Dec. 31 edition of The Washington Post sheds light on a side of the Bush record that's had scant public attention – something that demands credit be given when and where it’s due.
The Post reports that Bush “has tripled direct humanitarian and development aid to the world's most impoverished continent since taking office and recently vowed to double that increased amount by 2010 -- to nearly $9 billion.”
Wait a minute, you say. This is our President Bush? The one who ushered us into a disastrous war in Iraq under dubious circumstances? The one who just outpointed Satan, Osama bin Laden and Kim Jong Il in an AP/AOL poll on the top villain of 2006?
True enough. The Post story is a surprise, and coming at the end of a year we’d all just as soon forget, not an unwelcome one.
“The moves have surprised -- and pleased -- longtime supporters of assistance for Africa, who note that because Bush has received little support from African American voters, he has little obvious political incentive for his interest,” the Post reports.
"I think the Bush administration deserves pretty high marks in terms of increasing aid to Africa," said Steve Radelet, a senior fellow at the Center for Global Development.
Citing the Paris-based Organization for Economic Cooperation and Development, the Post reports that Bush “has increased direct development and humanitarian aid to Africa to more than $4 billion a year from $1.4 billion in 2001. And four African nations -- Sudan, Ethiopia, Egypt and Uganda -- rank among the world's top 10 recipients in aid from the United States.”
Besides just writing fat checks to Africa, Bush has also done the necessary face time, meeting with nearly three dozen African heads of state during his six years in office, the Post says.
The Post story notes one curious disconnect, given all this hands-across-the-water feeling for Africa: some activists, the Post says, “criticize Bush for not doing more to end the ongoing genocide in the Darfur region of Sudan …” That is odd, if you think about it. With Darfur’s potential for greater calamity than what we’ve already seen, doing something there would seem to make an even more potent statement about America’s humanitarian intentions.
But the Post notes that others give him due props “for playing a role in ending deadly conflicts in Liberia, the Congo and other parts of Sudan. Meanwhile, Bush has overseen a steady rise in U.S. trade with Africa, which has doubled since 2001.”
"The evangelical community raised the awareness of HIV and AIDS to the president," said Rep. Donald M. Payne (N.J.), the top-ranking Democrat on the House International Relations subcommittee on Africa. "When the Bush administration came in, HIV and AIDS were not an overwhelming priority. Now we have seen a total metamorphosis," Payne told the Post.
You remember the old saying: If you can’t say something nice about someone, don’t say anything at all. U.S. presidents are exempt from that, of course, whether they like it or not. But it’s nice to be able to say something nice about someone who can really use it.
Get that man a dashiki.
-----
Image credit: Office of the President (public domain)
Thursday, December 28, 2006
Ford, lately
“His life was filled with love of God, his family and his country,” read the statement from the family of Gerald Rudolph Ford, the 38th President of the United States, who died Tuesday evening at his home in Rancho Mirage, Calif., at the age of 93.
He was our first and only accidental president, assuming the office after Richard Nixon resigned in the disgrace of Watergate. For just under nine hundred days he held the reins of power, probably expected by many to be not much more than a caretaker, someone to watch the house until the rightful tenants moved back in, whoever they might have been.
Ford – Eagle Scout, former Michigan football standout, stalwart of the House of Representatives, member of the Warren Commission, Vice President – rose to the unexpected occasion, asserting himself as a leader with a clarity of purpose and sense of duty that are truly clear only now, in retrospect, seen in the rear-view mirror as we drive – hurtle – toward our current uncertain future.
But for all his methodical and decisive aspects as president, Ford governed pursuant to the law of unintended consequences. He doomed his chances to be president in his own right, ironically enough, by suggesting to the American people that he could not be relied on to act in his own right. With one Sunday morning statement uttered a month into the accidental presidency, Gerald Ford ended his political career; the wounds wouldn’t show for another eight hundred sixty days.
“Our long national nightmare is over,” Ford said after being sworn in in August 1974. “Our government is one of laws, and not of men,” the new president said, not realizing that a month later he would be accused of turning that principle on its head.
On Sept. 8, 1974, Ford extended to Richard Nixon a “full, free and absolute pardon” for any “crimes he committed or may have committed” during his tenure as President of the United States. This get-out-of-jail card from the gods was meant, Ford said for years afterward, to close the books on the Watergate debacle once and for all.
But by circumventing the procedure of congressional oversight and review, by unilaterally deciding that Nixon would not face charges for high crimes and misdemeanors, Ford effectively became a government of one man arrogating unto itself the role of a government of laws. For all its noble intent, his pardon seems especially confounding given Ford’s reverence for the deliberative powers of Congress – the constitutionally guaranteed process of analysis and judgment that would have played itself out, however painfully, in an impeachment trial.
It was a decision that Ford would pay for, in the 1976 election, when the elephant’s memory of the American people asserted itself to deny Ford the presidency in his own right.
Ford’s pardon of Nixon tarnished his role as conciliator in one respect; less widely reported in the recent postmortems is the fact that, later that month in 1974, Ford issued a clemency plan for those who evaded the draft during the Vietnam War – an act that, while just as potentially divisive as the Nixon pardon, sent the signal that Ford was serious about doing what he could to heal the country.
In many recollections of President Ford that have already surfaced on the cable shows, there’s a tendency to reach for the word “decent.” “Good” and “honorable” also come up to describe the president who sought to “humanize” the presidency.
And this is the source of George Bush’s problem. Such generous posthumous assessments of President Ford and his presidency will be, inevitably, contrasted with less generous views for President Bush and his administration. No one will actually mention George Bush’s name in apposition to Gerald Ford’s in a comparison of personalities. But that’s the unspoken takeaway: Gerald Ford was a good and decent man as a leader (unlike – ahem! – at least one of his successors). The comparison of the two leaders is nakedly implicit.
Bush doesn’t need this, doesn’t need anything to take the public eye off the ball of the aggressive, bellicose, largely defensive style of Republicanism that he and his administration have perfected for the last half dozen years.
Something else Bush didn’t need is an expression of opinion Ford left behind – a statement not exactly from the grave but one embargoed until he got there.
In July 2004 Gerald Ford gave an interview to Bob Woodward of The Washington Post, an interview that, we now know, burnished Ford’s reputation as a man who spoke his mind without spin or nuance. By agreement, Woodward withheld publication of the interview until after Ford’s death.
When Woodward asked Ford about his feelings about the Iraq war, Ford expressed what could be, uh, conservatively called strong doubts. Some excerpts:
"I don’t think if I had been president, on the basis of the facts as I saw them publicly, I don’t think I would have ordered the Iraqi war. I would have maximized our effort through sanctions, through restrictions, whatever, to find another answer.
"I’ve never publicly said I thought they made a mistake, but I felt very strongly it was an error in how they should justify what they were going to do.
“And I just don’t think we should go hellfire and damnation around the globe freeing people unless it’s directly related to our own national security.”
The spinmeisters in the Bush White House will of course try to twist that last statement, saying that, yes, this is one of those missions that is related to our national security. But the overall context of Ford’s comments will be clear and unmistakable and untweakable. The words “big mistake” speak eloquently for themselves.
And when the ceremonies for President Ford get underway this weekend – ceremonies that President Bush is obligated to lead – they’ll take place against a backdrop of anti-war sentiments that get more and more bipartisan all the time. Ford’s assessment of the Iraq war is no less acute and insightful because he’s dead than it would have been if he’d made the comments available for publication when he was alive.
We can thank Gerald Ford for such honesty, and for other humanizing aspects of his brief time in the White House. Yeah, he fumbled, he stumbled, he was prone to goofs and gaffes and contradictions, one of which cost him an election. But Gerry Ford was president in, well, a kinder, gentler American time. He’ll be remembered for closing the chapter on Watergate, biting the bullet on the Vietnam War, dismantling the imperial dimensions of the American presidency, and at least trying to engage in a relatively clear-eyed, pragmatic approach to governing, one that’s sorely lacking today.
And after ending one long national nightmare, Ford was candid enough to offer a later generation some needed perspective on another one: the enduring bad dream that persists – three thousand American lives lost (and counting), twenty-two thousand Americans wounded and damaged (and counting), four hundred billion dollars evaporated (and counting) – to this day.
There’s been a lot of talk in recent years, not much of it that serious, of adding the visage of Ronald Reagan to the faces on Mount Rushmore. Reagan, the thinking goes, ushered America out of a grim era of self-doubt and division, restoring the nation’s confidence in itself.
So did Gerry Ford, and Ford did it first, and Ford did it without the residual stardust of Hollywood on his shoulders, unlike Reagan. Gerald Ford was more like us than we knew, or would admit in public, when he was alive. Maybe Ford’s image would be better for Mount Rushmore – an everyman face to represent our humbler, more anonymous American aspect.
NATION TO FORD: THANKS.
-----
Image credits: Top: David Hume Kennerly, White House (public domain); Bottom: White House (public domain)
Tuesday, December 26, 2006
Mr. Soul
So we limp into Christmas morning fairly sure that, with less than a week left in the year, things can’t get no worse. We think we’ve heard from every sad and tragic situation from everywhere on earth and we’re mentally ready, at least, to hunker down, burrow in, hibernate in winter’s fog until the new year. And then we turn on the TV first thing to find out what went down while we were sleeping. And we find out what went down while we were sleeping.
Early that morning at Emory Crawford Long Hospital in Atlanta, James Brown, JB, Mr. Dynamite, Soul Brother Number One, Godfather of Soul, the Hardest Working Man in Show Business, the Original Disco Man and the King of Funk, gave up the ghost, passing from this world to the next, dying of congestive heart failure complicated by pneumonia, at the age of 73.
When we lose any portion of the double helix of the American songbook – George Gershwin, Frank Sinatra, Ray Charles – whatever’s left is diminished, less powerful, less passionate. It goes without saying that we’re poorer this Christmas than we were before.
It might go without saying, but some people – those who knew him as friend and confidant – found ways to say it. “He was an innovator, he was an emancipator, he was an originator. Rap music, all that stuff came from James Brown,” Little Richard, a longtime friend, told MSNBC.
“James Brown changed music,” said Rev. Al Sharpton, another longtime friend and one who toured with him in the 1970s. “He made soul music a world music,” said Sharpton, one of the few black men in America with nerve enough to pull off wearing a pompadour today. “What James Brown was to music in terms of soul and hip-hop, rap, all of that, is what Bach was to classical music. This is a guy who literally changed the music industry. He put everybody on a different beat, a different style of music. He pioneered it.”
Generations were inspired by him; legions of rock & soul’s best talents copped his moves and attitude. JB was the template, the wellspring of funk from which everything flowed. From Mick Jagger to Michael Jackson, from Prince to the rappers and hiphop artists who sampled his signature shouts and shrieks – everyone stole from James Brown whether they knew it or not.
Want proof? Get a copy of the 1964 “T.A.M.I. Show.” In that concert, James Brown and the Famous Flames were the evening’s penultimate act; the embryonic version of the Rolling Stones were to close the show. But James & crew quite simply tore the stage up, JB doing his best dance moves as he fronted a furiously tight band, the perpetual motion machine bringing the crowd to near frenzy, ankles and hips swiveling at angles we had thought were anatomically impossible.
When Jagger and the Stones took the stage, Mick aped James’ best dance moves, doing his 22-year-old best but showing in his British white-boy style why imitation is the most sincere form of flattery (and that night the most hilarious, too).
Some of James Brown’s music dovetailed perfectly with a rising sense of black consciousness. The 1968 song “(Say It Loud) I’m Black and I’m Proud” was embraced by young black America in a time when outwardly assertive black pride was still a nascent phenomenon. James never shied away from his role as a race man, one of the major brothers on the scene no matter when the scene was.
“I clearly remember we were calling ourselves colored, and after the song, we were calling ourselves black,” Brown told The Associated Press in 2003. “The song showed even people to that day that lyrics and music and a song can change society.”
JB may have been a bit hyperbolic about all that. The move away from “Negro” and “colored” toward “black” was already well underway – had been for almost two years. In 1966 Stokely Carmichael, followed by H. Rap Brown and others on the activist left, had launched the Black Power movement, coining a phrase that galvanized, though some will say divided, the civil rights movement. But James’ song became an anthem, taking the notion of forthright black pride from the college campuses and the grip of the intellectuals and putting it where it needed to be to really resonate: on the radio. This was social change you could dance to.
From the beginning, James Brown was a brother we could understand. James was folks, he was one of us from jump street. Born poor in Barnwell, S.C., in 1933, he was abandoned at the age of four to the care of relatives and friends. James grew up on the streets of Augusta, Ga., getting by every way he could: picking cotton, shining shoes, dancing for dimes, doing the odd armed robbery when need be.
That’s what got him sent up for three years in a juvy camp. After that he tried sports for a while, first as a boxer, then as a baseball pitcher. When injuries kept that from happening, Brown considered music. It was the desperation of the times for a young black man that made trying on so many guises, so many identities, necessary.
“I wanted to be somebody,” Brown said years later.
James grew up with Bobby Byrd, a friend whose family took him in. They started a group, the Famous Flames, a distillation of a gospel group they already belonged to. In 1956 some record label -- the Associated Press says it was King Records, the Rolling Stone Encyclopedia of Rock & Roll says it was Federal Records -- signed the group, and four months later “Please, Please, Please” was in the R&B Top Ten.
For three decades after that, Brown toured almost nonstop, doing cross-country tours, trying out new songs at concerts and earning the title he gave himself, the Hardest Working Man in Show Business.
Part of that hard work was what he did onstage: raw, spontaneous energy punctuated with over-the-top stagecraft. If you went to a James Brown show in the ’60s, you got the full James Brown Experience: the dance moves others would spend years trying, and failing, to copy; the precise 360-degree spins that found the microphone inches from where James left it when he started the pivot; the heroic splits; the energy of a beat channeled through the funky metronome at the front of the stage.
It went on until James, spent, knelt in exhaustion – and out came the brothers carrying the gold lamé cape, covering the weary James and escorting him off the stage.
But then – no: James stops a moment, seemingly shivering under the cape, then he spins, shrugs off the cape and sprints back to the microphone … there was something he forgot to tell y’all. Thus a legend of long and furious encores was born. You can believe it when James said he lost five pounds or more during a show.
His group became a training ground for musicians who went on to their own acclaim: Hendrix played briefly with the Flames; so did members of George Clinton’s Parliament/Funkadelic hybrid; so did bassist Jack Casady before getting his boarding pass for the Jefferson Airplane.
His “Live at the Apollo,” recorded in 1962, is widely considered one of the greatest concert records ever; it sold more than a million copies in an era when black records never did that well.
James won Grammys for “Papa’s Got a Brand New Bag” in 1965 and for “Living in America” in 1987. He was one of the first artists inducted into the Rock and Roll Hall of Fame in 1986, alongside Chuck Berry and Elvis Presley.
Throughout his career, and despite the accolades, James burnished his cred as a brother’s brother. Songs like “King Heroin” and “Don’t Be a Dropout” held undeniable messages for young and restive black America. He sponsored empowerment programs for black kids when it wasn’t fashionable; in the flashpoint time after the Rev. Martin Luther King Jr. was assassinated in April 1968, about the time “Say It Loud” was on the charts, James went on television to calm things down, probably preventing a bad situation from getting worse.
The man’s life had no end of drama. In September 1988, ripped to the tits on PCP and brandishing a shotgun, Brown walked into an insurance seminar next to his Augusta office and asked the people there if they were using his private bathroom. After he left the building, police chased Brown for a half-hour from Augusta into South Carolina and back into Georgia in some wild, “Dukes of Hazzard” shit that ended when police shot out the tires of his truck. He did two years for that, for aggravated assault and failing to stop for a police officer.
For any other star at the age of 58, that might have been enough to bring on retirement. But James kept working, tweaking his show to pull in a younger generation while bearing true faith and allegiance to the funk, and the audience, that got him where he was. Not long after his release, James hit the stage again at a show you needed to pack a lunch to see: a three-hour, pay-per-view concert at the Wiltern Theatre in Los Angeles with an audience that included millions who watched on cable.
It’s sad that, toward the end, James became a parody of himself. In recent years a booking photo from one arrest became part of the pop-cultural photo gallery of stars behaving badly and looking worse (others so honored include Nick Nolte, Wynonna Judd, Glen Campbell and George Clinton).
But that was the minor bullshit, the asterisks and footnotes to a career that, for all practical purposes, continued to the day he checked out. The AP reported that, three days before his death, James joined volunteers at his annual toy giveaway in Augusta, and he planned to perform on New Year’s Eve at B.B. King’s Times Square blues club in New York.
James was consistent. The flamboyance of forty-some years of entrances and exits on stage was mirrored in his big exit on Christmas at 1:45 in the morning. “Almost a dramatic, poetic moment,” the Rev. Jesse Jackson told the AP. “He’ll be all over the news all over the world today. He would have it no other way.”
In any theoretical Mount Rushmore of musical culture, James, pompadour and all, would have to be chiseled high and large and in stark relief from everyone else.
In all the meaningful ways -- both in his message to a young black America desperate for self-worth and potential, and in his music, which bridged the racial divide like no speech or policy ever could -- the man who wanted to be somebody became The Somebody. And hot damn it, if you’re a musician working in rock, hiphop, funk or R&B, no matter how original you think you are, you need to hit at least one knee and thank your personal God for James Brown. One way or another, he made what you do possible.
And don’t feel bad if you never reach his level of Somebody.
Nobody ever will.
------
Image credits: The cape walk: © 2006 thetigerduck, London. Image used by fair use provisions under United States law. Apollo Memorial: Benjamen Walker (Flickr), licensed under Creative Commons Attribution 2.0 License > Wikipedia. The statue: Sir Mildred Pierce (Flickr), licensed under Creative Commons Attribution 2.0 License > Wikipedia.
Early that morning at Emory Crawford Long Hospital in Atlanta, James Brown, JB, Mr. Dynamite, Soul Brother Number One, Godfather of Soul, the Hardest Working Man in Show Business, the Original Disco Man and the King of Funk, gave up the ghost, passing from this world to the next, dying of congestive heart failure complicated by pneumonia, at the age of 73.
When we lose any portion of the double helix of the American songbook – George Gershwin, Frank Sinatra, Ray Charles – whatever’s left is diminished, less powerful, less passionate. It goes without saying that we’re poorer this Christmas than we were before.
It might go without saying, but some people – those who knew him as friend and confidant – found ways to say it. “He was an innovator, he was an emancipator, he was an originator. Rap music, all that stuff came from James Brown,” Little Richard, a longtime friend, told MSNBC.
“James Brown changed music,” said Rev. Al Sharpton, another longtime friend and one who toured with him in the 1970s. “He made soul music a world music,” said Sharpton, one of the few black men in America with nerve enough to pull off wearing a pompadour today. “What James Brown was to music in terms of soul and hip-hop, rap, all of that, is what Bach was to classical music. This is a guy who literally changed the music industry. He put everybody on a different beat, a different style of music. He pioneered it.”
Generations were inspired by him; legions of rock & soul’s best talents copped his moves and attitude. JB was the template, the wellspring of funk from which everything flowed. From Mick Jagger to Michael Jackson, from Prince to the rappers and hiphop artists who sampled his signature shouts and shrieks – everyone stole from James Brown whether they knew it or not.
Want proof? Get a copy of the 1964 TAMI Show. In that concert, James Brown and the Famous Flames were the evening’s penultimate act; the embryonic version of the Rolling Stones were to close the show. But James & crew quite simply tore the stage up, JB doing his best dance moves as he fronted a furiously tight band, the perpetual motion machine bringing the crowd to near frenzy, ankles and hips swiveling at angles we had thought were anatomically impossible.
When Jagger and the Stones took the stage, Mick aped James’ best dance moves, doing his 22-year-old best but showing in his British white-boy style why imitation is the most sincere form of flattery (and that night the most hilarious, too).
Some of James Brown’s music dovetailed perfectly with a rising sense of black consciousness. The 1968 song “(Say It Loud) I’m Black and I’m Proud” was embraced by young black America in a time when outwardly assertive black pride was still a nascent phenomenon. James never shied away from his role as a race man, one of the major brothers on the scene no matter when the scene was.
“I clearly remember we were calling ourselves colored, and after the song, we were calling ourselves black,” Brown told The Associated Press in 2003. “The song showed even people to that day that lyrics and music and a song can change society.”
JB may have been a bit hyperbolic about all that. The move away from “Negro” and “colored” toward “black” was already well underway – had been for almost a year. In 1967 Stokely Carmichael, H. Rap Brown and others on the activist left had started the Black Power movement, coining a phrase that galvanized, though some will say divided, the civil rights movement. But James’ song became an anthem, taking the notion of forthright black pride from the college campuses and the grip of the intellectuals and putting it where it needed to be to really resonate: on the radio. This was social change you could dance to.
From the beginning, James Brown was a brother we could understand. James was folks, he was one of us from jump street. Born poor in Barnwell, S.C., in 1933, he was abandoned at the age of four to the care of relatives and friends. James grew up on the streets of Augusta, Ga., getting by every way he could: picking cotton, shining shoes, dancing for dimes in the streets of Augusta, doing the odd armed robbery when the need be.
That’s what got him sent up for three years in a juvy camp. After that he tried sports for a while, first as a boxer, then as a baseball pitcher. When injuries kept that from happening, Brown considered music. It was the desperation of the times for a young black man that made trying on so many guises, so many identities, necessary.
“I wanted to be somebody,” Brown said years later.
James grew up with Bobby Byrd, a friend whose family took him in. They started a group, the Famous Flames, a distillation of a gospel group they already belonged to. In 1956 some record label – the Associated Press says it was King Records, the Rolling Stone Encyclopedia of Rock & Roll says it was Federal Records -- signed the group, and four months later “Please, Please, Please” was in the R&B Top Ten.
For three decades after that, Brown toured almost nonstop, doing cross-country tours, trying out new songs at concerts and earning the title he gave himself, the Hardest Working Man in Show Business.
Part of that hard work was what he did onstage: raw, spontaneous energy punctuated with over-the-top stagecraft. If you went to a James Brown show in the 60’s, you got the full James Brown Experience: the dance moves that others would try at and fail for years, the precise 360-degree spins that found the microphone inches from where James left it when he started the pivot; the heroic splits; the energy of a beat channeled through the funky metronome at the front of the stage.
It went on until James, spent, knelt in exhaustion – and out came the brothers carrying the gold lame cape, covering the weary James and escorting him off the stage.
But then – no: James stops a moment, seemingly shivering under the cape, then he spins, shrugs off the cape and sprints back to the microphone … there was something he forgot to tell y’all. Thus a legend of long and furious encores was born. You can believe it when James said he lost five pounds or more during a show.
His group became a training ground for musicians who went on to their own acclaim: Hendrix played briefly with the Flames; so did members of George Clinton’s Parliament/Funkadelic hybrid; so did bassist Jack Casady before getting his boarding pass for the Jefferson Airplane.
His “Live at The Apollo, Volume 2” in 1962 is widely considered one of the greatest concert records ever and sold more than a million copies in an era when black records never did that well.
James won a Grammy in 1965 for “Papa’s Got a Brand New Bag” and for “Living In America” in 1987. He was one of the first artists inducted into the Rock and Roll Hall of Fame in 1986, with Chuck Berry and Elvis Presley.
Throughout his career, and despite the accolades, James burnished his cred as a brother’s brother. Songs like “King Heroin” and “Don’t Be a Dropout” held undeniable messages for young and restive black America. He sponsored empowerment programs for black kids when it wasn’t fashionable; in the flashpoint time after the Rev. Martin Luther King Jr. was assassinated in April 1968, about the time “Say It Loud” was on the charts, James went on television to calm things down, probably preventing a bad situation from getting worse.
The man’s life had no end of drama. In September 1988, ripped to the tits on PCP and brandishing a shotgun, Brown walked into an insurance seminar next to his Augusta office and asked the people there if they were using his private bathroom. After he left the building, police chased Brown for a half-hour from Augusta into South Carolina and back into Georgia in some wild, “Dukes of Hazzard” shit that ended when police shot out the tires of his truck. He did two years for that, for aggravated assault and failing to stop for a police officer.
For any other star at the age of 58, that might have been enough to bring on retirement. But James kept working, tweaking his show to pull in a younger generation while bearing true faith and allegiance to the funk, and the audience, that got him where he was. Not long after his release, James hit the stage again at a show you needed to pack a lunch to see: a three-hour, pay-per-view concert at the Wiltern Theatre in Los Angeles with an audience that included millions who watched on cable.
It’s sad that, toward the end, James became a parody of himself. In recent years a booking photo from one arrest became part of the pop-cultural photo gallery of stars behaving badly and looking worse (others so honored include Nick Nolte, Wynonna Judd, Glen Campbell and George Clinton).
But that was the minor bullshit, the asterisks and footnotes to a career that, for all practical purposes, continued to the day he checked out. The AP reported that, three days before his death, James joined volunteers at his annual toy giveaway in Augusta, and he planned to perform on New Year’s Eve at B.B. King’s Times Square blues club in New York.
James was consistent. The flamboyance of forty-some years of entrances and exits on stage was mirrored in his big exit on Christmas at 1:45 in the morning. “Almost a dramatic, poetic moment,” the Rev. Jesse Jackson told the AP. “He’ll be all over the news all over the world today. He would have it no other way.”
In any theoretical Mount Rushmore of musical culture, James, pompadour and all, would have to be chiseled high and large and in stark relief from everyone else.
In all the meaningful ways -- both in his message to a young black America desperate for self-worth and potential, and in his music, which bridged the racial divide like no speech or policy ever could -- the man who wanted to be somebody became The Somebody. And hot damn it, if you’re a musician working in rock, hip-hop, funk or R&B, no matter how original you think you are, you need to hit at least one knee and thank your personal God for James Brown. One way or another, he made what you do possible.
And don’t feel bad if you never reach his level of Somebody.
Nobody ever will.
------
Image credits: The cape walk: © 2006 thetigerduck, London; used under the fair use provisions of United States law. Apollo Memorial: Benjamen Walker (Flickr), licensed under Creative Commons Attribution 2.0, via Wikipedia. The statue: Sir Mildred Pierce (Flickr), licensed under Creative Commons Attribution 2.0, via Wikipedia.
Sunday, December 17, 2006
The fangs of Katie Couric
If Perky wasn’t dead before, Perky’s sure as hell dead as fried chicken now. That’s the takeaway message both from Katie Couric’s smart reordering of story emphasis on the “CBS Evening News,” and from a published, teeth-bared reaction to her media critics. At this point some three-odd months into her ascension to the anchor chair, Couric’s finding her range as a television journalist, but she’s a bit more thin-skinned than the girl who woke us up so many mornings could ever be.
From the beginning, there were concerns (some of them aired here) about a features-heavy story play when she started. That’s pretty much given way to a story play more or less consistent with the weight of similar stories on the other nightly newscasts. Her lieutenants in the field – David Martin at the Pentagon, John Blackstone covering the American West, Mark Phillips in London, Lara Logan in Iraq – are solid professionals; the interplay between them and Couric seems more heartfelt than with Bob Schieffer, Couric’s predecessor who, for all his skill as a Washington gumshoe insider, didn’t handle pleasantries from correspondents or small talk all that well.
Set goes to Couric. But then it comes undone. She led the Dec. 13 broadcast with a story on holiday shopping when other newscasts led, rightly, with the potential bombshell of the mystery illness of South Dakota Democratic Sen. Tim Johnson, an illness whose worst-case outcome could throw the leadership of the United States Senate, and the pending shift of power to the Democrats, into complete disarray.
The viewing public is a creature of habit, just like anyone in it. To judge from the ratings, viewers are more comfortable with NBC’s Brian Williams or ABC’s Charles Gibson talking to them over dinner. Nielsen Media Research reported that, for the week of Dec. 4, NBC’s evening news broadcast drew 9.1 million viewers in the multichannel universe. ABC’s newscast pulled down 9 million viewers, and the “CBS Evening News with Katie Couric” garnered 7.5 million – a fact that keeps CBS about where it’s been for some time, firmly in command of third place.
Maybe viewers still have a lingering memory of the Rather debacle, or of how CBS tried a variety of anchor experiments before settling on Schieffer and, at last, on Katie.
But two statements stand out as signs of Couric clearly bustin’ a death cap in Perky’s ass, and of her growing frustration with her program’s inability to gain ground on the competition.
La Couric was quoted in James Wolcott’s column in the new issue of Vanity Fair, saying "[w]e kind of ignore people who are observing everything we do and praising, criticizing or analyzing it." That rather imperial diss arrived about the same time that Radar Online circulated excerpts of an interview Couric had with Tom Junod of Esquire magazine. You can almost see Katie’s canines emerging as she speaks, barely disguising exasperation. “You guys even take a shot at me. You have something in the November issue, something about how since I’ve become an anchor, you don’t know me anymore. You don’t know me anymore? Bite me.”
Katie Couric thus joins … [drumroll please] ...
THE PANTHEON OF CBS NEWS ANCHORS
WITH LEGENDARY QUOTES
Murrow: “Good night and good luck.”
Cronkite: “And that’s the way it is.”
Couric: “Bite me.”
Like any worthy professional in a position where results are evaluated on a week-to-week basis, Couric’s eager to make her mark sooner rather than later. But her sniping at Esquire magazine does more than reveal a petulant streak; it suggests that she doesn’t understand the nature of television. A medium that provides more or less immediate information can be expected to generate more or less immediate reactions to that information, and how it’s presented to people.
We’d be inclined to write it off to sophomore jitters if it were anyone other than Couric, an old hand in the medium, especially its morning talk-show wars. But with statements like that, she’s biting the medium that feeds her, and the public that feeds that medium. There’s a sense that she thinks the “CBS Evening News” is supposed to be immune from adverse reaction. If she really feels like that … maybe we don’t know her anymore.
Couric’s testiness is coming, ironically enough, when she’s finding the sweet spot as a TV journalist. Maybe now she’ll dial back her expectations of herself, just a little. When you’ve set those expectations so high, having essentially said you intend to reshape the known model of evening broadcast news, you’re off your meds if you don’t think viewers will react, and fast, favorably or otherwise.
The opening-day hoopla is over, Katie; now it’s time to settle down on the mound and pitch. Don’t forget, the audience is always the umpire. Always. And you’ve been in television long enough to know that.
Monday, December 11, 2006
Cut and [add verb here]
When President Bush appeared at the White House with Vice President Cheney and Secretary of State Condoleezza Rice, he stood and began to outline his “new way forward” for the United States in Iraq. As he spoke, the President of the United States had the look of a man with the fear; his eyes were those of someone stuck in a moment he can’t get out of, quagmired in something he didn't anticipate and can't conjure a way out of.
On Monday President Bush finally, fully looked like what he has been for some time: a man with few options on the situation in Iraq. The way he looks suggests there’s been a hunt for wiggle room as events unfold in the killing grounds of Baghdad and the volatile Anbar province.
The president and some in his inner circle have, in the days since the release of the Iraq Study Group’s report, made a big show of deliberation over its findings. The administration has given lip service, at the very least, to the kind of thoughtful consideration and consultation with others that should have been a part of the White House’s handling of the crisis all along.
President Bush is gearing up to deliver a Major Statement on the war sometime between now and Christmas. We can only guess at its content, but all the digesting and chin-pulling now underway can’t disguise the need for something that is a true departure from the policies of the past. This time he’s going to have to do more than repackage old goods in pretty paper for the holidays. Wrapping the old intransigence inside a new slogan won’t do. Some in his party are announcing their discontent. Sen. Gordon Smith of Oregon is the latest in the GOP to break ranks with the administration’s stand on Iraq.
The method and style of our departure can be described all kinds of ways. But the bottom-line reality will be the same. Whether it’s cut and run or cut and walk, turn, shift, stroll, saunter, sashay, glide, flee, slide, creep, drift, levitate, ambulate, tiptoe, sprint, astral-project, vamoose, depart or dash, the working reality is, or should be, the same:
Within a time frame that needs to be defined and adhered to regardless of the politics on the ground in either Baghdad or Washington, it means to cut and leave – to realize the gravity of a national error and to fully undertake its correction by removing its proximate cause: our military presence in the country, as destabilizing as it is constructive.
It means using the leverage of that time frame to concentrate the mind of the Iraqi government on doing what must be done to remain a government.
It means understanding that there’s no way of knowing if Iraq can work as a democracy until the enforcer’s shadow is out of the picture.
It means that when we leave, whenever we leave, the contours of the Iraqi government will reshape themselves to suit the traditions and values and cultures of the people of Iraq, whether we like it or not.
It means that neither the United States, nor any other member of the phantom coalition, can be the indefinite guarantor of ideals the Iraqi people fail to embrace of their own free will.
It means recognizing and accepting that there’s absolutely nothing we can do to prevent “our enemies,” in the broadest and most imprecise sense of the word, from characterizing our military departure from Iraq as a victory, whenever it happens.
But it also means having grown as a nation, to ourselves and to our allies, by having experienced both the tragedy of error and the power of will to mitigate the impact of that error, and to fix as much of it as possible.
It means beginning to put this nation on a road to a real reconciliation -- red states with blue states, hawks with doves, the Sunnis and Shiites of America -- of one of the most profound differences between us as Americans.
It means a start to restoring America’s place in the world, regaining the moral high ground and existential pride of place that makes this nation what it is.
It means taking some solace in knowing that altering the present course spares us the scale of personal agony and national woe that traumatized this nation in the years of Vietnam.
It means nothing more or less than learning from the mistakes of the past.
-----
Image credit: White House (public domain)
Thursday, December 7, 2006
The wake-up bomb
It’s ninety-seven pages long; you can read it in a day (hell, you probably will before the president does). It was released to the world on Wednesday, a physically modest report on the American military presence in Iraq, the origins of that presence and the consequences of staying there. While some of it is a narrative, codified form of what people in the United States have been saying for months (if not years), all of it represents the biggest challenge to the stasis of the current military effort since the war began.
The Iraq Study Group report may be the wake-up call for the war effort, a document that, while breaking no new ground on offering solutions, is effective both because of its brevity and its bipartisan genesis. The group, commissioned by Congress in March, was composed of ten members from both parties and chaired by James Baker III and Lee Hamilton, two Washington veterans as known for their clear-eyed intellect and drive as for their political inclinations. Consequently, it won’t do for the President to dismiss its findings as the partisan rant of the Democrats looking for payback. The power of the report is the simplicity of its arguments and the variety of voices, from both sides of the aisle, roused to make them.
For some, and despite the sense of fellow feeling the report is meant to convey, criticism of the Republican war effort is implicit in its conclusions. Sen. Barack Obama of Illinois, interviewed Wednesday by CNN’s Anderson Cooper, said “it would be hard not to see in this report a rebuke of an ideologically-driven strategy that has been blind to what’s [been] happening on the ground for the last several years.”
Even before the thing came out, the White House and its proxies were spinning furiously, downplaying the impact of something they hadn’t seen yet. Gen. Peter Pace tried to have it both ways when asked to characterize the progress of American forces in Iraq. “We’re not winning, but we’re not losing,” the general said, uttering words that we never thought we’d hear coming from one of the Joint Chiefs of Staff.
Hell, there’s not a military man in the last hundred years, from George S. Patton to Beetle Bailey, who doesn’t know the fundamental rule of American military might: it is not defined by attrition. Standing still is contrary to the military dynamic – and especially the American military dynamic. With the instincts, power and stealth of a shark, the American military machine is defined by forward action. In the classic zero-sum game, if the U.S. military isn’t winning, it is losing. And we’ve come to understand, over the last three years and nine months, that the Iraq war doesn’t respond to the convenient polarities of “winning” and “losing” anyway, even if the Cold Warriors still do.
For Obama, and no doubt for others, the report is a step toward distilling that kind of verbiage and administration spin into something coherent and accurate. “For the first time, what we’re seeing is a bipartisan agreement about facts on the ground,” he said.
The big question is what President Bush will do with it. Early signs are not promising. Today a testy Bush did a joint standup with British Prime Minister Tony Blair, himself a lame duck as leader of the UK, and essentially repeated his longstanding contention that staying the course is the only viable strategy.
At the root of the president’s persistent calculus is the need to keep American forces in place in Iraq until a viable government can establish rule of law over the entire country. But that reasoning in pursuit of a Middle Eastern democracy that can, in Bush’s words, “govern itself, sustain itself and defend itself” may be ultimately self-defeating.
The truest test of a democracy is what happens when you take away the guns needed to start one. Sooner or later, the democratic ideal can’t be enforced at gunpoint; a democratic government’s got to take root and flourish on its own. Ultimately it stands or falls on nothing more, or less, than the confidence, good will and energies of the people living in it. For that reason, the United States will never know for certain if Iraq is capable of a functional, thriving participatory democracy until we do leave.
The Iraq Study Group report, therefore, calls out the Bush administration to set a timetable not so much for our troops’ withdrawal as for the moment when the Iraqi people decide -- indigenously, on their own terms -- whether they believe in that form of government or not. At the very least, the report is a forthright cry for reality as well as action. In that respect, it’s a valuable document. Let’s hope the White House sees that.
-----
Image credit: The White House (public domain)
Tuesday, December 5, 2006
Goodwrench for President?!
On Wednesday the bipartisan Iraq Study Group will issue its long-awaited report on the situation in Iraq, a document widely expected to call for a change, however predictable to our adversaries and to us, in the United States’ policy of military engagement there. It’s a sign of the report’s anticipated impact that TV news anchors announced plans to broadcast live from Washington – not New York, where they’re all headquartered.
“Live, from the Congressional Printing Office -- ” We’ll find out if that pans out.
But whatever the report says, whatever its recommendations, it will quietly be seen as one more thoughtfully momentous expression of the skills and smarts of one of the group’s prime movers. The panel is also known as the Baker Commission, named for James Baker III – the one man who should instill a deep and abiding fear in the Democratic leadership contemplating 2008.
We’ve been treated to a laundry list of likelies for the 2008 Republican nomination. There’s Rudy Giuliani, the abrasive but passionate America’s Mayor still riding a wave of favorable sentiment “in the wake of” Sept. 11. There’s John McCain, a longtime favorite of moderates in both parties and a party man who may be due to cash in on enduring mainstream popularity. Sam Brownback is weighing his options; so are Arkansas Gov. Mike Huckabee and Massachusetts Gov. Mitt Romney, whose bona fides strike party loyalists as not conservative enough. There are others too, names dropped as much for good cocktail conversation as anything else.
We propose a possible name to add to the ones above, a list of what at this point can only be called the Usual Suspects. You want to drop a name at a cocktail party? Right or wrong, you read it here first: Go with James Baker for President in 2008.
Serious as a heart attack, Republican strategists might well be saying: In a time when the Party is under siege as much from within as from without; when scandals from carnal to financial have undercut the confidence of the party faithful; when the enemy runs Congress; when minorities are deserting the party, lowering percentages that were weak to start with, there’s no danger at all in riding the right dark horse to victory.
Whether he'd be willing to subject himself to the rigors of a campaign is another matter entirely. But on paper, at least, James Baker is precisely the kind of tough, smart, experienced, pragmatic, multi-hyphenate operator the GOP needs in order to win.
If nothing else, consider the man's resume: Baker led the legal team that beat back the Al Gore challenge in the Hanging Chad Incident of 2000. He was White House chief of staff in two administrations, and also served as treasury secretary and secretary of state. A fixer par excellence, he helped broker the deal that assembled the first Gulf War coalition under Bush #41.
Then there are the other factors: Baker’s perceived as a practical centrist, a mechanic at a high level, someone able to wield old connections and a formidable organizational mind to take on challenges critical to the nation’s future.
In a Pittsburgh Post-Gazette cartoon in the latest issue of Newsweek, two elves sit building toys in the runup to Christmas. Another elf, James Baker, stands elsewhere, working on disjointed contraptions in a box marked “IRAQ WAR.”
“Who’s the new elf?” one elf asks his co-worker.
“Santa calls him in when something’s really broken,” the second elf says.
There’s already a quiet undertone of mystery about the 2008 contest, a sense that a dark horse could come on strong from out of nowhere and win the nomination, and the presidency. Right now, the 2008 election is characterized more by what we don’t know than what we do know.
Because of that, Baker 2008 isn’t out of left field at all. A candidate with, uh, unimpeachable White House bona fides and gravitas in the private sector could be the nightmare that wakes Democrats up in the dead of night, drives them into party offices, stumbling over the furniture in the scramble to the white boards and phone banks.
Why? Because Baker’s electable – maybe more than Republicans realize or Democrats will admit.
-----
Image credits: U.S. State Department and the White House (public domain)
Bolton bolts
As anyone could have predicted the day after the election on Nov. 7, John R. Bolton is on his way out as U.N. ambassador. Reading the handwriting on the wall (written in 400-point type after the Democrats swept Congress a month ago), the combative, abrasive ambassador announced that he would step down when his recess appointment ends, at the end of the current session of Congress.
The long knives on Capitol Hill and on the East River can now rest, knowing that Bolton, a lightning rod for administration policies related to Iraq specifically and our relations with other nations generally, will shortly exit the scene.
Newsweek columnist Michael Hirsh understood the relationship Bolton had, and didn't have, with the U.N.: "A brilliant Yale-educated lawyer, Bolton could be sharp and effective negotiating resolutions inside the Security Council. But by temperament and philosophy he had little use for the organization he worked in. Bolton simply never believed that the U.N., or at least large parts of it, had a right to exist.
"Indeed Bolton was so objectionable a presence, even to stalwart allies like Britain, that on a number of votes on which U.S.-friendly countries used to abstain—like anti-Israel resolutions, small arms treaties or family planning programs—they often voted against Washington. During his almost 18 months in office, a once-mild caucus of developing countries inside the U.N. called the G-77 came to be a strong unified voice against Bolton—and against America."
This is the great -- but not insurmountable -- challenge facing the next American envoy to the U.N.: correcting the damage done over the past 18 months by an ambassador of a nation at odds with the world his ambassadorship says he's a part of.
Quiet as it’s kept, one of the main reasons Bolton gained no real traction at the U.N. is illustrated by what it took to get him into the job in the first place. Bolton was a recess appointment, named to a senior post through a loophole in the process of congressional oversight of weighty appointments. Basically, that loophole gives the president the leverage to appoint someone to a position while Congress is in recess, bypassing the usual vetting and confirmation process.
It's all legal and above board; there are no constitutional crises that arise when the president decides to make a recess appointment. But at the end of the day, it just looks bad when a president has to do an end-run around another branch of government to nail down a hire he wants. That's all. It seems petty and dictatorial. It just ... looks bad.
And one suspects that other diplomats around the world won't be swayed into thinking that what they'd observed on Capitol Hill and the White House over the previous year was just a bureaucratic foulup, a technical fact of life, like the Brits debating whether or not to wear wigs in the House of Lords. They'll rightly see the Bolton appointment for what it is: something bigger, something deeper than cosmetics, nothing less than a fight for some aspect of the soul of the country.
The next ambassador will be vetted and confirmed, or not, by a majority Democratic Congress, one presumably enlightened by the idea of finding a conciliator to work the halls and podia of the one globally-accepted forum for conciliation.
The challenge for the Congress is to approve someone ready to rise above political agendas, to embrace the United Nations rather than bridle at the deliberations that make the U.N. what it is. The challenge for President Bush is to nominate someone who'll do that.