Romney 2016

For a land of supposedly endless second chances, it’s striking to notice the viciously unforgiving nature of one of America’s most entrenched political traditions: if you lead your political party to defeat in a presidential election, you never get another try.

Slow news summer that it is, one of the manufactured controversies of the moment is whether Mitt Romney could rise from the ashes of his failed 2012 presidential run to emerge a credible contender in 2016. Former advisors and donors seem to be doing their best to fan the flames, writing editorials, organizing petitions, and leaking choice quotes to anyone who will listen that their guy still has one more fight in him, if only we’d give him a push.

Sounds great to me. Regardless of who deserves the blame, much of what Romney predicted a second term of Obama would bring has been brought, from the post-pullout collapse of Iraqi security to ongoing Obamacare woes to no-end-in-sight dithering on the Keystone pipeline. With 53% of Americans said to want a do-over of 2012, simply announcing “you still can” seems like a pretty compelling pitch at this point. As Allahpundit observes, the I-told-you-so campaign ads “write themselves.”

Yet aside from the man’s personal distaste for another long, expensive slog through the campaign muck (a slog, as the documentary Mitt vividly illustrates, the candidate himself wasn’t even particularly giddy about the last go-round), the lead obstacle to such a supremely rational Republican strategy seems to be American political culture’s entrenched stigmatization of failed presidential nominees.

Where this tradition came from is hard to say. American history is rife with former presidents seeking non-consecutive terms, including Martin Van Buren, Teddy Roosevelt, and Grover Cleveland — who actually won one. In more recent times, we all know the story of Richard Nixon losing to JFK in 1960 only to win renomination — and the White House — in 1968. And obviously there’s scant taboo against seeking a party’s presidential nomination more than once; the vast majority of post-war presidents and vice presidents, in fact, have launched at least one unsuccessful primary bid for their party’s nod prior to getting it.

My own theory is that a series of devastating losses for a succession of weak presidential nominees in the late 20th century — Mondale in ’84, Dukakis in ’88, Dole in ’96 — helped solidify a trope of the “presidential loser” (portrayed hilariously in this Futurama scene), in which failed candidates are just fundamentally pathetic, hapless characters.

This runs contrary to the political culture of most other western democracies, in which losers can, and do, lead their parties to multiple defeats before eventually eking out a win.

Canada’s Stephen Harper lost his first bid for prime minister in the country’s 2004 general election, but retained his party’s support to make a second go in 2006, which he won. The new conservative prime minister of Australia, Tony Abbott, similarly lost his first bid for power in 2010 before winning in 2013. Ditto for Israel’s Benjamin Netanyahu, who was his party’s candidate for prime minister five times, but won only three.

The advantage of multiple runs is obvious: a familiar face means less time is spent campaigning on biography and resume — the dreaded “introducing yourself to voters” — and more time on issues. Voters might be sick of you, sure, but that fatigue is not without strategic benefit: having “heard it all before” applies to insults as well as slogans. Just as criticisms about President Obama’s socialism and hidden agendas seemed stale in 2012, so too would the tired tropes of Romney as an out-of-touch aristocrat be pretty boring in 2016. Indeed, Romney could take particular comfort from the fact that many of the overseas conservative leaders cited above were deemed “too right-wing” during their first run only to have that charge seem considerably less frightening after another couple of years with the progressives in charge.

It’s possible the GOP could do better than Romney in 2016, but it’s equally likely they could do a lot worse, too. He’s certainly a man with strategic and ideological flaws worth considering, but the fact that he ran and lost four years ago shouldn’t be one of them.

 







Iraq’s Parliamentary Problem

Recent coverage of Iraq’s internal breakdown has focused mostly on the rampaging horror of ISIL, and rightfully so. But the comparatively drier story of the political decay of Prime Minister Nuri al-Maliki is a tale inseparably linked to that same violence — or at the very least, to the American response to it.

In his recent New York Times interview, President Obama specifically linked his restrained bombing campaign of select ISIS targets with a desire to keep Maliki weak and unpopular. He was not going to use American power to “bail out” a flailing government, he said, noting that the United States will not be a firm ally of any prime minister until they prove they’re “willing and ready to try and maintain a unified Iraqi government that is based on compromise.”

Understanding the inability of the Iraqi political class to fulfill this demand is a story of the failure of Iraq’s parliamentary political institutions.

The post-Saddam Iraqi constitution gave the country a parliamentary system moulded in traditional European fashion. It featured a party-list based electoral system, a figurehead president appointed by parliament, and an executive prime minister selected from among the factions of the legislature.

In 2005, the year of Iraq’s first general election, a formal alliance of Shiite parties, led by Dawa, an Iranian-backed ex-terrorist group, won a strong plurality of seats, and after months of negotiations with Kurdish and Sunni parties — whose votes were needed for an outright majority — Dawa deputy leader Nuri al-Maliki was confirmed as prime minister (the party’s actual boss, Ibrahim al-Jaafari, having been deemed too religiously dogmatic).

In elections five years later, Maliki’s Shiite coalition narrowly lost its plurality to the secular, pro-western party of longtime Bush administration darling Ayad Allawi. Yet Maliki was able to stay prime minister by forging a parliamentary alliance with a smaller, more extreme Shiite faction led by a clique of fundamentalist clerics including the now long-forgotten Moktada al-Sadr. This was controversial at the time, but it was consistent with the generally understood parliamentary custom that the incumbent PM should get first crack at forming a coalition government post-election — a precedent ultimately upheld by the Iraqi courts.

Though he had originally come to power with multi-denominational backing, the longer Maliki remained in power, the more brazenly sectarian his government became. This was largely a byproduct of his country’s worsening Sunni-Shiite civil war. A life-long Shiite partisan, Maliki had few qualms about using his position as commander-in-chief to deploy grossly disproportionate violence to crush suspected hotbeds of Sunni extremism (emphasis on suspected) or purge suspicious Sunnis from senior positions in the military, intelligence service, bureaucracy, and cabinet.

Those who expected this dark legacy of division, bloodshed, and favoritism to eventually be rejected by voters were shocked when Maliki’s coalition was able to regain its parliamentary plurality during elections held in April of this year. The Obama administration seemed particularly crestfallen.

Yet good news of a sort arrived this weekend when a fresh procedural bombshell was dropped — word came that Iraq’s president had requested Dawa’s deputy leader, Haider al-Abadi, to assume the prime ministership in Maliki’s place.

Under the terms of the Iraqi constitution, this was within the president’s prerogative — like a constitutional monarch, the Iraqi president is supposed to formally summon the leader of parliament’s “largest bloc” to assemble a government, with Article 76(iii) granting him the additional power to nominate someone else if the initial nominee is unable to get things together within 45 days.

But Maliki had not formally passed that deadline. Despite the fact that this most recent election was held over three months ago, the countdown for assembling a government does not begin until election results are ratified and parliament formally appoints a president — which only happened on July 24. Maliki is also quite indisputably still leader of parliament’s “largest bloc”; members of his coalition have denounced the president’s alternative pick as representing “no one but himself.” Maliki, for his part, has dubbed the whole thing a “coup,” and some are predicting the constitutional standoff may result in a complete collapse of Iraqi political authority at the moment the country needs it most.

It is, of course, naive to blame any country’s political dysfunction entirely on the system of government it uses. Yet it’s hard to deny that Iraq’s preexisting political problems have likely been exacerbated by the country’s decision to adopt a complex, European-style parliamentary model, with a proportional representation electoral system that incentivizes politicians who appeal to trans-geographic religious identities and an executive branch that produces rulers who owe their power to a mastery of parliamentary maneuvering, rather than broad-based popular approval.

Had Iraq instead chosen to adopt a blunter presidential system — with a strong executive president elected by multiple-round popular vote and a separately-elected parliament from which ministers could be chosen — many of the country’s problems would doubtless still exist, and possibly even some new ones. Yet the fundamental question of who gets to rule the country would have been far less ambiguous and contestable, and the creation of a unity cabinet much faster and easier.

If Iraq’s political authority does completely break down in coming weeks, the temptation will be strong to insist its people were “never ready” for democracy, and declare the experiment failed. Yet democracy comes in many flavors and the taste Iraqis were given was a decidedly acquired one.

Unfortunately it’s probably too late to try another.

 







The media mess blame game

There was a clever cartoon in the San Francisco Gate some years ago, drawn by the hilarious (and unjustifiably obscure) Don Asmussen. It depicted a newspaper blaring the timely headline: “MEDIA SHOCKED BY DECLINE OF MEDIA — ‘IS THIS THE END OF MEDIA?’ ASKS MEDIA.”

The slow decay of mainstream journalism into a decrepit, profit-hemorrhaging husk is supposed to be one of the great tragedies of our time, one that has provoked media people to produce no shortage of opinions, theories, and — most importantly — blame to fling around. That the media may not offer the most objective analysis of this question seems rarely contemplated.

I recently listened to an episode of the Canadaland podcast — which offers weekly media-asks-media analysis of this country’s crumbling journalism scene — about the fall of a short-lived Toronto weekly known as The Grid. The magazine, we were told, “did everything right” but still flopped financially. We were told this in interviews with writers and editors who used to work there, who of course were thoroughly convinced of their own brilliance and competence. There was zero conversation with anyone representing the public, which was a tad odd, since the magazine’s financial failings were explicitly due to unprofitable advertising, which presumably indicates at least some trouble with audience engagement. Instead, fingers were pointed at the traditional hazy devils: management, technology, “trends.”

There is a legitimate concern that journalists are creating what Marxist-types would call a class ideology: a collection of defenses for self-interested behavior disguised in the language of morality. The idea that the stories that matter the most are the stories the reporter subculture most enjoys reporting on, for instance. Or that journalistic ethics should be forever defined by whatever standards are being used right now.

Having increasingly little power to justify, these ideological tropes now merely constrain journalists’ ability to accurately diagnose their own plight, and dream up viable cures.

John Oliver’s recent viral rant against “Native Advertising” was revealing. At precisely the time folks like The Grid team are bemoaning an advertising-based revenue model that’s failing to deliver the goods, tastemakers of the Official Ideology are waging a furious propaganda war against incredibly lucrative new techniques.

Native advertising is basically just a 2.0 name for “advertorial” content, or an advertisement that takes the form of semi-disguised written copy. It’s not a terribly new practice, nor is it particularly sinister. Two of the examples Oliver seemed most horrified by were a Buzzfeed article about cleaning technology written by Swiffer and a New York Times piece on women’s prisons by the Orange is the New Black people. Yet such mildness is nevertheless denounced as representing a profound existential threat to all that’s right and principled about the journalist’s craft, making those who collaborate “whores” or worse. (One wonders if there were similar conniptions the first time someone suggested printing advertisements in newspapers at all.)

But if media pride has atrophied media skill at what Orwell dubbed the “constant struggle” of seeing what’s before one’s nose, there appears to be an equally powerful impulse on the part of consumers to abdicate responsibility as well, through a lazy populist righteousness that’s no less ideologically destructive.

My friend Graham wrote a fine essay on Medium the other day lambasting an entitled and hypocritical reader class who constantly demand quality journalism, yet consistently resist purchasing online subscriptions, and indeed, go one step further and install ad-blockers to prevent themselves from even inadvertently providing the revenue needed to finance this want. Graham chalks this up to brazen cognitive dissonance, but I’d also blame the convenient myth of a biased, superficial media that gets trotted out every so often to justify consumer apathy. This chart by I Fucking Love Science, for instance, which purports to show all the stuff “the media doesn’t cover,” is quite obviously straw man nonsense, yet such ritualistic denunciations of a supposedly “celebrity obsessed” press provide a necessary veneer of principle to an otherwise entirely selfish abdication of public responsibility.

As is the case with most troubling societal trends, I’m convinced our current media troubles are mostly cultural at root, and demand cultural solutions.

At the very least, self-flattery will get us nowhere.







Impeachment Horror

I recently finished reading Jeff Toobin’s A Vast Conspiracy, an epic 448-page chronicle of the Monica Lewinsky scandal, from its earliest beginnings as an obscure sexual harassment lawsuit in Arkansas to the second-ever impeachment of an American president. My interest was sparked by Monica’s recent and very thoughtful essay in Vanity Fair, which brought her decades-old story back into public conversation. The tale’s only become more timely since, now that talk of presidential impeachment (spurious or not) has reentered the headlines.

It seems the minute a president enters his second term, partisan foes begin to chatter about whether he’s impeach-worthy. It’s a sentiment born partially from frustrated resentment (no one likes to lose twice to the same guy), partially from opportunism (the Congressional opposition almost always gains seats during a president’s first term), and partially from the White House itself, for whom rallying against an “impeachment obsessed” opposition can be of great material benefit.

So present rumblings over the possible impeachment of President Obama will probably only get louder in coming months. What lessons can today’s giddy Republicans learn from their predecessors’ failure?

First: have a clear-cut, impeachable offense.

It was never entirely clear why Clinton was being impeached, which allowed accusations it was “all about politics” or “all about sex” to fill the ambiguity.

Republicans furiously believed the Clinton White House was hopelessly corrupt, and Clinton himself embarrassing and immoral, yet they ultimately chose to impeach him for two incredibly narrow, legal offenses: lying to a grand jury about the affair with Monica he had earlier denied under oath in the Paula Jones harassment suit, and obstructing justice by conspiring with Monica in various ways to ensure her corroborating silence.

Constitutional scholars generally agree that presidents can be impeached for just about anything, with the constitution’s vague standard of “high crimes and misdemeanors” defined through centuries of English precedent to mean, in the famously glib words of Gerald Ford, “whatever a majority of the House of Representatives considers it to be.” Yet Toobin argues the 1990s heralded an era in which the judicial system “took over the political system” and it became received wisdom that political battles should be fought through lawsuits and litigation rather than traditional constitutional mechanisms. Republicans thus decided to impeach Clinton on the grounds he was a petty criminal, as opposed to simply unfit for office.

Second: have the numbers.

In contrast to the impeachment proceedings against Richard Nixon, which enjoyed some semblance of bipartisan support, every Congressional vote in the long slog to remove Bill Clinton was almost perfectly party-line.

This rank partisanship doomed Clinton’s impeachment from the get-go. Since the final vote in the process — the one that actually expels the president from office — requires a two-thirds majority in the Senate, even the GOP’s healthy majority in both chambers was not sufficient. Some Democrats had to get on board, but because Clinton’s impeachment was perceived as a hysterically ideological Republican plot (a “vast conspiracy,” if you will), none ever did. This was a direct byproduct of problem number one; because the formal argument for impeachment was confused and weak, it remained powerfully unpersuasive to the other side.

Third: have public support.

Perhaps the most famous factoid of the Clinton impeachment is that the President’s approval numbers actually went up during it. Such sympathy appears even more justified in retrospect; the “peace and prosperity” of the 90s remains enviable, and Clinton’s competence as an administrator, whatever his faults as a man, contrasts sharply with that of his successors.

Had the Republicans upheld the Founders’ intent, and sought to remove Clinton on the subjective, but entirely legitimate grounds that he was too crooked, unethical, and undignified to be president — as embodied not just by the Monica affair, but Whitewater, Travelgate, the Lincoln Bedroom and whatever else — it’s possible their crusade would have seemed a tad more reasonable. But it still would have failed, simply because the American public did not share this conclusion, and Congress knew it.

President Obama is vastly less popular than Clinton, with large percentages believing he’s behaved improperly in a number of high-profile situations. Yet support for impeaching him sits at a dismal 33%, with estimates suggesting backers are around 90% Republican. And of course even in their best-case 2015 scenario, no one thinks the GOP will be holding two-thirds of the Senate any time soon.

The lasting legacy of the Clinton impeachment was the delegitimization of impeachment in general, and to the extent the episode was a gigantic waste of time perhaps that’s fair. Yet at its core, impeachment is simply a constitutional device for removing an unacceptable ruler, so it’s hard to argue the democratic interest is well-served by perpetuating this cultural stigma.

Even if the answer is no, it remains a proposition worth occasionally proposing.







Fresh battle lines being drawn in America’s culture wars

The culture war is dead. Long live the culture war.

It’s fashionable to observe that many of the most contentious social policy cleavages of the 1980s — when America’s “culture war” meme first went mainstream — are now the stuff of broad consensus.

Debates on the appropriate presence of public prayer have concluded in the minimalists’ favor. Universal legalization of same-sex marriage is perhaps a year away. Conservatives made peace with unwed motherhood to double down on abortion, and paid for the strategic blunder — a large majority remains in favor of keeping the procedure legal in “all” or “certain” circumstances.

Yet as the previous decades’ debates wind down, fresh moral quandaries about the standards and values of American life emerge. Finding harmonious answers to these new dilemmas of tolerance, identity, and individualism will be a defining struggle of the millennial generation.

For as long as we’ve been fretting about discrimination it’s been said that one man’s innocent quip is another man’s slur. As overt bigotry becomes increasingly rare, it’s the innocent quips that now receive the hottest fire, zealousness having not moderated in conjunction with the lowering of stakes. Today’s tolerance activists speak of “microaggressions,” small indignities of language and manners, such as asking a visible minority where he “came from” or suggesting a woman carry the lighter box. Movies and television shows are now meticulously scrutinized using standards like the “Bechdel Test” to ensure females and minorities are portrayed in the most flattering, self-actualized fashion, with an equally fussy eye cast towards nouveau sins like “exoticism” and “othering.” Even perfectly post-modern public figures like Stephen Colbert and RuPaul have proven but a mere ill-utterance away from triggering shame campaigns from a self-appointed vanguard of “what’s not okay.”

This goal of a zero-tolerance culture is invariably at odds with unsuppressed freedom of expression and that which it produces: honest commentary, diverse storytelling, insightful humour, complexity of language and thought. The critics seek to restore the legitimacy of censorship, or at least self-censorship, in which an extra layer of nervous second-guessing must be applied to all intellectual and creative output in order to make ideas subordinate to a ruling ideology capable of punishing dissent.

Struggles over acceptable means of restraining intolerance are in many ways the outgrowth of a larger philosophical split over the nature of identity, and the privileges a citizen should claim through self-applied labels and group affiliation.

Transgender Americans have found success lobbying for legal inclusion as a class protected from open discrimination. Yet anxieties remain over the community’s existential thesis that gender itself is inherently fluid and subjective — an assertion for which the science is hardly settled, yet which was recently decreed official fact by the school board of Vancouver. Today we have autistics who oppose efforts to cure their condition on the basis they were merely born different, not ill, a resistance to the “medicalization” of identity shared by an increasing assortment of unpopular demographics, including the obese (who dismiss BMI measurements as quackery), serious drug users (who euphemise their “abuse” as simply “misuse”), and schizophrenics (who self-identify as “multiples”).

Homosexuality was of course once considered a mental illness. That’s no longer the case today, yet whatever biological variables do explain the phenomenon remain ambiguous, and many are not eager to see things clarified. A faction that values the cultivation and preservation of diverse identities and stigmatizes efforts to assimilate or “fix” deviant behavior is destined to clash with those seeking definitive scientific explanations for life’s mysteries.

A similar cleavage divides those who fetishize the total supremacy of the individual against those who worry about behavior’s societal consequences. A friend of mine recently wrote an essay fretting about the sadomasochism renaissance prompted by Fifty Shades of Grey; he worried that a culture elevating individual “consent” to its highest good will be one thoughtlessly normalizing behavior that’s socially destructive in a broader sense — in this case, violence against women. Increasingly loud proposals to normalize other historic taboos — recreational drug use, prostitution, violent video games, child pornography — spawn similar concern. The anti-individualists ask at what point the pursuit of “harmless” personal pleasure corrupts the virtues of the larger society these individuals comprise. The individualist-supremacists flatly deny the possibility.

None of these tensions are particularly new, but the battlegrounds are decidedly 21st century. As opinions congeal around fresh struggles to balance choice, evidence, identity, and opinion, old understandings of the political divide are overthrown. These new battles pose a particular threat to the progressive left, which may be doomed to split into warring post-modern and libertarian factions.

Divisive, ideological, and often personally threatening, the culture wars of the future will not be pleasant. But few wars are.







Lack of Pride

“When is Pride this year?” straight friends ask, voices rising with equal parts excitement and condescension. As a gay, I’m presumed to be well-versed in such things, but alas, there’s no easy answer. There exists no single “Pride,” after all, simply a long sequence of independent extravaganzas across North America, each observed on conveniently different dates.

Los Angeles Pride happened long ago, for instance, on the traditional first weekend in June. San Francisco Pride — the big one — has come and gone as well, occurring, as it does, on the last weekend of that same month. Vancouver Pride kicks off this Saturday, and Vegas Pride comes a month after that, in early September. Countless other cities are sprinkled somewhere in-between.

Such cleverly staggered scheduling, which allows the don’t-stop-the-music set to engage in summer-long “Pride tours” across the continent, hopefully helps illustrate the fundamental vacuousness of this would-be holiday. It’s certainly one of many variables justifying my profound disinterest in it. Despite being gay for as long as I can recall, I not only shun Pride, I actively resent the implication that attendance offers any meaningful indication of one’s GLBT acceptance.

I’ve always felt a bit of sympathy for Rob Ford’s various mumbled explanations of why he’s never attended Toronto Pride during his four years as mayor. His no-show status has of course been widely taken as proof of his supposed homophobia, but his official excuse — that he’s simply an old-fashioned guy who finds the garish flaunting of sexuality uncomfortable — seems perfectly reasonable. To modern elite opinion-makers, however, who have done so much to inflate Pride as the culture’s leading litmus test of tolerance, personal uneasiness is a sentiment so exotic it may as well be uttered in Swahili.

While I’m no prude — actually, strike that, I am a prude. And what of it? Flipping through online albums of Toronto Pride 2013, one finds ample documentation of S&M bondage couples, barely-there thongs, buttless chaps, and all manner of grinding, thrusting, jiggling, and twerking. It’s perfectly acceptable to find such things gross or distasteful, and an exploitive cheapening of both sex and the body.

It is no great character flaw to value modesty or dignity, nor is it bigoted to esteem forbearance and control. Libertine attitudes towards sex, nudity, fetishism, and exhibitionism are issues entirely disconnected from the civil rights matter of whether people of divergent sexual orientations are deserving of the same rights and protections as those in the majority. To argue the contrary is to claim possessing a minority sexual preference should be synonymous with sexual deviancy in general — a premise not only dated, but dangerous.

There was a clever Onion piece published more than a decade ago (I doubt such a thing would be written in this more sensitive age) headlined “Local Pride Parade Sets Mainstream Acceptance Of Gays Back 50 Years.”

“I thought the stereotype of homosexuals as hedonistic, sex-crazed deviants was just a destructive myth,” the paper quotes one horrified onlooker. “Boy, oh, boy, was I wrong.”

Sounds about right. Indeed, one has to wonder just how much comfort Pride is even attempting to offer the genuinely sexually conflicted at this point. Considering how much “coming out” anxiety tends to center around fears of lost normalcy, it’s not clear at all how declaring common cause with society’s most brazen display of freakshow non-conformity is a useful means to that end.

Looking at photos of North America’s earliest Pride parades is a window into a different world. The marchers of those days, calmly holding hands with their same-sex partners in sensible polo shirts and penny loafers, were certainly subversive, but only to the extent they were seeking to remind a society in denial of the unavoidability of their existence, and the bland, non-threatening nature of it. Theirs was a call for inclusion in the most literal sense: the welcoming of homosexuals into society’s most central institutions of family, work, religion, and politics, and the acceptance of their love as every bit as valid as any other sort.

That goal having now largely been achieved, the Pride movement, like so much of the modern Gay Rights activist complex, has become a victim of its own success. As North Americans get used to people being here and queer, the moderate LGBT middle class has drifted away from leadership of the tolerance movement, allowing the wild fringe to fill the void. What results is a historical irony: just as society is most eager to assert its tolerance, Pride redefines the deal. Endorsing the acceptance of ordinary people distinguishable only by what gender they love now demands an additional stamp of approval for all-purpose indecency and licentiousness.

Politicians, corporations, and all manner of interest groups clamor to agree to the terms. But for an increasing lot of gays, it’s hardly obvious why we should care.







Indefensible Hamas


There are plenty of perfectly good criticisms to be leveled against the State of Israel. Personally, I’m quite troubled by the so-called “demographic time bomb” theory, which posits that Israel’s increasing Arab and Palestinian birthrates ultimately doom the Jewish nation to embrace some ugly form of minority-rule. And of course we’re all well-versed in the gross spectacle of settler expansion into the West Bank, a brazen effort at colonial growth at exactly the moment the Palestinian territories are supposed to be inching towards independence.

Yet the mere existence of Israeli sin should not blind anyone to the greater evils of its enemies.

This is the sort of blunt moral judgment that’s been traditionally uncouth among fashionable western progressives, who often feel the need to affect great open-minded exasperation at the Israeli-Palestinian conflict, bemoaning that “fault exists on both sides.” Such is the default position of those ideologically inclined to regard assertive side-taking as a symptom of an unsophisticated mind, with “blind” support of Israel in particular a worrying proxy for some other form of close-minded ignorance — Millennialist Christianity, perhaps.

Yet in the wake of the current war between the Israeli government and the Islamic Resistance Movement — better known as Hamas — that’s running the Gaza Strip, even the traditional progressive skepticism seems to be breaking down. As Israel’s Palestinian resisters become more nihilistic and radical at precisely the time the Israelis are getting more sensitive and cautious, the lopsided moral imbalance is becoming harder to ignore.

The traditional Israel-bashers are certainly looking more pathetic than usual. The buffoonish United Nations Human Rights Council drew up a monstrously biased report on the Gaza war the other day, which predictably sailed to approval on the votes of the various third world dictatorships who comprise the body’s largest bloc. Yet it was telling that no nation resembling a first world democracy could be persuaded to support it. Of the 17 abstentions, almost all noted with concern that the Council’s chronology of the conflict was a bit one-sided, to put it lightly. The brusque four-page report does not include the word “Hamas” once, and instead speaks only of Israeli aggressors inflicting “widespread, systematic and gross violations of international human rights and fundamental freedoms” against the hapless peoples of “Occupied Palestine.”

Nowhere was it mentioned that the Gaza Strip actually ceased to be occupied back in 2005, as the late Ariel Sharon painfully extracted every remaining Jewish settler and soldier from the territory.

Nowhere was it mentioned that Hamas explicitly pledges to “obliterate” the state of Israel in its founding charter — “by Jihad,” in fact.

Nowhere was it mentioned that Hamas leaders have long spoken of “Jews” in the most generic terms as their enemy, and that their preferred military tactic in the current conflict — lobbing over 2,500 missiles into major population centres — has made urban Israelis the war’s true civilian targets.

Nowhere was it mentioned that Hamas has transported weapons in ambulances, housed missiles in schools, mosques, and hospitals, and disguised their fighters in Israeli uniforms — all clear violations of the codified laws of war.

Nowhere was it mentioned that the Israelis have so far discovered over 30 multi-million dollar “terror tunnels” spiraling out of Gaza (built in part with alleged child labor) that serve no purpose other than to turn western Palestine into a launchpad for guerrilla aggression against its neighbor.

Nowhere was it mentioned that just a few days prior, Hamas refused a comprehensive ceasefire backed by basically everyone who matters: the Egyptian government, the Arab League, the United Nations, the EU — even old man Obama, if anyone still cares about him.

Nor, for that matter, did the report mention the exceedingly cautious conduct of the Israeli forces in what they’re calling “Operation Protective Edge,” a reputation-conscious nervousness so thoroughly unprecedented in modern warfare it’s almost certainly harmed national security.

While Israeli civilians have been largely protected from Hamas rockets by the country’s awesome Iron Dome missile defense system, Palestinian civilians are protected by an Israeli shield of their own: an elaborate system of advanced warnings to residents of Gazan neighborhoods targeted for bombing. The system includes everything from text messages and personalized phone calls to noisemaking “dummy bombs” (so-called “roof knocking”) and even airdropped maps steering civilians to refugee centres. Such has been the IDF’s painstaking effort to minimize casualties while attacking one of the most densely-packed places on earth, yet Hamas has ensured the Palestinian death toll has remained high anyway, glibly encouraging Gazans to dismiss Israeli warnings as “psychological warfare.”

Prime Minister Netanyahu took some flak for noticing that last bit, concluding on American television that Hamas seems to enjoy the existence of “telegenically dead Palestinians.” Yet it’s an indictment that’s difficult to avoid given how effective the conflict’s 570 Gazan victims have proven in forming a narrative of “disproportionate death” — the only argument Hamas can peddle for foreign sympathy. In any case, surely a group cynical enough to engage in talks with North Korea to replenish its depleted missile supply would hardly balk at the indignity of ratcheting up its own body count for propaganda purposes.

A dispassionate analysis of facts like these — facts which are not the result of clever cherry-picking on my end, but simple observation of the broad character of the Gaza conflict to date — cannot help but lead to a simple conclusion: Israel is better than Hamas.

To conclude this isn’t to posit that Israel, and the current Israeli government in particular, is without failing in other contexts, nor to even make a value judgment about the broader merits of Zionism, if you’re still a skeptic. It’s simply to note that what we have right now is a secular, liberal democracy fighting the aggressions of a lunatic death cult that seized power in a military coup and is actively loathed by the long-suffering captives it purports to rule, with the conduct of each side following accordingly.

Whether or not that’s an accurate summary of the Israeli-Palestinian conflict in general, it’s certainly true of this one.

It demands an appropriate reception.







Power Suit


What makes the American model of government superior to most others is its elaborate web of checks and balances. Like a Möbius strip, the chart of American government depicts three branches each extending an arrow of oversight towards the other two, creating a tightly interlocking network of watchmen being watched. No matter what one branch does, the others always have avenues of recourse.

On paper, at least. In practice, alas, not all checks are equally balanced.

While no one disputes the blunt effectiveness of a president vetoing a bill of Congress, the Senate refusing to confirm a judge, or a judge rejecting an unconstitutional decision of the White House or legislature, Congress’ ability to rein in the executive has always proved the most daunting challenge.

A presidential veto can be overridden by Congress, but that requires the two-thirds approval of both chambers, something only possible in the case of legislation boasting enormous, bipartisan popularity, such as the 2008 Medicare funding bill unsuccessfully vetoed by George W. Bush, or President Clinton’s attempt a decade earlier to cancel popular military spending initiatives in a variety of districts held by politicians of both parties. In all, there have been fewer than 10 overrides in the past 20 years.

Then there’s impeachment, which, though actually easier than overriding a veto — requiring, as it does, merely a two-thirds majority in the Senate and a simple majority in the House — has become perhaps the single most stigmatized provision of the US constitution. America’s long tradition of presidential stability has made even contemplating the removal of a president mid-term a taboo of enormous proportions, a fact only further complicated by the legacy of the Clinton years, which established something of a legal-cultural consensus that presidents only deserve to be unseated for serious criminal misdeeds, as opposed to merely moral or political ones.

To be sure, Congress can handicap a president. They can defund his pet projects, as Republicans are always threatening to do with Obamacare, or simply ignore his requests for action, as has been the case with… well, you name it. But as modern presidents have embraced an increasingly maximalist understanding of their constitutional powers, the rising challenge for Congress has been the question of how to restrain a president whose most objectionable decisions are made unilaterally.

Barack Obama has often interpreted his mandate in unusual ways. A common refrain, echoed most recently during his Rose Garden vow to “fix as much of our immigration system as I can on my own without Congress,” is that the need to make policy supersedes the need to respect constitutional procedures for making it.

In the case of immigration, the President is tilling familiar ground. In 2012 he unilaterally declared a two-year amnesty (since extended to four) for the approximately 800,000 illegal immigrants who arrived in America as children. It was a move explicitly intended to compensate for Congress’ failure to pass the so-called Dream Act a year earlier, which promised similar legal relief for America’s inadvertent aliens. Where legislation failed, rule-by-fiat would succeed.

Selective enforcement of the law has likewise been the preferred Obama approach to drug policy. In 2009, Attorney General Holder declared the United States would not enforce federal drug legislation in states that had legalized marijuana for medicinal purposes, and in 2013 he expanded that blind spot to include states that legalized it for recreational use, too. The Justice Department has announced similar plans to stop prosecuting drug offenders when they deem the mandatory punishments excessively harsh. The underlying logic, apparently, is that laws should only be upheld to the extent they serve the President’s ideological ends.

Then there’s Obamacare, whose finer points were all implemented through executive action, most notably the imposition of the everyone-has-to-have-insurance-now deadline (Congress’ law said six months ago; the President says 2016), but also this whole business of forcing employers to cover morning-after birth control that the Supreme Court recently designated an unjust burden on corporate religious freedom.

In response to the administration’s handling of the Obamacare rollout in particular, Speaker John Boehner has announced he plans to sue the White House for unconstitutional behavior, namely a dereliction of the duties mandated by Article II, Section 3: “[The President] shall take Care that the Laws be faithfully executed…” Though what specific redresses the suit will seek have yet to be disclosed, an ideal ruling would presumably compel the administration to begin imposing the Obamacare insurance mandate right away — you know, like the law was supposed to.

Is this wise? The legal establishment seems skeptical. Asking the judicial branch to resolve a conflict between the executive and legislative branches has little precedent in American history, elevating, as it does, the courts to the status of supreme referee of intergovernmental jurisdictional disputes — itself a proposition of dubious constitutionality. On the other hand, the more constitutionally orthodox prescription for Congressional problems with a president — impeachment — seems not only absurdly radical, but politically suicidal. But still, you gotta do something.

President Obama’s Republican predecessor, of course, faced constant abuse of power criticisms of his own, though it’s worth noting that much of the Bush-bashing involved disputes over what is and isn’t within the president’s prerogative as “commander-in-chief,” one of the constitution’s most disputed phrases. In the end, Congressional Democrats elected to do little more than obstruct, complain, and run out the clock — a technique Republicans may ultimately have no choice but to emulate.

Term limits have always been controversial, but they remain the only long-term defense against an executive restrained by little else.







The limits of liberalism

Over the last couple of decades, a dominant narrative of North American politics has been the dangers of drifting too far to the right. From Tim Hudak’s doomed bid for the premiership of Ontario to the surprise defeat of the Wildrose party in Alberta to self-destructive Tea Party campaigns across the United States, the explanation for why so many conservatives can’t get it together appears obvious to most. To paraphrase Margaret Thatcher, right-of-center candidates are placing too much emphasis on the adjective and not enough on the preposition.

Far less contemplated these days is whether there is any negative cost to be incurred from drifting too far to the left, particularly now that progressives increasingly define themselves through boastful acceptance of previously-stigmatized personal behaviour.

The aspiring candidacy of Jodie Emery, Vancouver’s so-called “Princess of Pot” and spouse of recently-released Canadian drug lord Marc Emery, may prove a revealing case study.

Mrs. Emery is currently seeking the Liberal nomination in the parliamentary riding of Vancouver East, and she’s hardly hidden the fact that her primary purpose in running is to advocate for the legalization of marijuana, the Emery family’s pet cause. Legalization is a position favored by Liberal boss Justin Trudeau as well, but to suggest the two are on the “same side” of the issue is to betray its moral — and electoral — complexities.

Justin’s stance is essentially a utilitarian one: he sees legalization as a way to battle organized crime and liberate an overburdened criminal justice system. Yet he’s also described consumption of the drug as a “vice” with scant social virtue. His much-ballyhooed admission of prior use was heavily coached, and if not exactly remorseful, was certainly qualified and self-conscious. His legalization plan, though vague, has emphasized the importance of keeping cannabis far from children, and he’s bemoaned that in Harper’s Canada, it’s “easier for youth to access pot than alcohol or cigarettes.”

The Emerys have a slightly different perspective, to put it mildly. As editors of Cannabis Culture magazine, founders of “Pot TV,” proprietors of head shops and seed stores, and MCs of all manner of pot conventions and trade shows, the power couple lead a subculture that views marijuana not as an unavoidable social sin whose ills must be minimized and controlled with compassionate legislation, but as an undeniably positive product with virtues worth celebrating.

“Marijuana is so good! It does so much for so many!” Marc crows in a 2010 YouTube video (ironically devoted to blasting Justin Trudeau as a “f—cking hypocrite” for backing mandatory jail times for drug traffickers). “It’s brought us everything from music to technology to cutting-edge news services, major athletes, every form of entertainment and science and architecture…”

This stance, that pot consumption is harmless and should be completely destigmatized — if not encouraged — is probably a great deal closer to the views of the Liberal base than their leader’s cautious policy of managing risk without endorsement. Yet Justin’s pragmatism is the result of having enough sense to appreciate that elections are not decided by the base alone, but by the electorate’s broad centre — those much-coveted middle class, suburban swing voters who remain unfashionably inclined to regard mind-altering substances as a destructive force poisoning the culture of their children and neighborhoods. Considering that Justin’s pot stance is already taking a tremendous drubbing in suburban-oriented Conservative attack ads, it’s unclear if there’s any electoral gain to be had by embracing a darling of the stoner set who lacks even a pretence of pragmatism. In a centralized parliamentary system like ours, it only takes a single rogue candidate to upset a leader’s carefully constructed nuance, and Justin — who’s already shown himself more than capable of torpedoing troublesome candidates — is surely asking himself if Jodie’s a risk worth taking.

A similar dilemma defines the Liberal relationship with legalized prostitution.

Mrs. Emery happily endorsed the idea on Sun News yesterday, declaring it a private, consensual business transaction not terribly different (of course) from the private, consensual business of buying pot. This, too, is the sort of proudly permissive position held by much of the Liberals’ ideological base, who prefer to conceptualize the buying and selling of sex as a libertarian thought experiment or abstract goal of sexual liberation. Canadians in the cautious middle, alas, inclined as they are to contemplate problems from a less academic angle, are more likely to fret about whether legalization of prostitution will simply increase its presence in their communities, as legalization of banned things is wont to do.

Though critical of the Harper Government’s John-battling prostitution bill, Trudeau’s party has not embraced the cause of complete legalization, preferring, instead, to hide behind the time-wasting excuse that more research and consideration is needed to reach an informed conclusion. It’s an even more delicate position than his stance on pot, and an even more revealing reflection of his party’s anxieties about being defined by cavalier social policies rather than practical economic ones.

Though it’s easy to dismiss his leadership as entirely frivolous, Justin Trudeau possesses great importance in defining the limits of left-wing social policy at a time when many progressives are inclined to regard “going too far” as the exclusive disorder of the right.

Assuming her candidacy is serious, Jodie Emery will be the canary in the mine.







Differences become starker in North American democracy

Since the differences between Canada and the United States are almost all political, we can learn much from the two countries’ recent deviations in the practice of democracy. Just as Canada’s rulers seem to be consolidating their privileges in an increasingly authoritarian parliamentary system, Americans have witnessed a number of inspiring episodes as of late highlighting the comparatively open nature of their republican institutions.

On June 10, Eric Cantor, the Republican House majority leader, was defeated in the primary election to continue representing his party in Virginia’s 7th district. It was the first time in American history a sitting House majority leader had been defeated in this fashion, and the greatest victory to date of Tea Party insurgents, who had never before unseated a politician of such standing.

Regardless of what one thinks of Cantor, or the right-wing arguments against his credibility as a conservative, the idea that a national party leader could be so easily overthrown simply through populist dissent in his own community says good things about the health of America’s representative democracy. Cantor, it was often said, harbored ambitions of being Speaker of the House someday, yet in the end it was his lack of respect for his present duties as a representative — namely, to represent his community — that ultimately torpedoed his career. He ran an aloof, condescending campaign (most glaringly personified by the fact that he wasn’t even in his state for most of election day) and took it for granted that his status as a national figure insulated him from domestic accountability. And he paid the price.

The opposite was true in Mississippi last week — though only barely. There, six-term Republican senator Thad Cochran kept his party’s loyalty by the narrowest of margins, winning the state GOP renomination in the June 24 runoff by around two percentage points. Though his opponent, Tea Party-backed State Senator Chris McDaniel, has proven something of a sore loser, it’s clear Cochran won simply by playing the game better. In a state that’s nearly 40% African-American, Cochran appealed to the liberal sensibilities of black voters by playing up McDaniel’s harsher flavor of conservatism, and unapologetically embracing that which made him so loathed by Tea Party-types in the first place: his talents at ensuring Mississippians always amply benefitted from Washington contracts and subsidies. Or, as backer and opponent alike were fond of putting it, his ability to “bring home the goodies.”

Neither of these stories would be possible in Canada. In this country, after all, party nominations are not accountable to voters at large, because Canadian political parties are not seen as public utilities within the nation’s democratic system, but as privately-owned entities that operate independently within it — and tolerate only the barest minimum of public participation in their internal affairs.

In the United States, one becomes a party member — and thus an eligible primary voter — simply by declaring himself to be one. In Canada, the privilege must be purchased and continually renewed, and can be withdrawn by party elders at any point for misbehaviour. Despite the fact that most Americans are not generally interested in primary elections, over 65,000 Virginians voted in the Cantor race and 300,000 Mississippians in the Cochran one. By contrast, only 100,000 Canadians voted to make Justin Trudeau leader of a national political party (a participation rate of around 0.4% in a country with 24 million eligible voters). And that was an unprecedented high. Only 1% of Canadians are said to be registered members of political parties, but it’s impossible to know for sure, since the parties tend to be fairly cagey with their membership figures. That same caginess ensures we have no idea how many people are voting in MP nomination races, as Canadian political parties are not required by law to disclose such data to the media or anyone else.

Canadian party elites would no doubt find the fact that Senator Cochran was re-nominated in part with the support of Democrats (as many of his black voters certainly were) thoroughly ghastly, but in “open primary” states like Mississippi, where voters choose for themselves which primary they want to vote in — regardless of their party registration — the principle is that politicians are accountable to the voting public as a whole, rather than one narrow faction of it. Democrats have a right to ensure Republicans don’t get too conservative, and Republicans have an equal right to ensure Democrats don’t get too liberal. If done correctly, the result can be a less polarized, centrist party system in which, even in periods of one-party dominance, the opposition can still exert some influence on outcomes.

That both the Cochran and Cantor results were upset shockers similarly highlights the degree of unpredictability in the American system, the very thing Justin Trudeau is currently waging his merry little jihad against as yesterday’s promises of “open nominations” decay into today’s practice of installing preferred candidates light bulb-like across the country. Doubtless much of the GOP national establishment did not want to see Congressman Cantor go down, yet because the American parties have no authoritarian bosses, there was no one available to pull a Trudeau and insulate him with the leader’s stamp of approval, as J-Tru did in anointing Adam Vaughan and Chrystia Freeland, his nominees of choice in successive Toronto by-elections.

Speaking of authoritarian bosses, last week was also notable for the US Supreme Court’s 9-0 smackdown of President Obama’s attempt to run ’round Congress and unilaterally appoint judges and federal board members without first seeking the Senate’s consent, as required by the constitution. Obama’s defense was that Article II gives him the right to install whomever he wants so long as the Senate’s in “recess,” and thus unable to convene to give his picks scrutiny, but as Justice Ginsburg said during arguments, in the age of jet travel “the Senate is always available.” Obtaining Congressional approval might be a slow and painful process, but so long as Congress claims it’s available to sit and consider presidential nominees — even if that availability consists of minute-long perfunctory sessions during breaks that exist only to signal their own availability — a president who claims there’s just no way to get an up-or-down vote for some guy he wants to stick somewhere is either ignorant or dishonest.

It was a fascinating clash of all three branches, in which the Court upheld the legislature’s right to scrutinize executive branch appointments as one of the fundamental principles of American democracy — a principle, once again, entirely unknown in Canada, where prime ministers just happily install whomever they please. In the US, unqualified Supreme Court judges get vetoed through Senate confirmation hearings. In Canada, they get expelled by the Court itself — but only after being inaugurated and having collected several months’ pay, as was the case with Justice Nadon. In America, cabinet ministers are expected to be eminently qualified for their positions — because the Senate reads their resumes line-by-line. In Canada, you simply wake up one morning and find Peter MacKay is attorney general for some reason.

Whatever polite pretences Canadians are given for the absence of truly open nomination races and greater scrutiny of prime ministerial appointments — stability, protection from extremism, anti-American contrarianism — the Occam’s razor explanation is clear enough: the elites at the top of the Canadian political pyramid simply don’t want their absolute powers diluted by a lot of fussy checks and balances. Ours is a system in which bottom-up input on important decisions — either from the people’s representatives or the people directly — is a force to be feared and distrusted.

Over the last few weeks, Americans have been reminded that despite its many flaws, their constitutional system is still one that provides considerable safeguards to ensure the little guy can triumph over the big. In Canada, the big guys simply trample — and we’re supposed to be grateful for the privilege.

 



