Ferguson Biases

The Ferguson story is complicated, but its reception is not. We do not know how or why or under what circumstances young Michael Brown was killed by officer Darren Wilson, but we do know, in astonishingly precise detail, how we feel about it.

To say the Ferguson tragedy is being “politicized” isn’t entirely accurate, since (as of now) few figures of high politics have attempted to exploit its heated aftermath for partisan gain — with the possible exception of the ever-scrounging Rand Paul. But most analysis of the episode has certainly been reflective of American political culture, and the various interests and assumptions that increasingly define it.

This includes:

  • The maintenance of perpetually adversarial relations among racial groups through endless stoking of fear, suspicion and grievance;
  • The dismissal of much of established authority as illegitimate, with power that derives solely through force and oppression, usually exercised with extreme prejudice;
  • The inflation of shameful incidents into symptoms of larger “disturbing trends” to shame and frighten the public into a state of moral panic;
  • Conspiracy theories that “the media” works as a unified force to actively spread misinformation and sire hate;
  • Notions that “justice” is not a neutral evidence-based process, but a retributive weapon to be wielded against those we believe to have done wrong.

Such attitudes are the essence of Colbertian “truthiness” — the idea that realities about the world should derive from plausibility, rather than fact — and if left to fester, will rot any civilization in which they’re tolerated. That what happened to Michael Brown is undeniably a tragedy does not excuse doubling down on destructive ideological clichés any more than it excuses rioting or preposterous police militarization.

America, quite simply, does not have a police-murdering-innocent-people problem. It may well have a disproportionate arrest of African-Americans problem (an at least partial outgrowth of its disproportionate number of crimes committed by African-Americans problem) and even a police brutality problem, but to argue Michael Brown’s death was in any way representative of some greater phenomenon — as many activists are doing — is to ignore precisely what made his killing newsworthy in the first place. Namely, its rarity.

As NPR reported last month, of the over 98 million arrests made in the United States over the last seven years, around 0.005% resulted in what the Justice Department deemed a “homicide.” Even if 100% of all such victims were black, which they obviously are not, it remains an occurrence so freakishly obscure it resists much use as a valuable statistic of larger social trends. Larger social trends, incidentally, reveal massively declining black youth arrest rates.

Likewise, though the #ifiwasgunneddown hashtag has prompted much sympathetic nodding of heads, there’s equally scant evidence the “press” as some collective entity plots to present the public image of black victims of violence in the worst possible light. On the contrary, in the Trayvon Martin case, the greater controversy was over the media not portraying him thuggishly enough, and instead erring on the side of progressive political correctness by using an outdated, innocuous photo.

It should also be said — and it’s sad that this is even an insightful thing to note these days — no one really knows what happened to Michael Brown. As I write this, the Chief of the Ferguson police department has just released security cam footage of Michael Brown minutes before his death, in which he can be seen stealing a box of cigars from a convenience store and roughing up its clerk. This is of course still miles away from an acceptable justification for killing an unarmed youth, but it does remind us that these sorts of stories often involve a lot of moral gray, with facts that refuse to neatly conform to the good vs. evil morality play many desire.

A peaceful society is born from one whose citizens treat each other with tolerance and compassion, and where the disciplinary powers of state authority are exercised with sensitivity and restraint. But so too does peace require a populace whose reality is shaped by observation and fact, not hostile suspicion and pre-determined conclusions.

To cling without proof to assumptions of the most dark and cynical sort is to make ourselves not merely intellectual prisoners of our worst fears and biases, but architects of the very world we claim to want to avoid.


Trudeau’s Promised Extremism

Whether a politician can harm himself veering too far to the social left is not a question we’re used to contemplating, destructive extremism on issues like abortion and Islam being traditionally understood as dysfunctions of the right. Yet if Canada’s Conservatives have their way, the dangers of unchecked social progressivism will be one of the defining themes of the country’s 2015 general election, which will pit the square, bourgeois sensibilities of Prime Minister Stephen Harper against the dramatic permissiveness of Liberal leader Justin Trudeau.

Young and peppy, Trudeau can make his energetic spontaneity an obvious asset, but it has also resulted in several of his substantive policy positions having their roots in flippant remarks. At a Vancouver barbecue around this time last year, Trudeau offered an impromptu declaration that he was in favor of legalized marijuana, and the statement has been the stuff of endless Tory attack ads ever since.

His opponents’ glee is warranted. A recent Ipsos poll commissioned by the Justice Department revealed Canadians’ views on pot have yet to reach consensus. While legalization remains the plurality preference, with around 37% in favor, 33% back the milder course of merely decriminalizing the drug — essentially the de facto reality today — while about a quarter want things to either explicitly stay the same or for punishments to get harsher.

Such numbers expose a lingering apprehension about the proper place for pot in our society, yet Trudeau’s position is indifferent to subtlety. By the standards of the status quo, it is, in fact, an extreme position, and if Justin is embarrassed that it’s drawing the loud endorsement of other extremists — like flamboyant pot advocates Marc and Jodie Emery — he has only himself to blame. Post-hoc attempts at moderation, couched in assurances that his real marijuana agenda is simply to “regulate and control” a problematic substance — whose use, he is now quick to scold, has actually increased under Harper’s rule — merely carry an odor of insecurity.

The pot numbers bring to mind a similar recent poll on abortion, which revealed no less public division. 13% believe the procedure should be banned in all or virtually all circumstances, 24% want it very tightly limited, while 52% want no legal restrictions whatsoever. In the face of such nuance, Trudeau again wields a blunt weapon — a complete ban on pro-life candidates within his party. Even in a country as famously anxious about “re-opening the abortion debate” as Canada, the extreme premise that pro-life politicians should simply not exist generates upwards of 70% disapproval. The Harper Tories, whose believe-anything/do-nothing abortion stance has historically delighted no one, suddenly appear thoughtful and pragmatic.

Nuance was equally absent in Trudeau’s response to last week’s big mosque visit brouhaha, in which Sun News — the conservative network I work for — aired a series of stories noting that in 2011 Justin campaigned at a radical Montreal mosque with al-Qaeda ties.

The Conservatives attacked, but questions were quickly raised. The fact that the mosque had served as a terrorist recruiting post was not widely known until the New York Times broke the story in April of 2011 — a full month after Trudeau’s visit. And even then, it was claimed the mosque’s most vigorous phase of al-Qaeda recruiting occurred during the 1990s. Critics cried cheap shot.

But just as the cover-up is often worse than the crime, it was Trudeau’s excuse that proved worse than the gaffe. Asked to justify his decision to stage a partisan event at the only North American mosque on an official Pentagon watch list, Justin was without apology, and could only chauvinistically sniff that “the US is known to make mistakes from time to time.” On the charge the place was a hotbed of extremism, he smirked that respecting diversity means “you don’t just speak to people who agree with you,” adding “I’m somewhat different from Mr. Harper in that measure.”

The Tories immediately drew analogies to the Liberal boss’ similarly blasé, politically-correct reaction to the Boston Marathon bombings, but the comments were troubling for reasons beyond national security. Regardless of what terror ties existed at the time of his visit, hateful religious fundamentalism of even the non-violent variety should surely test the limits of what any self-respecting progressive politician is prepared to defend in the name of “diversity.”

In winning three back-to-back elections, the Harper Tories have displayed great skill appealing to voter inclinations usually ignored by traditional mythologies of “progressive Canada” — deep apprehension about wild swings in social policy, and even deeper skepticism towards those who equivocate on matters of crime and wickedness in the name of enlightened thinking.

To Harper’s caution and forbearance, Trudeau’s radicalism offers voters an unprecedented alternative. But it’s probably unprecedented for a reason.


Romney 2016

For a land of supposedly endless second chances, it’s striking to notice the viciously unforgiving nature of one of America’s most entrenched political traditions: if you lead your political party to defeat in a presidential election, you never get another try.

Slow news summer that it is, one of the manufactured controversies of the moment is whether Mitt Romney could rise from the ashes of his failed 2012 presidential run to emerge a credible contender in 2016. Former advisors and donors seem to be doing their best to fan the flames, writing editorials, organizing petitions, and leaking choice quotes to anyone who will listen that their guy still has one more fight in him, if only we’d give him a push.

Sounds great to me. Regardless of who deserves the blame, much of what Romney predicted a second term of Obama would bring has been brought, from the post-pullout collapse of Iraqi security to ongoing Obamacare woes to no-end-in-sight dithering on the Keystone pipeline. With 53% of Americans said to want a do-over of 2012, simply announcing “you still can” seems like a pretty compelling pitch at this point. As Allahpundit observes, the I-told-you-so campaign ads “write themselves.”

Yet aside from the man’s personal distaste for another long, expensive slog through the campaign muck (a slog, as the documentary Mitt vividly illustrates, the candidate himself wasn’t even particularly giddy about the last go-round), the lead obstacle to such a supremely rational Republican strategy seems to be American political culture’s entrenched stigmatization of failed presidential nominees.

Where this tradition came from is hard to say. American history is rife with presidents seeking non-consecutive terms after a loss four years earlier, including Martin Van Buren, Teddy Roosevelt, and Grover Cleveland — who actually won one. In more recent times, we all know the story of Richard Nixon losing to JFK in 1960 only to win renomination — and the White House — in 1968. And obviously there’s scant taboo against seeking a party’s presidential nomination more than once; the vast majority of post-war presidents and vice presidents, in fact, have launched at least one unsuccessful primary bid for their party’s nod prior to getting it.

My own theory is that a series of devastating losses for a succession of weak presidential nominees in the late 20th century — Mondale in ’84, Dukakis in ’88, Dole in ’96 — helped solidify a trope of the “presidential loser” (portrayed hilariously in this Futurama scene), in which failed candidates are just fundamentally pathetic, hapless characters.

This runs contrary to the political culture of most other western democracies, in which losers can, and do, lead their parties to multiple defeats before eventually eking out a win.

Canada’s Stephen Harper lost his first bid for prime minister in the country’s 2004 general election, but retained his party’s support to make a second go in 2006, where he won. The new conservative prime minister of Australia, Tony Abbott, similarly lost his first bid for power in 2010 before winning in 2013. Ditto for Israel’s Benjamin Netanyahu, who was his party’s candidate for prime minister five times, but won only three.

The advantage of multiple runs is obvious: a familiar face means less time is spent campaigning on biography and resume — the dreaded “introducing yourself to voters” — and more time on issues. Voters might be sick of you, sure, but that fatigue is not without strategic benefit: having “heard it all before” applies to insults as well as slogans. Just as criticisms about President Obama’s socialism and hidden agendas seemed stale in 2012, so too would the tired tropes of Romney as an out-of-touch aristocrat be pretty boring in 2016. Indeed, Romney could take particular comfort from the fact that many of the overseas conservative leaders cited above were deemed “too right-wing” during their first run only to have that charge seem considerably less frightening after another couple of years with the progressives in charge.

It’s possible the GOP could do better than Romney in 2016, but it’s equally likely they could do a lot worse, too. He’s certainly a man with strategic and ideological flaws worth considering, but the fact that he ran and lost four years ago shouldn’t be one of them.



Iraq’s Parliamentary Problem

Recent coverage of Iraq’s internal breakdown has focused mostly on the rampaging horror of ISIL, and rightfully so. But the comparatively drier story of the political decay of Prime Minister Nuri al-Maliki is a tale inseparably linked to that same violence — or at the very least, the American response to it.

In his recent New York Times interview, President Obama specifically linked his restrained bombing campaign of select ISIL targets with a desire to keep Maliki weak and unpopular. He was not going to use American power to “bail out” a flailing government, he said, noting that the United States will not be a firm ally of any prime minister until they prove they’re “willing and ready to try and maintain a unified Iraqi government that is based on compromise.”

Understanding the inability of the Iraqi political class to fulfill this demand is a story of the failure of Iraq’s parliamentary political institutions.

The post-Saddam Iraqi constitution gave the country a parliamentary system moulded in traditional European fashion. It featured a party-list based electoral system, a figurehead president appointed by parliament, and an executive prime minister selected from among the factions of the legislature.

In 2005, the year of Iraq’s first general election, a formal alliance of Shiite parties, led by Dawa, an Iranian-backed ex-terrorist group, won a strong plurality of seats, and after months of negotiations with Kurdish and Sunni parties — whose votes were needed for an outright majority — Dawa deputy leader Nuri al-Maliki was confirmed as prime minister (the party’s actual boss, Ibrahim al-Jaafari, having been deemed too religiously dogmatic).

In elections five years later, Maliki’s Shiite coalition narrowly lost its plurality to the secular, pro-western party of longtime Bush administration darling Ayad Allawi. Yet Maliki was able to stay prime minister by forging a parliamentary alliance with a smaller, more extreme Shiite faction led by a clique of fundamentalist clerics including the now long-forgotten Moktada al-Sadr. This was controversial at the time, but it was consistent with the generally understood parliamentary custom that the incumbent PM should get first crack at forming a coalition government post-election — a precedent ultimately upheld by the Iraqi courts.

Though he had originally come to power with multi-denominational backing, the longer Maliki remained in power, the more brazenly sectarian his government became. This was largely a byproduct of his country’s worsening Sunni-Shiite civil war. A life-long Shiite partisan, Maliki had few qualms about using his position as commander-in-chief to deploy grossly disproportionate violence to crush suspected hotbeds of Sunni extremism (emphasis on suspected) or purge suspicious Sunnis from senior positions in the military, intelligence service, bureaucracy, and cabinet.

Those who expected this dark legacy of division, bloodshed, and favoritism to eventually be rejected by voters were shocked when Maliki’s coalition was able to regain its parliamentary plurality during elections held in April of this year. The Obama administration seemed particularly crestfallen.

Yet good news of a sort arrived this weekend when a fresh procedural bombshell was dropped — word came that Iraq’s president had asked Dawa’s deputy leader, Haider al-Abadi, to assume the prime ministership in Maliki’s place.

Under the terms of the Iraqi constitution, this was within the president’s prerogative — like a constitutional monarch, the Iraqi president is supposed to formally summon the leader of parliament’s “largest bloc” to assemble a government, with Article 76(iii) granting him the additional power to nominate someone else if the initial nominee is unable to get things together within 45 days.

But Maliki had not formally passed that deadline. Despite the fact that this most recent election was held over three months ago, the countdown for assembling a government does not begin until election results are ratified and parliament formally appoints a president — which only happened on July 24. Maliki is also quite indisputably still leader of parliament’s “largest bloc”; members of his coalition have denounced the president’s alternative guy as representing “no one but himself.” Maliki, for his part, has dubbed the whole thing a “coup,” and some are predicting the constitutional standoff may result in a complete collapse of Iraqi political authority at the moment the country needs it most.

It is, of course, naive to blame any country’s political dysfunction entirely on the system of government they use. Yet it’s hard to deny Iraq’s preexisting political problems have likely been exacerbated by the country’s decision to adopt a complex, European-style parliamentary model, with a proportional representation electoral system that incentivizes politicians who appeal to trans-geographic religious identities and an executive branch that produces rulers who owe their power to a mastery of parliamentary maneuvering, rather than broad-based popular approval.

Had Iraq instead chosen to adopt a blunter presidential system — with a strong executive president elected by multiple-round popular vote and a separately-elected parliament from which ministers could be chosen — many of the country’s problems would doubtless still exist, and possibly even some new ones. Yet the fundamental question of who gets to rule the country would have been far less ambiguous and contestable, and the creation of a unity cabinet much faster and easier.

If Iraq’s political authority does completely break down in coming weeks, the temptation will be strong to insist its people were “never ready” for democracy, and declare the experiment failed. Yet democracy comes in many flavors and the taste Iraqis were given was a decidedly acquired one.

Unfortunately it’s probably too late to try another.



The media mess blame game

There was a clever cartoon in the San Francisco Chronicle some years ago, drawn by the hilarious (and unjustifiably obscure) Don Asmussen. It depicted a newspaper blaring the timely headline: “MEDIA SHOCKED BY DECLINE OF MEDIA — ‘IS THIS THE END OF MEDIA?’ ASKS MEDIA.”

The slow decay of mainstream journalism into a decrepit, profit-hemorrhaging husk is supposed to be one of the great tragedies of our time, and it has provoked media people into producing no shortage of opinions, theories and — most importantly — blame to fling around. That the media may not offer the most objective analysis of this question seems rarely contemplated.

I recently listened to an episode of the Canadaland podcast — which offers weekly media-asks-media analysis of this country’s crumbling journalism scene — about the fall of a short-lived Toronto weekly known as The Grid. The magazine, we were told, “did everything right” but still flopped financially. We were told this in interviews with writers and editors who used to work there, who of course were thoroughly convinced of their own brilliance and competence. There was zero conversation with anyone representing the public, which was a tad odd, as the magazine’s financial failings were explicitly blamed on unprofitable advertising, which presumably indicates at least some trouble with audience engagement. Instead, fingers were pointed at the traditional hazy devils: management, technology, “trends.”

There is a legitimate concern that journalists are creating what Marxist-types would call a class ideology: a collection of defenses for self-interested behavior disguised in the language of morality. The idea that the stories that matter the most are the stories the reporter subculture most enjoys reporting on, for instance. Or that journalistic ethics should be forever defined by whatever standards are being used right now.

Having increasingly little power to justify, these ideological tropes now merely constrain journalists’ ability to accurately diagnose their own plight, and dream up viable cures.

John Oliver’s recent viral rant against “Native Advertising” was revealing. At precisely the time folks like The Grid team are bemoaning an advertising-based revenue model that’s failing to deliver the goods, tastemakers of the Official Ideology are waging a furious propaganda war against incredibly lucrative new techniques.

Native advertising is basically just a 2.0 name for “advertorial” content, or an advertisement that takes the form of semi-disguised written copy. It’s not a terribly new practice, nor is it particularly sinister. Two of Oliver’s most horrified examples were a Buzzfeed article about cleaning technology written by Swiffer and a New York Times piece on women’s prisons by the Orange is the New Black people. Yet such mildness is nevertheless denounced as representing a profound existential threat to all that’s right and principled about the journalist’s craft, making those who collaborate “whores” or worse. (One wonders if there were similar conniptions the first time someone suggested printing advertisements in newspapers at all.)

But if media pride has atrophied media skill at what Orwell dubbed the “constant struggle” of seeing what’s before one’s nose, there appears an equally powerful impulse on the part of consumers to abdicate responsibility as well, through lazy populist righteousness that’s no less ideologically destructive.

My friend Graham wrote a fine essay on Medium the other day lambasting an entitled and hypocritical reader class who constantly demand quality journalism, yet consistently resist purchasing online subscriptions, and indeed, go one step further and install ad-blockers to prevent themselves from even inadvertently providing the necessary revenue to finance this want. Graham chalks this up to brazen cognitive dissonance, but I’d also blame the convenient myth of a biased, superficial media that gets trotted out every so often to justify consumer apathy. The chart by I Fucking Love Science, for instance, which purports to show all the stuff “the media doesn’t cover,” is quite obviously straw-man nonsense, yet such ritualistic denunciations of a supposedly “celebrity obsessed” press etc., provide a necessary veneer of principle to an otherwise entirely selfish abdication of public responsibility.

As is the case with most troubling societal trends, I’m convinced our current media troubles are mostly cultural at root, and demand cultural solutions.

At the very least, self-flattery will get us nowhere.


Impeachment Horror


I recently finished reading Jeffrey Toobin’s A Vast Conspiracy, an epic 448-page chronicle of the Monica Lewinsky scandal, from its earliest beginnings as an obscure sexual harassment lawsuit in Arkansas to the second-ever impeachment of an American president. My interest was sparked by Monica’s recent and very thoughtful essay in Vanity Fair, which brought her decades-old story back into public conversation. The tale’s only become more timely since, now that talk of presidential impeachment (spurious or not) has reentered the headlines.

It seems the minute a president enters his second term, partisan foes begin to chatter about whether he’s impeach-worthy. It’s a sentiment born partially from frustrated resentment (no one likes to lose twice to the same guy), partially from opportunism (the Congressional opposition almost always gains seats during a president’s first term), and partially from the White House itself, for whom rallying against an “impeachment obsessed” opposition can be of great material benefit.

So present rumblings over the possible impeachment of President Obama will probably only get louder in coming months. What lessons can today’s giddy Republicans learn from their predecessors’ failure?

First: have a clear-cut, impeachable offense.

It was never entirely clear why Clinton was being impeached, which allowed accusations it was “all about politics” or “all about sex” to fill the ambiguity.

Republicans furiously believed the Clinton White House was hopelessly corrupt, and Clinton himself embarrassing and immoral, yet they ultimately chose to impeach him for two incredibly narrow, legal offenses: lying to a grand jury about his affair with Monica during his deposition in the Paula Jones harassment suit, and obstructing justice by conspiring with Monica in various ways to ensure her corroborating silence.

Constitutional scholars generally agree that presidents can be impeached for just about anything, with the constitution’s vague criteria of “high crimes and misdemeanors” defined through centuries of English precedent to mean, in the famously glib words of Gerald Ford, “whatever a majority of the House of Representatives considers it to be.” Yet Toobin argues the 1990s heralded an era in which the judicial system “took over the political system” and it became received wisdom that political battles should be fought through lawsuits and litigation rather than traditional constitutional mechanisms. Republicans thus decided to impeach Clinton on the grounds he was a petty criminal, as opposed to simply unfit for office.

Second: have the numbers.

In contrast to the impeachment proceedings against Richard Nixon, which enjoyed some semblance of bipartisan support before his resignation, every Congressional vote in the long slog to remove Bill Clinton was almost perfectly party-line.

This rank partisanship doomed Clinton’s impeachment from the get-go. Since the final vote in the process — the one that actually expels the president from office — requires a two-thirds majority in the Senate, even the GOP’s healthy majority in both chambers was not sufficient. Some Democrats had to get on board, but because Clinton’s impeachment was perceived as a hysterically ideological Republican plot (a “vast conspiracy,” if you will), none ever did. This was a direct byproduct of problem number one; because the formal argument for impeachment was confused and weak, it remained powerfully unpersuasive to the other side.

Third: have public support.

Perhaps the most famous factoid of the Clinton impeachment is that the President’s approval numbers actually went up during it. Such sympathy appears even more justified in retrospect; the “peace and prosperity” of the 90s remains enviable, and Clinton’s competence as an administrator, whatever his faults as a man, contrasts sharply with that of his successors.

Had the Republicans upheld the Founders’ intent, and sought to remove Clinton on the subjective, but entirely legitimate grounds that he was too crooked, unethical, and undignified to be president — as embodied not just by the Monica affair, but Whitewater, Travelgate, the Lincoln Bedroom and whatever else — it’s possible their crusade would have seemed a tad more reasonable. But it would have still failed anyway, simply because the American public did not share this conclusion, and Congress knew it.

President Obama is vastly less popular than Clinton, with large percentages believing he’s behaved improperly in a number of high-profile situations. Yet support for impeaching him sits at a dismal 33%, with estimates suggesting backers are around 90% Republican. And of course even in their best-case 2015 scenario, no one thinks the GOP will be holding two-thirds of the Senate any time soon.

The lasting legacy of the Clinton impeachment was the delegitimization of impeachment in general, and to the extent the episode was a gigantic waste of time perhaps that’s fair. Yet at its core, impeachment is simply a constitutional device for removing an unacceptable ruler, so it’s hard to argue the democratic interest is well-served by perpetuating this cultural stigma.

Even if the answer is no, it remains a proposition worth occasionally proposing.


Fresh battle lines being drawn in America’s culture wars

The culture war is dead. Long live the culture war.

It’s fashionable to observe that many of the most contentious social policy cleavages of the 1980s — when America’s “culture war” meme first went mainstream — are now the stuff of broad consensus.

Debates on the appropriate presence of public prayer have concluded in the minimalists’ favor. Universal legalization of same-sex marriage is perhaps a year away. Conservatives made peace with unwed motherhood to double down on abortion and paid for the strategic blunder — a large majority remains in favor of keeping the procedure legal in “all” or “certain” circumstances.

Yet as the previous decades’ debates wind down, fresh moral quandaries about the standards and values of American life emerge. Finding harmonious answers to these new dilemmas of tolerance, identity, and individualism will be a defining struggle of the millennial generation.

For as long as we’ve been fretting about discrimination it’s been said that one man’s innocent quip is another man’s slur. As overt bigotry becomes increasingly rare, it’s the innocent quips that now receive the hottest fire, zealousness having not moderated in conjunction with the lowering of stakes. Today’s tolerance activists speak of “microaggressions,” small indignities of language and manners, such as asking a visible minority where he “came from” or suggesting a woman carry the lighter box. Movies and television shows are now meticulously scrutinized using standards like the “Bechdel Test” to ensure females and minorities are portrayed in the most flattering, self-actualized fashion, with an equally fussy eye cast towards nouveau sins like “exoticism” and “othering.” Even perfectly post-modern public figures like Stephen Colbert and RuPaul have proven but a mere ill-utterance away from triggering shame campaigns from a self-appointed vanguard of “what’s not okay.”

This goal of a zero-tolerance culture is invariably at odds with unsuppressed freedom of expression and that which it produces: honest commentary, diverse storytelling, insightful humour, complexity of language and thought. The critics seek to restore the legitimacy of censorship, or at least self-censorship, in which an extra layer of nervous second-guessing must be applied to all intellectual and creative output in order to make ideas subordinate to a ruling ideology capable of punishing dissent.

Struggles over acceptable means of restraining intolerance are in many ways the outgrowth of a larger philosophical split over the nature of identity, and the privileges a citizen should claim through self-applied labels and group affiliation.

Transgender Americans have found success lobbying for legal inclusion as a class protected from open discrimination. Yet anxieties remain over the community’s existential thesis that gender itself is inherently fluid and subjective — an assertion for which the science is hardly settled, yet was recently decreed official fact by the school board of Vancouver. Today we have autistics who oppose efforts to cure their condition on the basis they were merely born different, not ill, a resistance to the “medicalization” of identity shared by an increasing assortment of unpopular demographics, including the obese (who dismiss BMI measurements as quackery), serious drug users (who euphemise their “abuse” as simply “misuse”), and schizophrenics (who self-identify as “multiples”).

Homosexuality was of course once considered a mental illness. That’s no longer the case today, yet whatever biological variables do explain the phenomenon remain ambiguous, and many are not eager to see things clarified. A faction that values the cultivation and preservation of diverse identities and stigmatizes efforts to assimilate or “fix” deviant behavior is destined to clash with those seeking definitive scientific explanations for life’s mysteries.

A similar cleavage divides those who fetishize the total supremacy of the individual against those who worry about behavior’s societal consequences. A friend of mine recently wrote an essay fretting about the sadomasochism renaissance prompted by Fifty Shades of Grey; he worried that a culture elevating individual “consent” to its highest good will be one thoughtlessly normalizing behavior that’s socially destructive in a broader sense — in this case, violence against women. Increasingly loud proposals to normalize other historic taboos — recreational drug use, prostitution, violent video games, child pornography — spawn similar concern. The anti-individualists ask at what point the pursuit of “harmless” personal pleasure corrupts the virtues of the larger society these individuals comprise. The individualist-supremacists flatly deny the possibility.

None of these tensions are particularly new, but the battlegrounds are decidedly 21st century. As opinions congeal around fresh struggles to balance choice, evidence, identity, and opinion, old understandings of the political divide are overthrown. They pose a particular threat to the progressive left, which may be doomed to split into warring post-modern and libertarian factions.

Divisive, ideological, and often personally threatening, the culture wars of the future will not be pleasant. But few wars are.


Lack of Pride

“When is Pride this year?” straight friends ask, voices rising with equal parts excitement and condescension. As a gay, I’m presumed to be well-versed in such things, but alas, there’s no easy answer. There exists no single “Pride,” after all, simply a long sequence of independent extravaganzas across North America, each observed on conveniently different dates.

Los Angeles Pride happened long ago, for instance, on the traditional first weekend in June. San Francisco Pride — the big one — has come and gone as well, occurring, as it does, on the last weekend of that same month. Vancouver Pride kicks off this Saturday, and Vegas Pride comes a month after that, in early September. Countless other cities are sprinkled somewhere in-between.

Such cleverly staggered scheduling, which allows the don’t-stop-the-music set to engage in summer-long “Pride tours” across the continent, hopefully helps illustrate the fundamental vacuousness of this would-be holiday. It’s certainly one of many variables justifying my profound disinterest in it. Despite being gay for as long as I can recall, I not only shun Pride, I actively resent the implication that attendance offers any meaningful indication of one’s LGBT acceptance.

I’ve always felt a bit of sympathy for Rob Ford’s various mumbled explanations of why he’s never attended Toronto Pride during his four years as mayor. His no-show status has of course been widely taken as proof of his supposed homophobia, but his official excuse — that he’s simply an old-fashioned guy who finds the garish flaunting of sexuality uncomfortable — seems perfectly reasonable. To modern elite opinion-makers, however, who have done so much to inflate Pride as the culture’s leading litmus test of tolerance, personal uneasiness is a sentiment so exotic it may as well be uttered in Swahili.

While I’m no prude — actually, strike that, I am a prude. And what of it? Flipping through online albums of Toronto Pride 2013, one finds ample documentation of S&M bondage couples, barely-there thongs, buttless chaps, and all manner of grinding, thrusting, jiggling, and twerking. It’s perfectly acceptable to find such things gross or distasteful, and an exploitive cheapening of both sex and the body.

It is no great character flaw to value modesty or dignity, nor is it bigoted to esteem forbearance and control. Libertine attitudes towards sex, nudity, fetishism, and exhibitionism are issues entirely disconnected from the civil rights matter of whether peoples of divergent sexual orientations are deserving of the same rights and protections as those in the majority. To argue the contrary is to claim possessing a minority sexual preference should be synonymous with sexual deviancy in general — a premise not only dated, but dangerous.

There was a clever Onion piece published more than a decade ago (I doubt such a thing would be written in this more sensitive age) headlined “Local Pride Parade Sets Mainstream Acceptance Of Gays Back 50 Years.”

“I thought the stereotype of homosexuals as hedonistic, sex-crazed deviants was just a destructive myth,” the paper quoted one horrified onlooker as saying. “Boy, oh, boy, was I wrong.”

Sounds about right. Indeed, one has to wonder just how much comfort Pride is even attempting to offer the genuinely sexually conflicted at this point. Considering how much “coming out” anxiety tends to center around fears of lost normalcy, it’s not clear at all how declaring common cause with society’s most brazen display of freakshow non-conformity is a useful means to that end.

Looking at photos of North America’s earliest Pride parades is a window into a different world. The marchers of those days, calmly holding hands with their same-sex partners in sensible polo shirts and penny loafers, were certainly subversive, but only to the extent they were seeking to remind a society in denial of the unavoidability of their existence, and the bland, non-threatening nature of it. Theirs was a call for inclusion in the most literal sense: the welcoming of homosexuals into society’s most central institutions — family, work, religion, politics — and the acceptance of their love as no less valid than any other sort.

That goal having now largely been achieved, the Pride movement, like so much of the modern Gay Rights activist complex, has become a victim of its own success. As North Americans get used to people being here and queer, the moderate LGBT middle class has drifted away from leadership of the tolerance movement, allowing the wild fringe to fill the void. What results is a historical irony: just as society is most eager to assert its tolerance, Pride redefines the deal. Endorsing the acceptance of ordinary people distinguishable only by what gender they love now demands an additional stamp of approval for all-purpose indecency and licentiousness.

Politicians, corporations, and all manner of interest groups clamor to agree to the terms. But for an increasing lot of gays, it’s hardly obvious why we should care.


Indefensible Hamas


There are plenty of perfectly good criticisms to be leveled against the State of Israel. Personally, I’m quite troubled by the so-called “demographic time bomb” theory, which posits that Israel’s increasing Arab and Palestinian birthrates ultimately doom the Jewish nation to embrace some ugly form of minority-rule. And of course we’re all well-versed in the gross spectacle of settler expansion into the West Bank, a brazen effort at colonial growth at exactly the moment the Palestinian territories are supposed to be inching towards independence.

Yet the mere existence of Israeli sin should not blind anyone to the greater evils of its enemies.

This is the sort of blunt moral judgment that’s been traditionally uncouth among fashionable western progressives, who often feel the need to affect great open-minded exasperation at the Israeli-Palestinian conflict, bemoaning that “fault exists on both sides.” Such is the default position of those ideologically inclined to regard assertive side-taking as a symptom of an unsophisticated mind, with “blind” support of Israel in particular a worrying proxy for some other form of close-minded ignorance — Millennialist Christianity, perhaps.

Yet in the wake of the current war between the Israeli government and the Islamic Resistance Movement — better known as Hamas — that’s running the Gaza Strip, even the traditional progressive skepticism seems to be breaking down. As Israel’s Palestinian resisters become more nihilistic and radical at precisely the time the Israelis are getting more sensitive and cautious, the lopsided moral imbalance is becoming harder to ignore.

The traditional Israel-bashers are certainly looking more pathetic than usual. The buffoonish United Nations Human Rights Council drew up a monstrously biased report on the Gaza war the other day, which predictably sailed to approval on the votes of the various third world dictatorships that comprise the body’s largest bloc. Yet it was telling that no nation resembling a first world democracy could be persuaded to support it. Of the 17 abstentions, almost all noted with concern that the Council’s chronology of the conflict was a bit one-sided, to put it lightly. The brusque four-page report does not include the word “Hamas” once, and instead speaks only of Israeli aggressors inflicting “widespread, systematic and gross violations of international human rights and fundamental freedoms” against the hapless peoples of “Occupied Palestine.”

Nowhere was it mentioned that the Gaza Strip actually ceased to be occupied back in 2005, as the late Ariel Sharon painfully extracted every remaining Jewish settler and soldier from the territory.

Nowhere was it mentioned that Hamas explicitly pledges to “obliterate” the state of Israel in their founding charter — “by Jihad,” in fact.

Nowhere was it mentioned that Hamas leaders have long spoken of “Jews” in the most generic terms as their enemy, and that their preferred military tactic in the current conflict — lobbing over 2,500 missiles into major population centres — has made urban Israelis the war’s true civilian targets.

Nowhere was it mentioned that Hamas has transported weapons in ambulances, housed missiles in schools, mosques, and hospitals, and disguised their fighters in Israeli uniforms — all clear violations of the codified laws of war.

Nowhere was it mentioned that the Israelis have so far discovered over 30 multi-million dollar “terror tunnels” spiraling out of Gaza (built in part with alleged child labor) that serve no purpose other than to turn western Palestine into a launchpad for guerrilla aggression against its neighbor.

Nowhere was it mentioned that just a few days prior, Hamas refused a comprehensive ceasefire backed by basically everyone who matters: the Egyptian government, the Arab League, the United Nations, the EU — even old man Obama, if anyone still cares about him.

Nor, for that matter, did the report mention the exceedingly cautious conduct of the Israeli forces in what they’re calling “Operation Protective Edge,” a reputation-conscious nervousness so thoroughly unprecedented in modern warfare it’s almost certainly harmed national security.

While Israeli civilians have been largely protected from Hamas rockets by the country’s awesome Iron Dome missile defense system, Palestinian civilians are protected by an Israeli shield of their own: an elaborate system of advanced warnings to residents of Gazan neighborhoods targeted for bombing. The system includes text messages, personalized phone calls, noisemaking “dummy bombs” (so-called “roof knocking”), and even airdropped maps steering civilians to refugee centres. Such has been the IDF’s painstaking effort to minimize casualties while attacking one of the most densely-packed places on earth, yet Hamas has ensured the Palestinian death toll has remained high anyway, glibly encouraging Gazans to dismiss Israeli warnings as “psychological warfare.”

Prime Minister Netanyahu took some flak for noticing that last bit, concluding on American television that Hamas seems to enjoy the existence of “telegenically dead Palestinians.” Yet it’s an indictment that’s difficult to avoid given how effective the conflict’s 570 Gazan victims have proven in forming a narrative of “disproportionate death” — the only argument Hamas can peddle for foreign sympathy. In any case, surely a group cynical enough to engage in talks with North Korea to replenish their depleted missile supply would hardly balk at the indignity of ratcheting up its own body count for propaganda purposes.

A dispassionate analysis of facts like these — facts which are not the result of clever cherry-picking on my end, but simple observations about the broad character of the Gaza conflict to date — cannot help but lead to a simple conclusion: Israel is better than Hamas.

To conclude this isn’t to posit that Israel, and the current Israeli government in particular, is without failing in other contexts, nor to even make a value judgment about the broader merits of Zionism, if you’re still a skeptic. It’s simply to note that what we have right now is a secular, liberal democracy fighting the aggressions of a lunatic death cult that seized power in a military coup and is actively loathed by the long-suffering captives it purports to rule, with tendentious conduct resulting.

Whether or not that’s an accurate summary of the Palestinian-Israeli conflict in general, it’s certainly true of this one.

It demands an appropriate reception.


Power Suit


What makes the American model of government superior to most others is its elaborate web of checks and balances. Like a Möbius strip, the chart of American government depicts three branches each extending an arrow of oversight towards the other two, creating a tightly interlocking network of watchmen being watched. No matter what one branch does, the others always have avenues of recourse.

On paper, at least. In practice, alas, not all checks are equally balanced.

While no one disputes the blunt effectiveness of a president vetoing a bill of Congress, the Senate refusing to confirm a judge, or a judge rejecting an unconstitutional decision of the White House or legislature, Congress’ ability to rein in the executive has always proved the most daunting challenge.

A presidential veto can be overridden by Congress, but that requires the two-thirds approval of both chambers, something only possible in the case of legislation boasting enormous, bipartisan popularity, such as the 2008 Medicare funding bill unsuccessfully vetoed by George W. Bush, or President Clinton’s attempt a decade earlier to cancel popular military spending initiatives in a variety of districts held by politicians of both parties. In all, there have been fewer than 10 overrides in the past 20 years.

Then there’s impeachment, which though actually easier than overriding a veto — requiring, as it does, merely a two-thirds majority in the Senate and a simple majority in the House — has become perhaps the single most stigmatized provision of the US constitution. America’s long tradition of presidential stability has made even contemplating the removal of a president mid-term a taboo of enormous proportions, a fact only further complicated by the legacy of the Clinton years, which established something of a legal-cultural consensus that presidents only deserve to be unseated for serious criminal misdeeds, as opposed to merely moral or political ones.

To be sure, Congress can handicap a president. They can defund his pet projects, as Republicans are always threatening to do with Obamacare, or simply ignore his requests for action, as has been the case with… well, you name it. But as modern presidents have embraced an increasingly maximalist understanding of their constitutional powers, the rising challenge for Congress has been the question of how to restrain a president whose most objectionable decisions are made unilaterally.

Barack Obama has often interpreted his mandate in unusual ways. A common refrain, echoed most recently during his Rose Garden vow to “fix as much of our immigration system as I can on my own without Congress,” is that the need to make policy supersedes the need to respect constitutional procedures for making it.

In the case of immigration, the President is tilling familiar ground. In 2012 he unilaterally declared a two-year amnesty (since extended to four) for the approximately 800,000 illegal immigrants who arrived in America as children. It was a move explicitly intended to compensate for Congress’ failure to pass the so-called Dream Act a year earlier, which promised similar legal relief for America’s inadvertent aliens. Where legislation failed, rule-by-fiat would succeed.

Selective enforcement of the law has likewise been the preferred Obama approach to drug policy. In 2009, Attorney General Holder declared the United States would not enforce federal drug legislation in states that had legalized marijuana for medicinal purposes, and in 2013 he expanded that blind spot to include states that legalized it for recreational use, too. The Justice Department has announced similar plans to stop prosecuting drug offenders when they deem the mandatory punishments excessively harsh. The underlying logic, apparently, is that laws should only be upheld to the extent they serve the President’s ideological ends.

Then there’s Obamacare, whose finer points were all implemented through executive action, most notably the delaying of the everyone-has-to-have-insurance-now deadline (Congress’ law said six months ago; the President says 2016), but also this whole business of forcing employers to cover morning-after birth control, which the Supreme Court recently designated an unjust burden on corporate religious freedom.

In response to the administration’s handling of the Obamacare rollout in particular, Speaker John Boehner has announced he plans to sue the White House for unconstitutional behavior, namely a dereliction of the duties mandated by Article II, Section 3: “[The President] shall take Care that the Laws be faithfully executed…” Though what specific redresses the suit will seek have yet to be disclosed, an ideal ruling would presumably compel the administration to begin imposing the Obamacare insurance mandate right away — you know, like the law was supposed to.

Is this wise? The legal establishment seems skeptical. Asking the judicial branch to resolve a conflict between the executive and legislative branches has little precedent in American history, elevating, as it does, the courts to the status of supreme referee of intergovernmental jurisdictional disputes — itself a proposition of dubious constitutionalism. On the other hand, the more constitutionally orthodox prescription for Congressional problems with a president — impeachment — seems not only absurdly radical, but politically suicidal. But still, you gotta do something.

President Obama’s Republican predecessor, of course, faced constant abuse of power criticisms of his own, though it’s worth noting that much of the Bush-bashing involved disputes over what is and isn’t within the president’s prerogative as “commander-in-chief,” one of the constitution’s most disputed phrases. In the end, Congressional Democrats elected to do little more than obstruct, complain, and run out the clock — a technique Republicans may ultimately have no choice but to emulate.

Term limits have always been controversial, but they remain the only long-term defense against an executive restrained by little else.

