News






 

Why does Scotland want to leave?

As a foreigner, it’s often hard to view the cause of Scottish independence as anything but hopelessly impenetrable. By definition, any fight in which one rich, democratic, first world nation seeks to secede from another will be fought over exceedingly small beans — usually a constitutional power imbalance between federal and local political authorities that’s more theoretically problematic than anything — simply because there’s little else left. Ably represented by both a parliament of their own and 59 seats in London, no Scot even rhetorically purports to be “oppressed” by the English in any genuine way — their nationalist movement is merely born from a people whose patriotic ambition is too great to be realized as a minority, even a disproportionately powerful one, within a shared-power state. Theirs is a uniquely 21st century movement of liberation, where worries of aesthetics and self-actualization sit in place of war and tyranny.

Canada is the only other G7 nation with a viable separatist movement, so Canadian elites tend to take special interest in Scottish nationalism (often to an embarrassing degree, as was the case with the Globe and Mail’s painfully patronizing “open letter to Scotland from your Canadian cousins”), and the parallels between Scotch separatism and the French-Canadian variety are broad indeed.

Like Quebec, Scotland was absorbed into a more powerful nation amid protest, which gave rise to a remarkably flexible culture of perpetual grievance. A deep animosity towards the colonizer, originally rooted in crimes of centuries past, has proven easily adaptable to modern concerns.

In recent years, the most popular mutation in both societies has been a sort of leftist superiority complex based on a greater embrace of statism than the cruel and stingy motherland — Quebeckers brag about their cheaper colleges and seven-dollar-a-day daycare, Scots of their more generous pensions and higher-quality public health care. In both cases the oppressed parties whine about the fundamental indignity of having their social-democratic ambitions held back by an unrepresentative right-wing federal government; in both cases, this ideological righteousness is undermined by fiscal hypocrisy. The comparatively generous nanny states of Scotland and Quebec are only sustainable through deficit spending propped up by hearty federal subsidies (and in Scotland’s case, federal oil royalties), a paradox to which the independentistas offer pride in place of solution.

Yet the Quebec analogy is not precise, and in some ways Scottish nationalism perhaps more closely parallels the insecure nationalism of Canada itself. Both Scots and Canadians affect enormous offense at being mistaken for their largely culturally indistinguishable southern neighbors, for instance (indeed, in early Canadian history one often comes across the Scotch/English metaphor as a preferred analogy for the English-speaking nations of North America), and the rawness of this insecurity has ensured no price is too high to pay for the protection of “cultural sovereignty,” even when that entails government’s heavy-handed manufacture of cultural distinctions where none previously existed.

Outsiders are often surprised to learn, for instance, that much of the charming quaintness we associate with “ancient” Scottish culture — family tartans, caber-tossing, highland dancing, the Loch Ness Monster, etc. — is actually a decidedly recent creation born from the often wildly speculative Celtic revival movement of the late 1800s and early 1900s — analogous, in the Canadian context, to Ottawa’s post-war creation of patriotic paraphernalia like the maple leaf flag and the Order of Canada. Edinburgh’s efforts to impose Gaelic (spoken by around 1% of the Scottish population) as an official language and encourage its teaching in schools in spite of any readily apparent public need or desire will sound similarly familiar to anyone acquainted with Canada’s own linguistic flights of fancy — bringing French to the arctic, say.

Whether independence is right for the Scots is obviously their question to answer, and I won’t claim to understand the utilitarian nuances of the pro and con pitches, which, as previously noted, are bound to be exceedingly technical in the absence of anything more pressing.

That said, Scotland is not an impossibly foreign land, and it’s hard to escape the impression that Scottish separatism is a cause bound up in many of the destructive political fads distressingly familiar to western nations of all stripes, including the professionalization of victimhood, the belief that paranoid segregation is more principled than cooperative assimilation, the state-backed construction of identity as a means of denying a naturally-occurring one, and the belief that a minority’s quest for “justice” should supersede — or even violate — their economic self-interest.

If voting “No” helps curb that tide, I’m all for it.

Attack of the Giant Cell Phones

A while ago, CNN.com commissioned me to draw one of these giant “longform” essay comics that are all the rage these days. The topic they gave me was giant cellphones, which, I have to admit, was not a subject I was incredibly passionate about, but I did my best to make it interesting just the same. It was the first time I ever worked for a client as big as CNN, and the finished comic is the result of a lot of back-and-forth edits and revisions, but hopefully the final product is still recognizably my own.

Check out “Attack of the Giant Cellphones” here.

Bigger problems in public education

My home province of British Columbia is currently providing the world with a useful service — a vivid case study of all that’s wrong with government control of education.

B.C.’s public school teachers were given the right to strike in 1987, and in the years immediately following, most did, disrupting classes in 48 of the province’s 59 school districts. In 1994 the NDP government of the day responded by imposing province-wide contract bargaining, supposedly to help make the employer-employee relationship more efficient and cost-effective. It’s proven neither.

British Columbians of the millennial generation — such as myself — became guinea pigs of this bold labor experiment. Anyone who attended grade school in B.C. over the last decade saw their K-12 education upset by no fewer than six distinct periods of turmoil born from the consolidated might of the British Columbia Teachers’ Federation (BCTF), including one-day (2002), 10-day (2005), and three-day (2012) province-wide strikes, as well as numerous lesser “job action” boycotts of extra-curricular duties. When I was 12th grade student president, for instance, I had to hire private security firms to supervise our dances because no teachers would.

Such tactics seem to work. B.C. teachers have received an annual raise most years since 1994, though the union-government relationship has proven so hateful and obstinate that three of the four post-’94 contracts had to be imposed by parliamentary diktat following negotiation breakdowns, with the sole successful one largely a hurried effort to buy labor peace in the run-up to the 2010 Vancouver Olympics.

Teachers now want another raise, and the Liberal government nods furiously that they certainly deserve one, as the poor souls make a mere 70–80 grand a year, a salary shared by such indigents as cops, lawyers, and engineers. And that’s not counting an additional $10,000 in benefits so exhaustive that current negotiations center on things like fertility drugs and massage therapy simply because there’s little else left. According to The Globe and Mail, test scores have been slumping in the province over the last 10 years, but the government long ago discarded the notion that collective pay should be tied to collective performance. The result is the standard dysfunction of state-run industry: it costs ever more to deliver — at best — status quo results.

The Education Minister says he’s prepared to offer a 7% pay hike over six years; the teachers want 8% over four. They also want to teach fewer kids in general and fewer disabled children in particular, the sort of basic job-description issue labor unions were never intended to negotiate, but something the BCTF considers its inherent right to control. Theoretically, the government could simply legislate a hard answer to this question once again, but recent Canadian jurisprudence has declared collective bargaining to be a constitutionally protected right, and the BCTF is spending a lot of its members’ money at the moment attempting to assert this right in the hopes the courts will render permanently invalid a 2002 law that withdrew classroom issues from the realm of negotiable matters.

A government monopoly on education coupled with the BCTF’s monopoly on educators has trapped BC schools in an endless cycle of blackmail: taxpayers are made to bow to ever-growing union demands or face tremendous disruption in the lives of their children. This extortion is softened through savvy PR that seeks to portray teachers as selfless victims of sadistic politicians, a villain the public can be easily mobilized against.

Those who remain unmoved by the sorts of sheltered teacher sob stories that form the backbone of this strategy (“sometimes I have to work at home!”) are accused of lacking empathy, though in my case I find having a parent in the profession only makes my heart harder. My mother taught high school for 30 years and was certainly hard-working and all the rest of it, but any complaints I heard always seemed more than compensated for by what appeared to be a virtually limitless bag of sick days, half-days, personal days, conference days, and professional development days to soften the blow. Plus all of summer to spend with her family — and much of the greater Christmas season.

In any case, a teacher who complains about how hard she works is basically a sucker. Since successive BCTF contracts have made it all but impossible to fire a teacher, there’s really no incentive for them to perform beyond the bare minimum of competence, an unfortunately popular standard. Hard-working teachers who produce great results are far more victimized by the inherently socialist nature of collective bargaining than by stingy governments, since province-wide, one-size-fits-all wages by design provide no incentive for the good to get better, but all the incentive in the world for the bad to remain so.

The current strike, which began in June, is the longest in British Columbia history, and its longevity has deepened a culture of polarization in which one is expected to profess unwavering fidelity to either the embattled teachers or the besieged Liberal administration. I have little natural affinity for either, reserving my harshest contempt for the larger system that produced both.

Murdered Aboriginal Women and the Politics of Moral Panic

At one time, the left was rightly skeptical of conservatives who sought to manufacture public hysteria as a means to achieving their ideological ends. Today, they’ve become masters of the craft.

Canada is currently in the midst of something of a fashionable fluster over the plight of “missing and murdered indigenous women,” a five-word phrase that’s become a ubiquitous part of the Canadian socio-political vocabulary, complete with its own hashtag. There now exists an archipelago of institutions devoted to the #MMIW cause, which presents itself as one of modern Canada’s darkest crises. The common refrain that the federal government has an obligation to commission a report on the phenomenon reached a fever pitch this week as the various premiers of the Canadian provinces unanimously threw their weight behind the idea.

The only problem? There is no national epidemic of missing or murdered aboriginal women in Canada. Or at the very least, this exceedingly specific worry is not supported by any exceedingly specific data.

According to RCMP statistics, the percentage of female Canadian murder victims possessing aboriginal ancestry has remained largely constant over the last three decades, at around 14%. While this figure is extraordinarily high given that aboriginal women only comprise about 4% of the country’s female population, such victim overrepresentation is no less true of aboriginal men, whose disproportionate murder rate is an even larger statistical outlier (17%). Likewise, since most aboriginal murder victims die at the hands of other aboriginals — usually family, friends, or lovers — such high victim rates largely reflect the fact that aboriginals of both genders simply murder more, period.

These are dark statistics, but they illustrate a broad phenomenon, rather than a narrow one — aboriginal Canadians are the most homicide-prone demographic in Canadian society at the moment, and their violence is mostly internal. This, in turn, is the predictable outcome of the fact that aboriginals disproportionately suffer from the well-known social pathologies that make Canadians of all races more likely to murder and be murdered — broken homes, drug and alcohol abuse, poverty, lack of education, lack of opportunity in general, and so on. Summoning Ottawa to investigate aboriginal murders would be akin to asking the government to investigate murder in general. Canada’s grotesquely high aboriginal homicide rate can be explained succinctly: the same reasons, only more so.

But if the stats are so unambiguous, from where did the panic about missing and murdered indigenous women arise?

Much of it seems born from the legacy of the 2007 trial of the hideous Vancouver serial killer Robert Pickton, who’s estimated to have slaughtered as many as 49 prostitutes at his pig farm during the 1990s. Pickton’s arrest offered a grim conclusion to what had previously been one of the city’s great mysteries — why had so many prostitutes gone missing over the years? — and a related class-critique follow-up: why wasn’t more effort being exerted to find them? A 2012 report commissioned by the British Columbia government concluded the predictable: law enforcement was indifferent to the underclass.

About a third of Pickton’s 33 identified victims were aboriginal, a fact that probably owed more to the disproportionate presence of aboriginal women in the Canadian sex trade than anything else. Yet in the aftermath of his arrest, a narrative emerged that the broad societal nonchalance to his victims’ disappearances and deaths offered a useful case study of the particular plight of Canada’s native women, and that the murders themselves were symptomatic of a predatory white patriarchy prone to viewing indigenous women as subhuman and expendable.

Aboriginal and feminist activists had an obvious interest in pushing this perspective, yet as is the case with most serial murderers, it was always a stretch to claim the Pickton phenomenon was representative of anything beyond his own idiosyncratic evil. Under ordinary circumstances, the RCMP estimates, 88% of aboriginal murder victims have their cases promptly solved by the police — a rate indistinguishable from that of victims of other races — and instances of white-on-Native violence, to say nothing of the spectacularly sociopathic sort, remain rare in Canada.

The rate at which aboriginal men and women are killing each other in this country is a national disgrace, and, as the prime minister recently declared to absurd controversy, a matter that deserves the full focus of the criminal justice system. But to reframe the status quo through narrow narratives of neo-colonialist misogyny is to bury an unsensational problem of generic criminality beneath trendy academic ideologies of gender, race, and privilege — largely for the benefit of those who truck in such theories.

To knowingly peddle distractions on the pretense of seeking justice for victims is to commit a tremendous disservice to both.

Pastafarians and Progressivism

I was doing a hit on Sun News this morning (“doing a hit” is what we media bigshot types call “going on a show”) and the guy hitting after me was Obi Canuel, a devotee of the Church of the Flying Spaghetti Monster. Obi’s currently waging a quixotic — and highly media-friendly — battle with the British Columbia DMV to let him take his driver’s license photo with a colander on his head, as his “faith” demands.

I spoke a bit with Obi after the fact and found him to be a good-natured, gentle fellow, if perhaps a tad naive. I asked him about the support he’s getting for his crusade and he got sort of wide-eyed. “To be honest,” he said, “some of my supporters are pretty… racist.” To which I wanted to reply, “well, duh.”

The whole Spaghetti Monster thing is an interesting metaphor, and not just in the way the Spaghetti Monster people themselves want.

Pastafarianism, as the quote-unquote “religion” is known, was initially just one more way for smug liberal atheists to have fun mocking fundamentalist Christians, this time via the magic of Dada absurdism. The Flying Spaghetti Monster church believes global warming is caused by pirate ships and rejects the theory of gravity in favor of “Intelligent Falling.” It’s not hard to grasp what they’re going for with this.

Folks like Obi, however, take things a step further. By insisting the state recognize their right to wear colanders on their heads in some obtrusive context, they shift the teasing away from the politics of evangelical right-wingers and towards the well-known headgear accommodation demands of Muslims and Sikhs. This is more delicate territory.

There are a lot of Canadians who believe government concessions for religious minorities — what the Quebeckers refer to as the doctrine of “reasonable accommodation” — have tilted too far in favor of third world immigrants and their unapologetically exotic customs: customs rooted in religious or cultural assumptions that threaten to disrupt or undermine the values of Canada’s majority, such as extraordinarily draconian codes of modesty or demands of female subordination. I don’t consider this kind of attitude “racist” per se, but in the sense we now carelessly slap that label on anyone possessing any anxiety about minority behavior, it’s obvious why “racists” of this sort would find common cause with a guy making biting mockery of a famous minority demand.

As I noted in an earlier essay, I believe one of the defining political cleavages of coming decades will be a growing tension between the pro-science left and the post-modern left. This is because 21st century progressives have yet to definitively decide what they’re all about — embracing the hard answers of biophysical reality, or the non-judgmental acceptance of all identities, behavior, and beliefs as equally valid.

Proud atheism is widely demanded as evidence of one’s seriousness as an impeccably rationalistic thinker, a virtue held in highest regard by leftists who want to place data, technology, and centralized planning at the core of policymaking. Religious argument has no place in this world; indeed, it’s seen as the stuff of extraordinarily counter-productive ignorance and superstition.

But while secular liberals are generally comfortable leveling such harsh words against white Christians, whose privileged standing in society is taken as self-evident, there is of course no shortage of multicolored overseas faithful who believe “anti-science” things just as hard, and their domestic presence is steadily increasing thanks to an aggressively multicultural immigration policy. For the progressive, this poses a crisis: one can criticize minority religions with the same vigor and fire heretofore used to bash Christians, and thus flirt with “racism,” or give minority religions a pass in the name of respecting diversity, and flirt with anti-scientific superstition.

The notion that the left can maintain a permanent ideological coalition of people inherently tolerant of all cultures and inherently critical of religious justifications for public behavior seems fraught. I don’t know what strategy could possibly be employed to keep these two sides, with their wildly distinct notions of “truth,” on amicable terms.

But as a conservative, I guess it’s not really my problem.

 

Ferguson Biases

The Ferguson story is complicated, but its reception is not. We do not know how or why or under what circumstances young Michael Brown was killed by officer Darren Wilson, but we do know, in astonishingly precise detail, how we feel about it.

To say the Ferguson tragedy is being “politicized” isn’t entirely accurate, since (as of now) few figures of high politics have attempted to exploit its heated aftermath for partisan gain — with the possible exception of the ever-scrounging Rand Paul. But most analysis of the episode has certainly been reflective of American political culture, and the various interests and assumptions that increasingly define it.

This includes:

  • The maintenance of perpetually adversarial relations among racial groups through endless stoking of fear, suspicion and grievance;
  • The dismissal of much of established authority as illegitimate, with power that derives solely from force and oppression, usually exercised with extreme prejudice;
  • The inflation of shameful incidents into symptoms of larger “disturbing trends” to shame and frighten the public into a state of moral panic;
  • Conspiracy theories that “the media” works as a unified force to actively spread misinformation and sire hate;
  • Notions that “justice” is not a neutral evidence-based process, but a retributive weapon to be wielded against those we believe to have done wrong.

Such attitudes are the essence of Colbertian “truthiness” — the idea that realities about the world should derive from plausibility, rather than fact — and if left to fester, will rot any civilization in which they’re tolerated. That what happened to Michael Brown is undeniably a tragedy does not excuse doubling-down on destructive ideological clichés any more than it excuses rioting or preposterous police militarization.

America, quite simply, does not have a police-murdering-innocent-people problem. It may well have a disproportionate arrest of African-Americans problem (an at least partial outgrowth of its disproportionate number of crimes committed by African-Americans problem) and even a police brutality problem, but to argue Michael Brown’s death was in any way representative of some greater phenomenon — as many activists are doing — is to ignore precisely what made his killing newsworthy in the first place. Namely, its rarity.

As NPR reported last month, of the over 98 million arrests made in the United States over the last seven years, around 0.005% resulted in what the Justice Department deemed a “homicide.” Even if 100% of all such victims were black, which they obviously are not, it remains an occurrence so freakishly obscure it resists much use as a valuable statistic of larger social trends. Larger social trends, incidentally, reveal massively declining black youth arrest rates.

Likewise, though the #ifiwasgunneddown hashtag has prompted much sympathetic nodding of heads, there’s equally scant evidence the “press” as some collective entity plots to present the public image of black victims of violence in the worst possible light. On the contrary, in the Trayvon Martin case, much greater controversy arose over the media not portraying him thuggishly enough, and instead erring on the side of progressive political correctness by using an outdated, innocuous photo.

It should also be said — and it’s sad that this is even an insightful thing to note these days — that no one really knows what happened to Michael Brown. As I write this, the chief of the Ferguson police department has just released security cam footage of Michael Brown minutes before his death, in which he can be seen stealing a box of cigars from a convenience store and roughing up its clerk. This is of course still miles away from an acceptable justification for killing an unarmed youth, but it does remind us that these sorts of stories often involve a lot of moral gray, with facts that refuse to neatly conform to the good vs. evil morality play many desire.

A peaceful society is one whose citizens treat each other with tolerance and compassion, and where the disciplinary powers of state authority are exercised with sensitivity and restraint. But so too does peace require a populace whose reality is shaped by observation and fact, not hostile suspicion and pre-determined conclusions.

To cling without proof to assumptions of the most dark and cynical sort is to make ourselves not merely intellectual prisoners of our worst fears and biases, but architects of the very world we claim to want to avoid.

Trudeau’s Promised Extremism

Whether a politician can harm himself veering too far to the social left is not a question we’re used to contemplating, destructive extremism on issues like abortion and Islam being traditionally understood as dysfunctions of the right. Yet if Canada’s Conservatives have their way, the dangers of unchecked social progressivism will be one of the defining themes of the country’s 2015 general election, which will pit the square, bourgeois sensibilities of Prime Minister Stephen Harper against the dramatic permissiveness of Liberal leader Justin Trudeau.

Trudeau is young and peppy, and his energetic spontaneity can be an obvious asset, but it has also resulted in several of his substantive policy positions having their roots in flippant remarks. At a Vancouver barbecue around this time last year, Trudeau offered an impromptu declaration that he was in favor of legalized marijuana, and the statement has been the stuff of endless Tory attack ads ever since.

His opponents’ glee is warranted. A recent Ipsos poll commissioned by the Justice Department revealed Canadians’ views on pot have yet to reach consensus. Legalization remains the plurality preference, with around 37% in favor, while 33% back the milder course of merely decriminalizing the drug — essentially the de facto reality today — and about a quarter want either for things to explicitly stay the same or for punishments to get harsher.

Such numbers expose a lingering apprehension about the proper place for pot in our society, yet Trudeau’s position is indifferent to subtlety. By the standards of the status quo, it is, in fact, an extreme position, and if Justin is embarrassed that it’s drawing the loud endorsement of other extremists — like flamboyant pot advocates Marc and Jodie Emery — he has only himself to blame. Post-hoc attempts at moderation, couched in assurances that his real marijuana agenda is simply to “regulate and control” a problematic substance — whose use, he is now quick to scold, has actually increased under Harper’s rule — merely carry an odor of insecurity.

The pot numbers bring to mind a similar recent poll on abortion, which revealed no less public division: 13% believe the procedure should be banned in all or virtually all circumstances, 24% want it very tightly limited, while 52% want no legal restrictions whatsoever. In the face of such nuance, Trudeau again wields a blunt weapon — a complete ban on pro-life candidates within his party. Even in a country as famously anxious about “re-opening the abortion debate” as Canada, the extreme premise that pro-life politicians should simply not exist generates upwards of 70% disapproval. The Harper Tories, whose believe-anything/do-nothing abortion stance has historically delighted no one, suddenly appear thoughtful and pragmatic.

Nuance was equally absent in Trudeau’s response to last week’s big mosque visit brouhaha, in which Sun News — the conservative network I work for — aired a series of stories noting that in 2011 Justin campaigned at a radical Montreal mosque with al-Qaeda ties.

The Conservatives attacked, but questions were quickly raised. The fact that the mosque had served as a terrorist recruiting post was not widely known until the New York Times broke the story in April of 2011 — a full month after Trudeau’s visit. And even then, it was claimed the mosque’s most vigorous phase of al-Qaeda recruiting occurred during the 1990s. Critics cried cheap shot.

But just as the cover-up is often worse than the crime, it was Trudeau’s excuse that proved worse than the gaffe. Asked to justify his decision to stage a partisan event at the only North American mosque on an official Pentagon watch list, Justin was without apology, and could only chauvinistically sniff that “the US is known to make mistakes from time to time.” On the charge the place was a hotbed of extremism, he smirked that respecting diversity means “you don’t just speak to people who agree with you,” adding “I’m somewhat different from Mr. Harper in that measure.”

The Tories immediately drew analogies to the Liberal boss’ similarly blasé, politically-correct reaction to the Boston Marathon bombings, but the comments were troubling for reasons beyond national security. Regardless of what terror ties existed at the time of his visit, hateful religious fundamentalism of even the non-violent variety should surely test the limits of what any self-respecting progressive politician is prepared to defend in the name of “diversity.”

In winning three consecutive elections, the Harper Tories have displayed great skill in appealing to voter inclinations usually ignored by traditional mythologies of “progressive Canada” — deep apprehension about wild swings in social policy, and even deeper skepticism towards those who equivocate on matters of crime and wickedness in the name of enlightened thinking.

To Harper’s caution and forbearance, Trudeau’s radicalism offers voters an unprecedented alternative. But it’s probably unprecedented for a reason.

Romney 2016

For a land of supposedly endless second chances, it’s striking to notice the viciously unforgiving nature of one of America’s most entrenched political traditions: if you lead your political party to defeat in a presidential election, you never get another try.

Slow news summer that it is, one of the manufactured controversies of the moment is whether Mitt Romney could rise from the ashes of his failed 2012 presidential run to emerge a credible contender in 2016. Former advisors and donors seem to be doing their best to fan the flames, writing editorials, organizing petitions, and leaking choice quotes to anyone who will listen that their guy still has one more fight in him, if only we’d give him a push.

Sounds great to me. Regardless of who deserves the blame, much of what Romney predicted a second term of Obama would bring has been brought, from the post-pullout collapse of Iraqi security to ongoing Obamacare woes to no-end-in-sight dithering on the Keystone pipeline. With 53% of Americans said to want a do-over of 2012, simply announcing “you still can” seems like a pretty compelling pitch at this point. As Allahpundit observes, the I-told-you-so campaign ads “write themselves.”

Yet aside from the man’s personal distaste for another long, expensive slog through the campaign muck (a slog, as the documentary Mitt vividly illustrates, the candidate himself wasn’t even particularly giddy about the last go-round), the lead obstacle to such a supremely rational Republican strategy seems to be American political culture’s entrenched stigmatization of failed presidential nominees.

From where this tradition emerged is hard to say. American history is rife with presidents seeking non-consecutive terms after a loss four years earlier, including Martin Van Buren, Teddy Roosevelt, and Grover Cleveland — who actually won one. In more recent times, we all know the story of Richard Nixon losing to JFK in 1960 only to win renomination — and the White House — in 1968. And obviously there’s scant taboo against seeking a party’s presidential nomination more than once; the vast majority of post-war presidents and vice presidents, in fact, have launched at least one unsuccessful primary bid for their party’s nod prior to getting it.

My own theory is that a series of devastating losses for a succession of weak presidential nominees in the late 20th century — Mondale in ’84, Dukakis in ’88, Dole in ’96 — helped solidify a trope of the “presidential loser” (portrayed hilariously in this Futurama scene), in which failed candidates are just fundamentally pathetic, hapless characters.

This runs contrary to the political culture of most other western democracies, in which losers can, and do, lead their parties to multiple defeats before eventually eking out a win.

Canada’s Stephen Harper lost his first bid for prime minister in the country’s 2004 general election, but retained his party’s support to make a second go in 2006, where he won. The new conservative prime minister of Australia, Tony Abbott, similarly lost his first bid for power in 2010 before winning in 2013. Ditto for Israel’s Benjamin Netanyahu, who was his party’s candidate for prime minister five times, but only won three.

The advantage of multiple runs is obvious: a familiar face means less time is spent campaigning on biography and resume — the dreaded “introducing yourself to voters” — and more time on issues. Voters might be sick of you, sure, but that fatigue is not without strategic benefit: having “heard it all before” applies to insults as well as slogans. Just as criticisms about President Obama’s socialism and hidden agendas seemed stale in 2012, so too would the tired tropes of Romney as an out-of-touch aristocrat be pretty boring in 2016. Indeed, Romney could take particular comfort from the fact that many of the overseas conservative leaders cited above were deemed “too right-wing” during their first run only to have that charge seem considerably less frightening after another couple of years with the progressives in charge.

It’s possible the GOP could do better than Romney in 2016, but it’s equally likely they could do a lot worse, too. He’s certainly a man with strategic and ideological flaws worth considering, but the fact that he ran and lost four years ago shouldn’t be one of them.

 

Iraq’s Parliamentary Problem

Recent coverage of Iraq’s internal breakdown has focused mostly on the rampaging horror of ISIL, and rightfully so. But the comparatively drier story of the political decay of Prime Minister Nuri al-Maliki is a tale inseparably linked to that same violence — or at the very least, to the American response to it.

In his recent New York Times interview, President Obama specifically linked his restrained bombing campaign against select ISIS targets with a desire to keep Maliki weak and unpopular. He would not use American power to “bail out” a flailing government, he said, noting that the United States will not be a firm ally of any prime minister until they prove they’re “willing and ready to try and maintain a unified Iraqi government that is based on compromise.”

Understanding the inability of the Iraqi political class to fulfill this demand is a story of the failure of Iraq’s parliamentary political institutions.

The post-Saddam Iraqi constitution gave the country a parliamentary system molded in traditional European fashion. It featured a party-list-based electoral system, a figurehead president appointed by parliament, and an executive prime minister selected from among the factions of the legislature.

In 2005, the year of Iraq’s first general election, a formal alliance of Shiite parties, led by Dawa, an Iranian-backed ex-terrorist group, won a strong plurality of seats, and after months of negotiations with Kurdish and Sunni parties — whose votes were needed for an outright majority — Dawa deputy leader Nuri al-Maliki was confirmed as prime minister (the party’s actual boss, Ibrahim al-Jaafari, having been deemed too religiously dogmatic).

In elections five years later, Maliki’s Shiite coalition narrowly lost its plurality to the secular, pro-western party of longtime Bush administration darling Ayad Allawi. Yet Maliki was able to stay prime minister by forging a parliamentary alliance with a smaller, more extreme Shiite faction led by a clique of fundamentalist clerics including the now long-forgotten Moktada al-Sadr. This was controversial at the time, but it was consistent with the generally understood parliamentary custom that the incumbent PM should get first crack at forming a coalition government post-election — a precedent ultimately upheld by the Iraqi courts.

Though he had originally come to power with multi-denominational backing, the longer Maliki remained in power, the more brazenly sectarian his government became. This was largely a byproduct of his country’s worsening Sunni-Shiite civil war. A life-long Shiite partisan, Maliki had few qualms about using his position as commander-in-chief to deploy grossly disproportionate violence to crush suspected hotbeds of Sunni extremism (emphasis on suspected) or purge suspicious Sunnis from senior positions in the military, intelligence service, bureaucracy, and cabinet.

Those who expected this dark legacy of division, bloodshed, and favoritism to eventually be rejected by voters were shocked when Maliki’s coalition was able to regain its parliamentary plurality during elections held in April of this year. The Obama administration seemed particularly crestfallen.

Yet good news of a sort arrived this weekend when a fresh procedural bombshell was dropped — word came that Iraq’s president had asked Dawa’s deputy leader, Haider al-Abadi, to assume the prime ministership in Maliki’s place.

Under the terms of the Iraqi constitution, this was within the president’s prerogative — like a constitutional monarch, the Iraqi president is supposed to formally summon the leader of parliament’s “largest bloc” to assemble a government, with Article 76(iii) granting him the additional power to nominate someone else if the initial nominee is unable to get things together within 45 days.

But Maliki had not formally passed that deadline. Despite the fact that this most recent election was held over three months ago, the countdown for assembling a government does not begin until election results are ratified and parliament formally appoints a president — which only happened on July 24. Maliki is also quite indisputably still leader of parliament’s “largest bloc”; members of his coalition have denounced the president’s alternative guy as representing “no one but himself.” Maliki, for his part, has dubbed the whole thing a “coup,” and some are predicting the constitutional standoff may result in a complete collapse of Iraqi political authority at the moment the country needs it most.

It is, of course, naive to blame any country’s political dysfunction entirely on the system of government it uses. Yet it’s hard to deny Iraq’s preexisting political problems have likely been exacerbated by the country’s decision to adopt a complex, European-style parliamentary model, with a proportional representation electoral system that incentivizes politicians who appeal to trans-geographic religious identities and an executive branch that produces rulers who owe their power to a mastery of parliamentary maneuvering, rather than broad-based popular approval.

Had Iraq instead chosen to adopt a blunter presidential system — with a strong executive president elected by multiple-round popular vote and a separately-elected parliament from which ministers could be chosen — many of the country’s problems would doubtless still exist, and possibly even some new ones. Yet the fundamental question of who gets to rule the country would have been far less ambiguous and contestable, and the creation of a unity cabinet much faster and easier.

If Iraq’s political authority does completely break down in coming weeks, the temptation will be strong to insist its people were “never ready” for democracy, and declare the experiment failed. Yet democracy comes in many flavors and the taste Iraqis were given was a decidedly acquired one.

Unfortunately it’s probably too late to try another.

 

The media mess blame game

There was a clever cartoon in the San Francisco Gate some years ago, drawn by the hilarious (and unjustifiably obscure) Don Asmussen. It depicted a newspaper blaring the timely headline: “MEDIA SHOCKED BY DECLINE OF MEDIA — ‘IS THIS THE END OF MEDIA?’ ASKS MEDIA.”

The slow decay of mainstream journalism into a decrepit, profit-hemorrhaging husk is supposed to be one of the great tragedies of our time, and one that provokes media people to produce no shortage of opinions, theories and — most importantly — blame to fling around. That the media may not offer the most objective analysis of this question seems rarely contemplated.

I recently listened to an episode of the Canadaland podcast — which offers weekly media-asks-media analysis of this country’s crumbling journalism scene — about the fall of a short-lived Toronto weekly known as The Grid. The magazine, we were told, “did everything right” but still flopped financially. We were told this in interviews with writers and editors who used to work there, who of course were thoroughly convinced of their own brilliance and competence. There was zero conversation with anyone representing the public, which was a tad odd, as the magazine’s financial failings were explicitly due to unprofitable advertising, which presumably indicates at least some trouble with audience engagement. Instead, fingers were pointed at the traditional hazy devils: management, technology, “trends.”

There is a legitimate concern that journalists are creating what Marxist types would call a class ideology: a collection of defenses for self-interested behavior disguised in the language of morality. The idea that the stories that matter the most are the stories the reporter subculture most enjoys reporting on, for instance. Or that journalistic morality should be forever defined by whatever standards are being used right now.

With increasingly little power left to justify, these ideological tropes now merely constrain journalists’ ability to accurately diagnose their own plight, and dream up viable cures.

John Oliver’s recent viral rant against “Native Advertising” was revealing. At precisely the time folks like The Grid team are bemoaning an advertising-based revenue model that’s failing to deliver the goods, tastemakers of the Official Ideology are waging a furious propaganda war against incredibly lucrative new techniques.

Native advertising is basically just a 2.0 name for “advertorial” content, or an advertisement that takes the form of semi-disguised written copy. It’s not a terribly new practice, nor is it particularly sinister. Two of Oliver’s most horrified examples were a Buzzfeed article about cleaning technology written by Swiffer and a New York Times piece on women’s prisons by the Orange is the New Black people. Yet such mildness is nevertheless denounced as representing a profound existential threat to all that’s right and principled about the journalist’s craft, making those who collaborate “whores” or worse. (One wonders if there were similar conniptions the first time someone suggested printing advertisements in newspapers, period.)

But if media pride has atrophied journalists’ skills at what Orwell dubbed the “constant struggle” of seeing what’s before one’s nose, there appears to be an equally powerful impulse on the part of consumers to abdicate responsibility as well, through a lazy populist righteousness that’s no less ideologically destructive.

My friend Graham wrote a fine essay on Medium the other day lambasting an entitled and hypocritical reader class who constantly demand quality journalism, yet consistently resist purchasing online subscriptions, and indeed, go one step further and install ad-blockers to prevent themselves from even inadvertently providing the necessary revenue to finance this want. Graham chalks this up to brazen cognitive dissonance, but I’d also blame the convenient myth of a biased, superficial media that gets trotted out every so often to justify consumer apathy. This chart by I Fucking Love Science, for instance, which purports to show all the stuff “the media doesn’t cover,” is quite obviously straw-man nonsense, yet such ritualistic denunciations of a supposedly “celebrity obsessed” press, etc., provide a necessary veneer of principle to an otherwise entirely selfish abdication of public responsibility.

As is the case with most troubling societal trends, I’m convinced our current media troubles are mostly cultural at root, and demand cultural solutions.

At the very least, self-flattery will get us nowhere.



