News

Murdered Aboriginal Women and the Politics of Moral Panic

At one time, the left was rightly skeptical of conservatives who sought to manufacture public hysteria as a means of achieving their ideological ends. Today, they’ve become masters of the craft.

Canada is currently in the midst of something of a fashionable fluster over the plight of “missing and murdered indigenous women,” a five-word phrase that’s become a ubiquitous part of the Canadian socio-political vocabulary, complete with its own hashtag. There now exists an archipelago of institutions devoted to the #MMIW cause, which presents itself as one of modern Canada’s darkest crises. The common refrain that the federal government has an obligation to commission a report on the phenomenon reached a fever pitch this week as the various premiers of the Canadian provinces unanimously threw their weight behind the idea.

The only problem? There is no national epidemic of missing or murdered aboriginal women in Canada. Or at the very least, this exceedingly specific worry is not supported by any exceedingly specific data.

According to RCMP statistics, the percentage of female Canadian murder victims possessing aboriginal ancestry has remained largely constant over the last three decades, at around 14%. While this figure is extraordinarily high given that aboriginal women only comprise about 4% of the country’s female population, such victim overrepresentation is no less true of aboriginal men, whose disproportionate murder rate is an even larger statistical outlier (17%). Likewise, since most aboriginal murder victims die at the hands of other aboriginals — usually family, friends, or lovers — such high victim rates largely reflect the fact that aboriginals of both genders simply murder more, period.

These are dark statistics, but they illustrate a broad phenomenon, rather than a narrow one — aboriginal Canadians are the most homicide-prone demographic in Canadian society at the moment, and their violence is mostly internal. This, in turn, is the predictable outcome of the fact that aboriginals disproportionately suffer from the well-known social pathologies that make Canadians of all races more likely to murder and be murdered — broken homes, drug and alcohol abuse, poverty, lack of education, lack of opportunity in general, and so on. Summoning Ottawa to investigate aboriginal murders would be akin to asking the government to investigate murder in general. Canada’s grotesquely high aboriginal homicide rate can be explained succinctly: the same reasons, only more so.

But if the stats are so unambiguous, from where did the panic about missing and murdered indigenous women arise?

Much of it seems born from the legacy of the 2007 trial of the hideous Vancouver serial killer Robert Pickton, who’s estimated to have slaughtered as many as 49 prostitutes at his pig farm during the 1990s. Pickton’s arrest offered a grim conclusion to what had previously been one of the city’s great mysteries — why had so many prostitutes gone missing over the years — and a related class-critique follow-up — why wasn’t more effort being exerted to find them? A 2012 report commissioned by the British Columbia government concluded the predictable: law enforcement was indifferent to the underclass.

About a third of Pickton’s 33 identified victims were aboriginal, a fact which probably owed more to the disproportionate presence of aboriginal women in the Canadian sex trade than anything else. Yet in the aftermath of his arrest, a narrative emerged that the broad societal nonchalance to his victims’ disappearances and deaths offered a useful case study of the particular plight of Canada’s native women, while the murders were symptomatic of a predatory white patriarchy prone to viewing indigenous women as subhuman and expendable.

Aboriginal and feminist activists had an obvious interest in pushing this perspective, yet as is the case with most serial murderers, it was always a stretch to claim the Pickton phenomenon was representative of anything beyond his own idiosyncratic evil. The RCMP estimates that, under ordinary circumstances, 88% of aboriginal murder victims have their cases promptly solved by the police — a rate indistinguishable from victims of other races — and instances of white-on-Native violence, to say nothing of the spectacularly sociopathic sort, remain rare in Canada.

The rate at which aboriginal men and women are killing each other in this country is a national disgrace, and, as the prime minister recently declared to absurd controversy, a matter that deserves the full focus of the criminal justice system. But to reframe the status quo through narrow narratives of neo-colonialist misogyny is to bury an unsensational problem of generic criminality beneath trendy academic ideologies of gender, race, and privilege — largely for the benefit of those who truck in such theories.

To knowingly peddle distractions on the pretence of seeking justice for victims is to commit a tremendous disservice to both.

Pastafarians and Progressivism

I was doing a hit on Sun News this morning (“doing a hit” is what we media bigshot types call “going on a show”) and the guy hitting after me was Obi Canuel, a devotee of the Church of the Flying Spaghetti Monster. Obi’s currently waging a quixotic — and highly media-friendly — battle with the British Columbia DMV to let him take his driver’s license photo with a colander on his head, as his “faith” demands.

I spoke a bit with Obi after the fact and found him to be a good-natured, gentle fellow, if perhaps a tad naive. I asked him about the support he’s getting for his crusade and he got sort of wide-eyed. “To be honest,” he said, “some of my supporters are pretty… racist.” To which I wanted to reply, “well, duh.”

The whole Spaghetti Monster thing is an interesting metaphor, and not just in the way the Spaghetti Monster people themselves want.

Pastafarianism, as the quote-unquote “religion” is known, was initially just one more way for smug liberal atheists to have fun mocking fundamentalist Christians, this time via the magic of Dada absurdism. The Flying Spaghetti Monster church blames global warming on the worldwide decline of pirates and rejects the theory of gravity in favor of “Intelligent Falling.” It’s not hard to grasp what they’re going for with this.

Folks like Obi, however, take things a step further. By insisting the state recognize their right to wear colanders on their heads in official contexts, they shift the teasing away from the politics of evangelical right-wingers and towards the well-known headgear accommodation demands of Muslims and Sikhs. This is more delicate territory.

There are a lot of Canadians who believe government concessions for religious minorities — what the Quebeckers refer to as the doctrine of “reasonable accommodation” — have tilted too far in favor of third-world immigrants and their unapologetically exotic customs, customs rooted in religious or cultural assumptions that threaten to disrupt or undermine the values of Canada’s majority, such as extraordinarily draconian codes of modesty or demands of female subordination. I don’t consider this kind of attitude “racist” per se, but in the sense we now carelessly slap that label on anyone possessing any anxiety about minority behavior, it’s obvious why “racists” of this sort would find common cause with a guy making biting mockery of a famous minority demand.

As I noted in an earlier essay, I believe one of the defining political cleavages of coming decades will be a growing tension between the pro-science left and the post-modern left. This is because 21st century progressives have yet to definitively decide what they’re all about — embracing the hard answers of biophysical reality, or the non-judgmental acceptance of all identities, behavior, and beliefs as equally valid.

Proud atheism is widely demanded as evidence of one’s seriousness as an impeccably rationalistic thinker, a virtue held in highest regard by leftists who want to place data, technology, and centralized planning at the core of policymaking. Religious argument has no place in this world; indeed, it’s seen as the stuff of extraordinarily counter-productive ignorance and superstition.

But while secular liberals are generally comfortable leveling such harsh words against white Christians, whose privileged standing in society is taken as self-evident, there is of course no shortage of multicolored overseas faithful who believe “anti-science” things just as hard, and their domestic presence is steadily increasing thanks to an aggressively multicultural immigration policy. For the progressive, this poses a dilemma: one can either criticize minority religions with the same vigor and fire heretofore used to bash Christians, and thus flirt with “racism,” or give minority religions a pass in the name of respecting diversity, and flirt with anti-scientific superstition.

The notion that the left can maintain a permanent ideological coalition of people inherently tolerant of all cultures and inherently critical of religious justifications for public behavior seems deeply fraught. I don’t know what strategy could possibly be employed to keep these two sides, with their wildly distinct notions of “truth,” on amicable terms.

But as a conservative, I guess it’s not really my problem.

 

Ferguson Biases

The Ferguson story is complicated, but its reception is not. We do not know how or why or under what circumstances young Michael Brown was killed by officer Darren Wilson, but we do know, in astonishingly precise detail, how we feel about it.

To say the Ferguson tragedy is being “politicized” isn’t entirely accurate, since (as of now) few figures of high politics have attempted to exploit its heated aftermath for partisan gain — with the possible exception of the ever-scrounging Rand Paul. But most analysis of the episode has certainly been reflective of American political culture, and the various interests and assumptions that increasingly define it.

These include:

  • The maintenance of perpetually adversarial relations among racial groups through endless stoking of fear, suspicion and grievance;
  • The dismissal of much of established authority as illegitimate, with power that derives solely from force and oppression, usually exercised with extreme prejudice;
  • The inflation of shameful incidents into symptoms of larger “disturbing trends” to shame and frighten the public into a state of moral panic;
  • Conspiracy theories that “the media” works as a unified force to actively spread misinformation and sire hate;
  • Notions that “justice” is not a neutral evidence-based process, but a retributive weapon to be wielded against those we believe to have done wrong.

Such attitudes are the essence of Colbertian “truthiness” — the idea that realities about the world should derive from plausibility, rather than fact — and if left to fester, will rot any civilization in which they’re tolerated. That what happened to Michael Brown is undeniably a tragedy does not excuse doubling-down on destructive ideological clichés any more than it excuses rioting or preposterous police militarization.

America, quite simply, does not have a police-murdering-innocent-people problem. It may well have a disproportionate arrest of African-Americans problem (an at least partial outgrowth of its disproportionate number of crimes committed by African-Americans problem) and even a police brutality problem, but to argue Michael Brown’s death was in any way representative of some greater phenomenon — as many activists are doing — is to ignore precisely what made his killing newsworthy in the first place. Namely, its rarity.

As NPR reported last month, of the over 98 million arrests made in the United States over the last seven years, around 0.005% of them resulted in what the Justice Department deemed a “homicide.” Even if 100% of all such victims were black, which they obviously are not, it remains an occurrence so freakishly obscure it resists much use as a valuable statistic of larger social trends. Larger social trends, incidentally, reveal massively declining black youth arrest rates.

Likewise, though the #ifiwasgunneddown hashtag has prompted much sympathetic nodding of heads, there’s equally scant evidence the “press” as some collective entity plots to present the public image of black victims of violence in the worst possible light. On the contrary, in the Trayvon Martin case, much greater controversy arose over the media not portraying him thuggishly enough, and instead erring on the side of progressive political correctness by using an outdated, innocuous photo.

It should also be said — and it’s sad that this is even an insightful thing to note these days — that no one really knows what happened to Michael Brown. As I write this, the chief of the Ferguson police department has just released security-cam footage of Michael Brown taken minutes before his death, in which he can be seen stealing a box of cigars from a convenience store and roughing up its clerk. This is of course still miles away from an acceptable justification for killing an unarmed youth, but it does remind us that these sorts of stories often involve a lot of moral gray, with facts that refuse to neatly conform to the good vs. evil morality play many desire.

A peaceful society is born from one whose citizens treat each other with tolerance and compassion, and where the disciplinary powers of state authority are exercised with sensitivity and restraint. But so too does peace require a populace whose reality is shaped by observation and fact, not hostile suspicion and pre-determined conclusions.

To cling without proof to assumptions of the most dark and cynical sort is to make ourselves not merely intellectual prisoners of our worst fears and biases, but architects of the very world we claim to want to avoid.

Trudeau’s Promised Extremism

Whether a politician can harm himself veering too far to the social left is not a question we’re used to contemplating, destructive extremism on issues like abortion and Islam being traditionally understood as dysfunctions of the right. Yet if Canada’s Conservatives have their way, the dangers of unchecked social progressivism will be one of the defining themes of the country’s 2015 general election, which will pit the square, bourgeois sensibilities of Prime Minister Stephen Harper against the dramatic permissiveness of Liberal leader Justin Trudeau.

Young and peppy, Trudeau can make his energetic spontaneity an obvious asset, but it has also resulted in several of his substantive policy positions having their roots in flippant remarks. At a Vancouver barbecue around this time last year, Trudeau offered an impromptu declaration that he was in favor of legalized marijuana, and the statement has been the stuff of endless Tory attack ads ever since.

His opponents’ glee is warranted. A recent Ipsos poll commissioned by the Justice Department revealed Canadians’ views on pot have yet to reach consensus. While legalization remains the plurality preference, with around 37% in favor, 33% back the milder course of merely decriminalizing the drug — essentially the de facto reality today — while about a quarter want things to either explicitly stay the same or for punishments to get harsher.

Such numbers expose a lingering apprehension about the proper place for pot in our society, yet Trudeau’s position is indifferent to subtlety. By the standards of the status quo, it is, in fact, an extreme position, and if Justin is embarrassed that it’s drawing the loud endorsement of other extremists — like flamboyant pot advocates Marc and Jodie Emery — he has only himself to blame. Post-hoc attempts at moderation, couched in assurances that his real marijuana agenda is simply to “regulate and control” a problematic substance — whose use, he is now quick to scold, has actually increased under Harper’s rule — merely carry an odor of insecurity.

The pot numbers bring to mind a similar recent poll on abortion, which revealed no less public division. 13% believe the procedure should be banned in all or virtually all circumstances, 24% want it very tightly limited, while 52% want no legal restrictions whatsoever. In the face of such nuance, Trudeau again wields a blunt weapon — a complete ban on pro-life candidates within his party. Even in a country as famously anxious about “re-opening the abortion debate” as Canada, the extreme premise that pro-life politicians should simply not exist generates upwards of 70% disapproval. The Harper Tories, whose believe-anything/do-nothing abortion stance has historically delighted no one, suddenly appear thoughtful and pragmatic.

Nuance was equally absent in Trudeau’s response to last week’s big mosque visit brouhaha, in which Sun News — the conservative network I work for — aired a series of stories noting that in 2011 Justin campaigned at a radical Montreal mosque with al-Qaeda ties.

The Conservatives attacked, but questions were quickly raised. The fact that the mosque had served as a terrorist recruiting post was not widely known until the New York Times broke the story in April of 2011 — a full month after Trudeau’s visit. And even then, it was claimed the mosque’s most vigorous phase of al-Qaeda recruiting occurred during the 1990s. Critics cried cheap shot.

But just as the cover-up is often worse than the crime, it was Trudeau’s excuse that proved worse than the gaffe. Asked to justify his decision to stage a partisan event at the only North American mosque on an official Pentagon watch list, Justin was without apology, and could only chauvinistically sniff that “the US is known to make mistakes from time to time.” On the charge the place was a hotbed of extremism, he smirked that respecting diversity means “you don’t just speak to people who agree with you,” adding “I’m somewhat different from Mr. Harper in that measure.”

The Tories immediately drew analogies to the Liberal boss’ similarly blasé, politically-correct reaction to the Boston Marathon bombings, but the comments were troubling for reasons beyond national security. Regardless of what terror ties existed at the time of his visit, hateful religious fundamentalism of even the non-violent variety should surely test the limits of what any self-respecting progressive politician is prepared to defend in the name of “diversity.”

In winning three consecutive elections, the Harper Tories have displayed great skill at appealing to voter inclinations usually ignored by traditional mythologies of “progressive Canada” — deep apprehension about wild swings in social policy, and even deeper skepticism towards those who equivocate on matters of crime and wickedness in the name of enlightened thinking.

To Harper’s caution and forbearance, Trudeau’s radicalism offers voters an unprecedented alternative. But it’s probably unprecedented for a reason.

Romney 2016

For a land of supposedly endless second chances, it’s striking to notice the viciously unforgiving nature of one of America’s most entrenched political traditions: if you lead your political party to defeat in a presidential election, you never get another try.

Slow news summer that it is, one of the manufactured controversies of the moment is whether Mitt Romney could rise from the ashes of his failed 2012 presidential run to emerge a credible contender in 2016. Former advisors and donors seem to be doing their best to fan the flames, writing editorials, organizing petitions, and leaking choice quotes to anyone who will listen that their guy still has one more fight in him, if only we’d give him a push.

Sounds great to me. Regardless of who deserves the blame, much of what Romney predicted a second term of Obama would bring has been brought, from the post-pullout collapse of Iraqi security to ongoing Obamacare woes to no-end-in-sight dithering on the Keystone pipeline. With 53% of Americans said to want a do-over of 2012, simply announcing “you still can” seems like a pretty compelling pitch at this point. As Allahpundit observes, the I-told-you-so campaign ads “write themselves.”

Yet aside from the man’s personal distaste for another long, expensive slog through the campaign muck (a slog, as the documentary Mitt vividly illustrates, the candidate himself wasn’t even particularly giddy about the last go-round), the lead obstacle to such a supremely rational Republican strategy seems to be American political culture’s entrenched stigmatization of failed presidential nominees.

From where this tradition emerged is hard to say. American history is rife with presidents seeking non-consecutive terms after a loss four years earlier, including Martin Van Buren, Teddy Roosevelt, and Grover Cleveland — who actually won one. In more recent times, we all know the story of Richard Nixon losing to JFK in 1960 only to win renomination — and the White House — in 1968. And obviously there’s scant taboo against seeking a party’s presidential nomination more than once; the vast majority of post-war presidents and vice presidents, in fact, have launched at least one unsuccessful primary bid for their party’s nod prior to getting it.

My own theory is that a series of devastating losses for a succession of weak presidential nominees in the late 20th century — Mondale in ’84, Dukakis in ’88, Dole in ’96 — helped solidify a trope of the “presidential loser” (portrayed hilariously in this Futurama scene), in which failed candidates are just fundamentally pathetic, hapless characters.

This runs contrary to the political culture of most other western democracies, in which losers can, and do, lead their parties to multiple defeats before eventually eking out a win.

Canada’s Stephen Harper lost his first bid for prime minister in the country’s 2004 general election, but retained his party’s support to make a second go in 2006, where he won. The new conservative prime minister of Australia, Tony Abbott, similarly lost his first bid for power in 2010 before winning in 2013. Ditto for Israel’s Benjamin Netanyahu, who was his party’s candidate for prime minister five times, but only won three.

The advantage of multiple runs is obvious: a familiar face means less time is spent campaigning on biography and resume — the dreaded “introducing yourself to voters” — and more time on issues. Voters might be sick of you, sure, but that fatigue is not without strategic benefit: having “heard it all before” applies to insults as well as slogans. Just as criticisms about President Obama’s socialism and hidden agendas seemed stale in 2012, so too would the tired tropes of Romney as an out-of-touch aristocrat be pretty boring in 2016. Indeed, Romney could take particular comfort from the fact that many of the overseas conservative leaders cited above were deemed “too right-wing” during their first run only to have that charge seem considerably less frightening after another couple of years with the progressives in charge.

It’s possible the GOP could do better than Romney in 2016, but it’s equally likely they could do a lot worse, too. He’s certainly a man with strategic and ideological flaws worth considering, but the fact that he ran and lost four years ago shouldn’t be one of them.

 

Iraq’s Parliamentary Problem

Recent coverage of Iraq’s internal breakdown has focused mostly on the rampaging horror of ISIL, and rightfully so. But the comparatively drier story of the political decay of Prime Minister Nuri al-Maliki is a tale inseparably linked to that same violence — or at the very least, the American response to it.

In his recent New York Times interview, President Obama specifically linked his restrained bombing campaign of select ISIS targets with a desire to keep Maliki weak and unpopular. He would not, he said, use American power to “bail out” a flailing government, noting that the United States will not be a firm ally of any prime minister until they prove they’re “willing and ready to try and maintain a unified Iraqi government that is based on compromise.”

Understanding the inability of the Iraqi political class to fulfill this demand is a story of the failure of Iraq’s parliamentary political institutions.

The post-Saddam Iraqi constitution gave the country a parliamentary system moulded in traditional European fashion. It featured a party-list based electoral system, a figurehead president appointed by parliament, and an executive prime minister selected from among the factions of the legislature.

In 2005, the year of Iraq’s first general election, a formal alliance of Shiite parties, led by Dawa, an Iranian-backed ex-terrorist group, won a strong plurality of seats, and after months of negotiations with Kurdish and Sunni parties — whose votes were needed for an outright majority — Dawa deputy leader Nuri al-Maliki was confirmed as prime minister (the party’s actual boss, Ibrahim al-Jaafari, having been deemed too religiously dogmatic).

In elections five years later, Maliki’s Shiite coalition narrowly lost its plurality to the secular, pro-western party of longtime Bush administration darling Ayad Allawi. Yet Maliki was able to stay prime minister by forging a parliamentary alliance with a smaller, more extreme Shiite faction led by a clique of fundamentalist clerics including the now long-forgotten Moktada al-Sadr. This was controversial at the time, but it was consistent with the generally understood parliamentary custom that the incumbent PM should get first crack at forming a coalition government post-election — a precedent ultimately upheld by the Iraqi courts.

Though he had originally come to power with multi-denominational backing, the longer Maliki remained in power, the more brazenly sectarian his government became. This was largely a byproduct of his country’s worsening Sunni-Shiite civil war. A life-long Shiite partisan, Maliki had few qualms about using his position as commander-in-chief to deploy grossly disproportionate violence to crush suspected hotbeds of Sunni extremism (emphasis on suspected) or purge suspicious Sunnis from senior positions in the military, intelligence service, bureaucracy, and cabinet.

Those who expected this dark legacy of division, bloodshed, and favoritism to eventually be rejected by voters were shocked when Maliki’s coalition was able to regain its parliamentary plurality during elections held in April of this year. The Obama administration seemed particularly crestfallen.

Yet good news of a sort arrived this weekend when a fresh procedural bombshell was dropped — word came that Iraq’s president had requested Dawa’s deputy leader, Haider al-Abadi, to assume the prime ministership in Maliki’s place.

Under the terms of the Iraqi constitution, this was within the president’s prerogative — like a constitutional monarch, the Iraqi president is supposed to formally summon the leader of parliament’s “largest bloc” to assemble a government, with Article 76(iii) granting him the additional power to nominate someone else if the initial nominee is unable to get things together within 45 days.

But Maliki had not formally passed that deadline. Despite the fact that this most recent election was held over three months ago, the countdown for assembling a government does not begin until election results are ratified and parliament formally appoints a president — which only happened on July 24. Maliki is also quite indisputably still leader of parliament’s “largest bloc”; members of his coalition have denounced the president’s alternative guy as representing “no one but himself.” Maliki, for his part, has dubbed the whole thing a “coup,” and some are predicting the constitutional standoff may result in a complete collapse of Iraqi political authority at the moment the country needs it most.

It is, of course, naive to blame any country’s political dysfunction entirely on the system of government they use. Yet it’s hard to deny Iraq’s preexisting political problems have likely been exacerbated by the country’s decision to adopt a complex, European-style parliamentary model, with a proportional representation electoral system that incentivizes politicians who appeal to trans-geographic religious identities and an executive branch that produces rulers who owe their power to a mastery of parliamentary maneuvering, rather than broad-based popular approval.

Had Iraq instead chosen to adopt a blunter presidential system — with a strong executive president elected by multiple-round popular vote and a separately-elected parliament from which ministers could be chosen — many of the country’s problems would doubtless still exist, and possibly even some new ones. Yet the fundamental question of who gets to rule the country would have been far less ambiguous and contestable, and the creation of a unity cabinet much faster and easier.

If Iraq’s political authority does completely break down in coming weeks, the temptation will be strong to insist its people were “never ready” for democracy, and declare the experiment failed. Yet democracy comes in many flavors and the taste Iraqis were given was a decidedly acquired one.

Unfortunately it’s probably too late to try another.

 

The media mess blame game

There was a clever cartoon in the San Francisco Gate some years ago, drawn by the hilarious (and unjustifiably obscure) Don Asmussen. It depicted a newspaper blaring the timely headline: “MEDIA SHOCKED BY DECLINE OF MEDIA — ‘IS THIS THE END OF MEDIA?’ ASKS MEDIA.”

The slow decay of mainstream journalism into a decrepit, profit-hemorrhaging husk is supposed to be one of the great tragedies of our time, and it has provoked media people to produce no shortage of opinions, theories, and — most importantly — blame to fling around. That the media may not offer the most objective analysis of this question seems rarely contemplated.

I recently listened to an episode of the Canadaland podcast — which offers weekly media-asks-media analysis of this country’s crumbling journalism scene — about the fall of a short-lived Toronto weekly known as The Grid. The magazine, we were told, “did everything right” but still flopped financially. We were told this in interviews with writers and editors who used to work there, who of course were thoroughly convinced of their own brilliance and competence. There was zero conversation with anyone representing the public, which was a tad odd, as the magazine’s financial failings were explicitly attributed to insufficient advertising revenue, which presumably indicates at least some trouble with audience engagement. Instead, fingers were pointed at the traditional hazy devils: management, technology, “trends.”

There is a legitimate concern that journalists are creating what Marxist-types would call a class ideology: a collection of defenses for self-interested behavior disguised in the language of morality. The idea that the stories that matter the most are the stories the reporter subculture most enjoys reporting on, for instance. Or that journalistic morality should be forever defined by whatever standards are being used right now.

With increasingly little power left to justify, these ideological tropes now merely constrain journalists’ ability to accurately diagnose their own plight, and dream up viable cures.

John Oliver’s recent viral rant against “Native Advertising” was revealing. At precisely the time folks like The Grid team are bemoaning an advertising-based revenue model that’s failing to deliver the goods, tastemakers of the Official Ideology are waging a furious propaganda war against incredibly lucrative new techniques.

Native advertising is basically just a 2.0 name for “advertorial” content, or an advertisement that takes the form of semi-disguised written copy. It’s not a terribly new practice, nor is it particularly sinister. Two of Oliver’s most horrified examples were a Buzzfeed article about cleaning technology written by Swiffer and a New York Times piece on women’s prisons by the Orange is the New Black people. Yet such mildness is nevertheless denounced as representing a profound existential threat to all that’s right and principled about the journalist’s craft, making those who collaborate “whores” or worse. (One wonders if there were similar conniptions the first time someone suggested printing advertisements in newspapers, period.)

But if professional pride has atrophied journalists’ skill at what Orwell dubbed the “constant struggle” of seeing what’s before one’s nose, there appears to be an equally powerful impulse on the part of consumers to abdicate responsibility as well, through a lazy populist righteousness that’s no less ideologically destructive.

My friend Graham wrote a fine essay on Medium the other day lambasting an entitled and hypocritical reader class who constantly demand quality journalism, yet consistently resist purchasing online subscriptions, and indeed, go one step further and install ad-blockers to prevent themselves from even inadvertently providing the necessary revenue to finance this want. Graham chalks this up to brazen cognitive dissonance, but I’d also blame the convenient myth of a biased, superficial media that gets trotted out every so often to justify consumer apathy. This chart by I Fucking Love Science, for instance, which purports to show all the stuff “the media doesn’t cover,” is quite obviously straw-man nonsense, yet such ritualistic denunciations of a supposedly “celebrity-obsessed” press etc., provide a necessary veneer of principle to an otherwise entirely selfish abdication of public responsibility.

As is the case with most troubling societal trends, I’m convinced our current media troubles are mostly cultural at root, and demand cultural solutions.

At the very least, self-flattery will get us nowhere.

Fresh battle lines being drawn in America’s culture wars

The culture war is dead. Long live the culture war.

It’s fashionable to observe that many of the most contentious social policy cleavages of the 1980s — when America’s “culture war” meme first went mainstream — are now the stuff of broad consensus.

Debates on the appropriate presence of public prayer have concluded in the minimalists’ favor. Universal legalization of same-sex marriage is perhaps a year away. Conservatives made peace with unwed motherhood to double down on abortion, and paid for the strategic blunder — a large majority remains in favor of keeping the procedure legal in “all” or “certain” circumstances.

Yet as the previous decades’ debates wind down, fresh moral quandaries about the standards and values of American life emerge. Finding harmonious answers to these new dilemmas of tolerance, identity, and individualism will be a defining struggle of the millennial generation.

For as long as we’ve been fretting about discrimination it’s been said that one man’s innocent quip is another man’s slur. As overt bigotry becomes increasingly rare, it’s the innocent quips that now receive the hottest fire, zealousness having not moderated in conjunction with the lowering of stakes. Today’s tolerance activists speak of “microaggressions,” small indignities of language and manners, such as asking a visible minority where he “came from” or suggesting a woman carry the lighter box. Movies and television shows are now meticulously scrutinized using standards like the “Bechdel Test” to ensure females and minorities are portrayed in the most flattering, self-actualized fashion, with an equally fussy eye cast towards nouveau sins like “exoticism” and “othering.” Even perfectly post-modern public figures like Stephen Colbert and RuPaul have proven but a mere ill-utterance away from triggering shame campaigns from a self-appointed vanguard of “what’s not okay.”

This goal of a zero-tolerance culture is invariably at odds with unsuppressed freedom of expression and that which it produces: honest commentary, diverse storytelling, insightful humour, complexity of language and thought. The critics seek to restore the legitimacy of censorship, or at least self-censorship, in which an extra layer of nervous second-guessing must be applied to all intellectual and creative output in order to make ideas subordinate to a ruling ideology capable of punishing dissent.

Struggles over acceptable means of restraining intolerance are in many ways the outgrowth of a larger philosophical split over the nature of identity, and the privileges a citizen should claim through self-applied labels and group affiliation.

Transgender Americans have found success lobbying for legal inclusion as a class protected from open discrimination. Yet anxieties remain over the community’s existential thesis that gender itself is inherently fluid and subjective — an assertion for which the science is hardly settled, yet one recently decreed official fact by the school board of Vancouver. Today we have autistics who oppose efforts to cure their condition on the basis they were merely born different, not ill, a resistance to the “medicalization” of identity shared by an increasing assortment of unpopular demographics, including the obese (who dismiss BMI measurements as quackery), serious drug users (who euphemise their “abuse” as simply “misuse”), and schizophrenics (who self-identify as “multiples”).

Homosexuality was of course once considered a mental illness. That’s no longer the case today, yet whatever biological variables do explain the phenomenon remain ambiguous, and many are not eager to see things clarified. A faction that values the cultivation and preservation of diverse identities and stigmatizes efforts to assimilate or “fix” deviant behavior is destined to clash with those seeking definitive scientific explanations for life’s mysteries.

A similar cleavage divides those who fetishize the total supremacy of the individual against those who worry about behavior’s societal consequences. A friend of mine recently wrote an essay fretting about the sadomasochism renaissance prompted by Fifty Shades of Grey; he worried that a culture elevating individual “consent” to its highest good will be one thoughtlessly normalizing behavior that’s socially destructive in a broader sense — in this case, violence against women. Increasingly loud proposals to normalize other historic taboos — recreational drug use, prostitution, violent video games, child pornography — spawn similar concern. The anti-individualists ask at what point the pursuit of “harmless” personal pleasure corrupts the virtues of the larger society these individuals comprise. The individualist-supremacists flatly deny the possibility.

None of these tensions are particularly new, but the battlegrounds are decidedly 21st century. As opinions congeal around fresh struggles to balance choice, evidence, identity, and opinion, old understandings of the political divide are overthrown. These new battles pose a particular threat to the progressive left, which may be doomed to split into warring post-modern and libertarian factions.

Divisive, ideological, and often personally threatening, the culture wars of the future will not be pleasant. But few wars are.

Lack of Pride

“When is Pride this year?” straight friends ask, voices rising with equal parts excitement and condescension. As a gay, I’m presumed to be well-versed in such things, but alas, there’s no easy answer. There exists no single “Pride,” after all, simply a long sequence of independent extravaganzas across North America, each observed on conveniently different dates.

Los Angeles Pride happened long ago, for instance, on the traditional first weekend in June. San Francisco Pride — the big one — has come and gone as well, occurring, as it does, on the last weekend of that same month. Vancouver Pride kicks off this Saturday, and Vegas Pride comes a month after that, in early September. Countless other cities are sprinkled somewhere in-between.

Such cleverly staggered scheduling, which allows the don’t-stop-the-music set to engage in summer-long “Pride tours” across the continent, hopefully helps illustrate the fundamental vacuousness of this would-be holiday. It’s certainly one of many factors justifying my profound lack of interest in it. Despite being gay for as long as I can recall, I not only shun Pride, I actively resent the implication that attendance offers any meaningful indication of one’s GLBT acceptance.

I’ve always felt a bit of sympathy for Rob Ford’s various mumbled explanations of why he’s never attended Toronto Pride during his four years as mayor. His no-show status has of course been widely taken as proof of his supposed homophobia, but his official excuse — that he’s simply an old-fashioned guy who finds the garish flaunting of sexuality uncomfortable — seems perfectly reasonable. To modern elite opinion-makers, however, who have done so much to inflate Pride as the culture’s leading litmus test of tolerance, personal uneasiness is a sentiment so exotic it may as well be uttered in Swahili.

While I’m no prude — actually, strike that, I am a prude. And what of it? Flipping through online albums of Toronto Pride 2013, one finds ample documentation of S&M bondage couples, barely-there thongs, buttless chaps, and all manner of grinding, thrusting, jiggling, and twerking. It’s perfectly acceptable to find such things gross or distasteful, and an exploitive cheapening of both sex and the body.

It is no great character flaw to value modesty or dignity, nor is it bigoted to esteem forbearance and control. Libertine attitudes towards sex, nudity, fetishism, and exhibitionism are issues entirely disconnected from the civil rights matter of whether people of divergent sexual orientations are deserving of the same rights and protections as those in the majority. To argue the contrary is to claim possessing a minority sexual preference should be synonymous with sexual deviancy in general — a premise not only dated, but dangerous.

There was a clever Onion piece published more than a decade ago (I doubt such a thing would be written in this more sensitive age) headlined “Local Pride Parade Sets Mainstream Acceptance Of Gays Back 50 Years.”

“I thought the stereotype of homosexuals as hedonistic, sex-crazed deviants was just a destructive myth,” the paper quotes one horrified onlooker. “Boy, oh, boy, was I wrong.”

Sounds about right. Indeed, one has to wonder just how much comfort Pride is even attempting to offer the genuinely sexually conflicted at this point. Considering how much “coming out” anxiety tends to center around fears of lost normalcy, it’s not clear at all how declaring common cause with society’s most brazen display of freakshow non-conformity is a useful means to that end.

Looking at photos of North America’s earliest Pride parades is a window into a different world. The marchers of those days, calmly holding hands with their same-sex partners in sensible polo shirts and penny loafers, were certainly subversive, but only to the extent they were seeking to remind a society in denial of the unavoidability of their existence, and the bland, non-threatening nature of it. Theirs was a call for inclusion in the most literal sense, the welcoming of homosexuals into society’s most central institutions: family, work, religion, politics, and the acceptance of their love as valid as any other sort.

That goal having now largely been achieved, the Pride movement, like so much of the modern Gay Rights activist complex, has become a victim of its own success. As North Americans get used to people being here and queer, the moderate LGBT middle class has drifted away from leadership of the tolerance movement, allowing the wild fringe to fill the void. What results is a historical irony: just as society is most eager to assert its tolerance, Pride redefines the deal. Endorsing the acceptance of ordinary people distinguishable only by what gender they love now demands an additional stamp of approval for all-purpose indecency and licentiousness.

Politicians, corporations, and all manner of interest groups clamor to agree to the terms. But for an increasing lot of gays, it’s hardly obvious why we should care.

The limits of liberalism

Over the last couple of decades, a dominant narrative of North American politics has been the dangers of drifting too far to the right. From Tim Hudak’s doomed bid for the premiership of Ontario to the surprise defeat of the Wildrose party in Alberta to self-destructive Tea Party campaigns across the United States, the explanation for why so many conservatives can’t get it together appears obvious to most. To paraphrase Margaret Thatcher, right-of-center candidates are placing too much emphasis on the adjective and not enough on the preposition.

Far less contemplated these days is whether there is any negative cost to be incurred from drifting too far to the left, particularly now that progressives increasingly define themselves through boastful acceptance of previously-stigmatized personal behaviour.

The aspiring candidacy of Jodie Emery, Vancouver’s so-called “Princess of Pot” and spouse of recently-released Canadian drug lord Marc Emery, may prove a revealing case study.

Mrs. Emery is currently seeking the Liberal nomination in the parliamentary riding of Vancouver East, and she’s hardly hidden the fact that her primary purpose in running is to advocate for the legalization of marijuana, the Emery family’s pet cause. Legalization of marijuana is a position favored by Liberal boss Justin Trudeau, but to suggest the two are on the “same side” of the issue is to betray its moral — and electoral — complexities.

Justin’s stance is essentially a utilitarian one: he sees legalization as a way to battle organized crime and liberate an overburdened criminal justice system. Yet he’s also described consumption of the drug as a “vice” with scant social virtue. His much-ballyhooed admission of prior use was heavily coached, and if not exactly remorseful, was certainly qualified and self-conscious. His legalization plan, though vague, has emphasized the importance of keeping cannabis far from children, and he’s bemoaned that in Harper’s Canada, it’s “easier for youth to access pot than alcohol or cigarettes.”

The Emerys have a slightly different perspective, to put it mildly. As editors of Cannabis Culture magazine, founders of “Pot TV,” proprietors of head shops and seed stores, and MCs of all manner of pot conventions and trade shows, the power couple lead a subculture that views marijuana not as an unavoidable social sin whose ills must be minimized and controlled with compassionate legislation, but as an undeniably positive product with virtues worth celebrating.

“Marijuana is so good! It does so much for so many!” Marc crows in a 2010 YouTube video (ironically devoted to blasting Justin Trudeau as a “f—cking hypocrite” for backing mandatory jail times for drug traffickers). “It’s brought us everything from music to technology to cutting-edge news services, major athletes, every form of entertainment and science and architecture…”

This stance, that pot consumption is harmless, and should be completely destigmatized — if not encouraged — is probably a great deal closer to the views of the Liberal base than their leader’s cautious policy of managing risk without endorsement. Yet Justin’s pragmatism is the result of having enough sense to appreciate that elections are not decided by base alone, but the electorate’s broad centre — those much-coveted middle class, suburban swing voters who remain unfashionably inclined to regard mind-altering substances as a destructive force poisoning the culture of their children and neighborhoods. Considering that Justin’s pot stance is already taking a tremendous drubbing in suburban-oriented Conservative attack ads, it’s unclear if there’s any electoral gain to be had by embracing a darling of the stoner set who lacks even a pretence of pragmatism. In a centralized parliamentary system like ours, it only takes a single rogue candidate to upset a leader’s carefully constructed nuance, and Justin — who’s already shown himself more than capable of torpedoing troublesome candidates — is surely asking himself if Jodie’s a risk worth taking.

A similar dilemma defines the Liberal relationship with legalized prostitution.

Mrs. Emery happily endorsed the idea on Sun News yesterday, declaring it a private, consensual business transaction not terribly different (of course) from the private, consensual business of buying pot. This too, is the sort of proudly permissive position held by much of the Liberals’ ideological base, who prefer to conceptualize the buying and selling of sex as a libertarian thought experiment or abstract goal of sexual liberation. Canadians in the cautious middle, alas, inclined as they are to contemplate problems from a less academic angle, are more likely to fret about whether legalization of prostitution will simply increase its presence in their communities, as legalization of banned things is wont to do.

Though critical of the Harper Government’s john-battling prostitution bill, Trudeau’s party has not embraced the cause of complete legalization, preferring, instead, to hide behind the time-wasting excuse that more research and consideration is needed to reach an informed conclusion. It’s an even more delicate position than his stance on pot, and an even more revealing reflection of his party’s anxieties about being defined by cavalier social policies rather than practical economic ones.

Though it’s easy to dismiss his leadership as entirely frivolous, Justin Trudeau possesses great importance in defining the limits of left-wing social policy at a time when many progressives are inclined to regard “going too far” as the exclusive disorder of the right.

Assuming her candidacy is serious, Jodie Emery will be the canary in the mine.



