Impeachment Horror

I recently finished reading Jeffrey Toobin’s A Vast Conspiracy, an epic 448-page chronicle of the Monica Lewinsky scandal, from its earliest beginnings as an obscure sexual harassment lawsuit in Arkansas to the second-ever impeachment of an American president. My interest was sparked by Monica’s recent and very thoughtful essay in Vanity Fair, which brought her decades-old story back into public conversation. The tale’s only become more timely since, now that talk of presidential impeachment (spurious or not) has reentered the headlines.

It seems the minute a president enters his second term, partisan foes begin to chatter about whether he’s impeach-worthy. It’s a sentiment born partially from frustrated resentment (no one likes to lose twice to the same guy), partially from opportunism (the Congressional opposition almost always gains seats during a president’s first term), and partially from the White House itself, for which rallying against an “impeachment-obsessed” opposition can be of great material benefit.

So present rumblings over the possible impeachment of President Obama will probably only get louder in coming months. What lessons can today’s giddy Republicans learn from their predecessors’ failure?

First: have a clear-cut, impeachable offense.

It was never entirely clear why Clinton was being impeached, which allowed accusations it was “all about politics” or “all about sex” to fill the ambiguity.

Republicans furiously believed the Clinton White House was hopelessly corrupt, and Clinton himself embarrassing and immoral, yet they ultimately chose to impeach him for two incredibly narrow legal offenses: lying to a grand jury about his affair with Monica and about his testimony in the Paula Jones harassment suit, and obstructing justice by conspiring with Monica in various ways to ensure her corroborating silence.

Constitutional scholars generally agree that presidents can be impeached for just about anything, with the constitution’s vague criterion of “high crimes and misdemeanors” defined through centuries of English precedent to mean, in the famously glib words of Gerald Ford, “whatever a majority of the House of Representatives considers it to be.” Yet Toobin argues the 1990s heralded an era in which the judicial system “took over the political system” and it became received wisdom that political battles should be fought through lawsuits and litigation rather than traditional constitutional mechanisms. Republicans thus decided to impeach Clinton on the grounds he was a petty criminal, as opposed to simply unfit for office.

Second: have the numbers.

In contrast to the impeachment proceedings against Richard Nixon, which enjoyed some semblance of bipartisan support, every Congressional vote in the long slog to remove Bill Clinton fell almost perfectly along party lines.

This rank partisanship doomed Clinton’s impeachment from the get-go. Since the final vote in the process — the one that actually expels the president from office — requires a two-thirds majority in the Senate, even the GOP’s healthy majority in both chambers was not sufficient. Some Democrats had to get on board, but because Clinton’s impeachment was perceived as a hysterically ideological Republican plot (a “vast conspiracy,” if you will), none ever did. This was a direct byproduct of problem number one; because the formal argument for impeachment was confused and weak, it remained powerfully unpersuasive to the other side.

Third: have public support.

Perhaps the most famous factoid of the Clinton impeachment is that the President’s approval numbers actually went up during it. Such sympathy appears even more justified in retrospect; the “peace and prosperity” of the 90s remains enviable, and Clinton’s competence as an administrator, whatever his faults as a man, contrasts sharply with that of his successors.

Had the Republicans upheld the Founders’ intent, and sought to remove Clinton on the subjective but entirely legitimate grounds that he was too crooked, unethical, and undignified to be president — as embodied not just by the Monica affair, but Whitewater, Travelgate, the Lincoln Bedroom and whatever else — it’s possible their crusade would have seemed a tad more reasonable. But it still would have failed, simply because the American public did not share this conclusion, and Congress knew it.

President Obama is vastly less popular than Clinton, with large percentages believing he’s behaved improperly in a number of high-profile situations. Yet support for impeaching him sits at a dismal 33%, with estimates suggesting backers are around 90% Republican. And of course even in their best-case 2015 scenario, no one thinks the GOP will be holding two-thirds of the Senate any time soon.

The lasting legacy of the Clinton impeachment was the delegitimization of impeachment in general, and to the extent the episode was a gigantic waste of time perhaps that’s fair. Yet at its core, impeachment is simply a constitutional device for removing an unacceptable ruler, so it’s hard to argue the democratic interest is well-served by perpetuating this cultural stigma.

Even if the answer is no, it remains a question worth occasionally asking.

Fresh battle lines being drawn in America’s culture wars

The culture war is dead. Long live the culture war.

It’s fashionable to observe that many of the most contentious social policy cleavages of the 1980s — when America’s “culture war” meme first went mainstream — are now the stuff of broad consensus.

Debates on the appropriate presence of public prayer have concluded in the minimalists’ favor. Universal legalization of same-sex marriage is perhaps a year away. Conservatives made peace with unwed motherhood to double down on abortion — and paid for the strategic blunder: a large majority remains in favor of keeping the procedure legal in “all” or “certain” circumstances.

Yet as the previous decades’ debates wind down, fresh moral quandaries about the standards and values of American life emerge. Finding harmonious answers to these new dilemmas of tolerance, identity, and individualism will be a defining struggle of the millennial generation.

For as long as we’ve been fretting about discrimination, it’s been said that one man’s innocent quip is another man’s slur. As overt bigotry becomes increasingly rare, it’s the innocent quips that now receive the hottest fire, zeal having failed to moderate even as the stakes have fallen. Today’s tolerance activists speak of “microaggressions,” small indignities of language and manners, such as asking a visible minority where he “came from” or suggesting a woman carry the lighter box. Movies and television shows are now meticulously scrutinized using standards like the “Bechdel Test” to ensure females and minorities are portrayed in the most flattering, self-actualized fashion, with an equally fussy eye cast towards nouveau sins like “exoticism” and “othering.” Even perfectly post-modern public figures like Stephen Colbert and RuPaul have proven a mere ill-utterance away from triggering shame campaigns from a self-appointed vanguard of “what’s not okay.”

This goal of a zero-tolerance culture is invariably at odds with unsuppressed freedom of expression and that which it produces: honest commentary, diverse storytelling, insightful humour, complexity of language and thought. The critics seek to restore the legitimacy of censorship, or at least self-censorship, in which an extra layer of nervous second-guessing must be applied to all intellectual and creative output in order to make ideas subordinate to a ruling ideology capable of punishing dissent.

Struggles over acceptable means of restraining intolerance are in many ways the outgrowth of a larger philosophical split over the nature of identity, and the privileges a citizen should claim through self-applied labels and group affiliation.

Transgender Americans have found success lobbying for legal inclusion as a class protected from open discrimination. Yet anxieties remain over the community’s existential thesis that gender itself is inherently fluid and subjective — an assertion for which the science is hardly settled, yet was recently decreed official fact by the school board of Vancouver. Today we have autistics who oppose efforts to cure their condition on the basis they were merely born different, not ill, a resistance to the “medicalization” of identity shared by an increasing assortment of unpopular demographics, including the obese (who dismiss BMI measurements as quackery), serious drug users (who euphemise their “abuse” as simply “misuse”), and schizophrenics (who self-identify as “multiples”).

Homosexuality was of course once considered a mental illness. That’s no longer the case today, yet whatever biological variables do explain the phenomenon remain ambiguous, and many are not eager to see things clarified. A faction that values the cultivation and preservation of diverse identities and stigmatizes efforts to assimilate or “fix” deviant behavior is destined to clash with those seeking definitive scientific explanations for life’s mysteries.

A similar cleavage divides those who fetishize the total supremacy of the individual against those who worry about behavior’s societal consequences. A friend of mine recently wrote an essay fretting about the sadomasochism renaissance prompted by Fifty Shades of Grey; he worried that a culture elevating individual “consent” to its highest good will be one thoughtlessly normalizing behavior that’s socially destructive in a broader sense — in this case, violence against women. Increasingly loud proposals to normalize other historic taboos — recreational drug use, prostitution, violent video games, child pornography — spawn similar concern. The anti-individualists ask at what point the pursuit of “harmless” personal pleasure corrupts the virtues of the larger society these individuals comprise. The individualist-supremacists flatly deny the possibility.

None of these tensions is particularly new, but the battlegrounds are decidedly 21st century. As opinions congeal around fresh struggles to balance choice, evidence, identity, and opinion, old understandings of the political divide are overthrown. These new battlegrounds pose a particular threat to the progressive left, which may be doomed to split into warring post-modern and libertarian factions.

Divisive, ideological, and often personally threatening, the culture wars of the future will not be pleasant. But few wars are.

Lack of Pride

“When is Pride this year?” straight friends ask, voices rising with equal parts excitement and condescension. As a gay, I’m presumed to be well-versed in such things, but alas, there’s no easy answer. There exists no single “Pride,” after all, simply a long sequence of independent extravaganzas across North America, each observed on conveniently different dates.

Los Angeles Pride happened long ago for instance, on the traditional first weekend in June. San Francisco Pride — the big one — has come and gone as well, occurring, as it does, on the last weekend of that same month. Vancouver Pride kicks off this Saturday, and Vegas Pride comes a month after that, in early September. Countless other cities are sprinkled somewhere in-between.

Such cleverly staggered scheduling, which allows the don’t-stop-the-music set to engage in summer-long “Pride tours” across the continent, hopefully helps illustrate the fundamental vacuousness of this would-be holiday. It’s certainly one of many variables justifying my profound disinterest in it. Despite being gay for as long as I can recall, I not only shun Pride, I actively resent the implication that attendance offers any meaningful indication of one’s GLBT acceptance.

I’ve always felt a bit of sympathy for Rob Ford’s various mumbled explanations of why he’s never attended Toronto Pride during his four years as mayor. His no-show status has of course been widely taken as proof of his supposed homophobia, but his official excuse — that he’s simply an old-fashioned guy who finds the garish flaunting of sexuality uncomfortable — seems perfectly reasonable. To modern elite opinion-makers, however, who have done so much to inflate Pride as the culture’s leading litmus test of tolerance, personal uneasiness is a sentiment so exotic it may as well be uttered in Swahili.

While I’m no prude — actually, strike that, I am a prude. And what of it? Flipping through online albums of Toronto Pride 2013, one finds ample documentation of S&M bondage couples, barely-there thongs, buttless chaps, and all manner of grinding, thrusting, jiggling, and twerking. It’s perfectly acceptable to find such things gross or distasteful, and an exploitive cheapening of both sex and the body.

It is no great character flaw to value modesty or dignity, nor is it bigoted to esteem forbearance and control. Libertine attitudes towards sex, nudity, fetishism, and exhibitionism are issues entirely disconnected from the civil rights matter of whether people of divergent sexual orientations are deserving of the same rights and protections as those in the majority. To argue the contrary is to claim possessing a minority sexual preference should be synonymous with sexual deviancy in general — a premise not only dated, but dangerous.

There was a clever Onion piece published more than a decade ago (I doubt such a thing would be written in this more sensitive age) headlined “Local Pride Parade Sets Mainstream Acceptance Of Gays Back 50 Years.”

“I thought the stereotype of homosexuals as hedonistic, sex-crazed deviants was just a destructive myth,” the paper quotes one horrified onlooker. “Boy, oh, boy, was I wrong.”

Sounds about right. Indeed, one has to wonder just how much comfort Pride is even attempting to offer the genuinely sexually conflicted at this point. Considering how much “coming out” anxiety tends to center around fears of lost normalcy, it’s not clear at all how declaring common cause with society’s most brazen display of freakshow non-conformity is a useful means to that end.

Looking at photos of North America’s earliest Pride parades is a window into a different world. The marchers of those days, calmly holding hands with their same-sex partners in sensible polo shirts and penny loafers, were certainly subversive, but only to the extent they were seeking to remind a society in denial of the unavoidability of their existence, and the bland, non-threatening nature of it. Theirs was a call for inclusion in the most literal sense: the welcoming of homosexuals into society’s most central institutions (family, work, religion, politics) and the acceptance of their love as no less valid than any other sort.

That goal having now largely been achieved, the Pride movement, like so much of the modern Gay Rights activist complex, has become a victim of its own success. As North Americans get used to people being here and queer, the moderate LGBT middle class has drifted away from leadership of the tolerance movement, allowing the wild fringe to fill the void. What results is a historical irony: just as society is most eager to assert its tolerance, Pride redefines the deal. Endorsing the acceptance of ordinary people distinguishable only by what gender they love now demands an additional stamp of approval for all-purpose indecency and licentiousness.

Politicians, corporations, and all manner of interest groups clamor to agree to the terms. But for an increasing lot of gays, it’s hardly obvious why we should care.

Indefensible Hamas

There are plenty of perfectly good criticisms to be leveled against the State of Israel. Personally, I’m quite troubled by the so-called “demographic time bomb” theory, which posits that Israel’s increasing Arab and Palestinian birthrates ultimately doom the Jewish nation to embrace some ugly form of minority-rule. And of course we’re all well-versed in the gross spectacle of settler expansion into the West Bank, a brazen effort at colonial growth at exactly the moment the Palestinian territories are supposed to be inching towards independence.

Yet the mere existence of Israeli sin should not blind anyone to the greater evils of its enemies.

This is the sort of blunt moral judgment that’s been traditionally uncouth among fashionable western progressives, who often feel the need to affect great open-minded exasperation at the Israeli-Palestinian conflict, bemoaning that “fault exists on both sides.” Such is the default position of those ideologically inclined to regard assertive side-taking as a symptom of an unsophisticated mind, with “blind” support of Israel in particular a worrying proxy for some other form of close-minded ignorance — Millennialist Christianity, perhaps.

Yet in the wake of the current war between the Israeli government and the Islamic Resistance Movement — better known as Hamas — that’s running the Gaza Strip, even the traditional progressive skepticism seems to be breaking down. As Israel’s Palestinian resisters become more nihilistic and radical at precisely the time the Israelis are getting more sensitive and cautious, the lopsided moral imbalance is becoming harder to ignore.

The traditional Israel-bashers are certainly looking more pathetic than usual. The buffoonish United Nations Human Rights Council drew up a monstrously biased report on the Gaza war the other day, which predictably sailed to approval on the votes of the various third world dictatorships who comprise the body’s largest bloc. Yet it was telling no nation resembling a first world democracy could be persuaded to support it. Of the 17 abstentions, almost all noted with concern that the Council’s chronology of the conflict was a bit one-sided, to put it lightly. The brusque four-page report does not include the word “Hamas” once, and instead speaks only of Israeli aggressors inflicting “widespread, systematic and gross violations of international human rights and fundamental freedoms” against the hapless peoples of “Occupied Palestine.”

Nowhere was it mentioned that the Gaza Strip actually ceased to be occupied back in 2005, as the late Ariel Sharon painfully extracted every remaining Jewish settler and soldier from the territory.

Nowhere was it mentioned that Hamas explicitly pledges to “obliterate” the state of Israel in their founding charter — “by Jihad,” in fact.

Nowhere was it mentioned that Hamas leaders have long spoken of “Jews” in the most generic terms as their enemy, and that their preferred military tactic in the current conflict — lobbing over 2,500 missiles into major population centres — has made urban Israelis the war’s true civilian targets.

Nowhere was it mentioned that Hamas has transported weapons in ambulances, housed missiles in schools, mosques, and hospitals, and disguised their fighters in Israeli uniforms — all clear violations of the codified laws of war.

Nowhere was it mentioned that the Israelis have so far discovered over 30 multi-million dollar “terror tunnels” spiraling out of Gaza (built in part with alleged child labor) that serve no purpose other than to turn western Palestine into a launchpad for guerrilla aggression against its neighbor.

Nowhere was it mentioned that just a few days prior, Hamas refused a comprehensive ceasefire backed by basically everyone who matters: the Egyptian government, the Arab League, the United Nations, the EU — even old man Obama, if anyone still cares about him.

Nor, for that matter, did the report mention the exceedingly cautious conduct of the Israeli forces in what they’re calling “Operation Protective Edge,” a reputation-conscious nervousness so thoroughly unprecedented in modern warfare it’s almost certainly harmed national security.

While Israeli civilians have been largely protected from Hamas rockets by the country’s awesome Iron Dome missile defense system, Palestinian civilians are protected by an Israeli shield of their own: an elaborate system of advance warnings to residents of Gazan neighborhoods targeted for bombing. The system includes everything from text messages and personalized phone calls to noisemaking “dummy bombs” (so-called “roof knocking”) and even airdropped maps steering civilians to refugee centres. Such has been the IDF’s painstaking effort to minimize casualties while attacking one of the most densely-packed places on earth, yet Hamas has ensured the Palestinian death toll has remained high anyway, glibly encouraging Gazans to dismiss Israeli warnings as “psychological warfare.”

Prime Minister Netanyahu took some flak for noticing that last bit, concluding on American television that Hamas seems to enjoy the existence of “telegenically dead Palestinians.” Yet it’s an indictment that’s difficult to avoid given how effective the conflict’s 570 Gazan victims have proven in forming a narrative of “disproportionate death” — the only argument Hamas can peddle for foreign sympathy. In any case, surely a group cynical enough to engage in talks with North Korea to replenish their depleted missile supply would hardly balk at the indignity of ratcheting up its own body count for propaganda purposes.

A dispassionate analysis of facts like these — facts that are the result not of clever cherry-picking on my end but of simple observation of the broad character of the Gaza conflict to date — cannot help but lead to a simple conclusion: Israel is better than Hamas.

To conclude this isn’t to posit that Israel, and the current Israeli government in particular, is without failing in other contexts, nor even to make a value judgment about the broader merits of Zionism, if you’re still a skeptic. It’s simply to note that what we have right now is a secular, liberal democracy fighting the aggressions of a lunatic death cult that seized power in a military coup and is actively loathed by the long-suffering captives it purports to rule — with the conduct of each side to match.

Whether that’s an accurate summary of the Palestinian-Israel conflict in general, it’s certainly true of this one.

It demands an appropriate reception.

Power Suit

What makes the American model of government superior to most others is its elaborate web of checks and balances. Like a Möbius strip, the chart of American government depicts three branches each extending an arrow of oversight towards the other two, creating a tightly interlocking network of watchmen being watched. No matter what one branch does, the others always have avenues of recourse.

On paper, at least. In practice, alas, not all checks are equally balanced.

While no one disputes the blunt effectiveness of a president vetoing a bill of Congress, the Senate refusing to confirm a judge, or a judge rejecting an unconstitutional decision of the White House or legislature, Congress’ ability to rein in the executive has always proved the most daunting challenge.

A presidential veto can be overridden by Congress, but that requires the two-thirds approval of both chambers, something only possible in the case of legislation boasting enormous, bipartisan popularity, such as the 2008 Medicare funding bill unsuccessfully vetoed by George W. Bush, or President Clinton’s attempt a decade earlier to cancel popular military spending initiatives in a variety of districts held by politicians of both parties. In all, there have been fewer than 10 overrides in the past 20 years.

Then there’s impeachment, which though actually easier than overriding a veto — requiring, as it does, merely a two-thirds majority in the Senate and a simple majority in the House — has become perhaps the single most stigmatized provision of the US constitution. America’s long tradition of presidential stability has made even contemplating the removal of a president mid-term a taboo of enormous proportions, a fact only further complicated by the legacy of the Clinton years, which established something of a legal-cultural consensus that presidents only deserve to be unseated for serious criminal misdeeds, as opposed to merely moral or political ones.

To be sure, Congress can handicap a president. They can defund his pet projects, as Republicans are always threatening to do with Obamacare, or simply ignore his requests for action, as has been the case with… well, you name it. But as modern presidents have embraced an increasingly maximalist understanding of their constitutional powers, the rising challenge for Congress has been the question of how to restrain a president whose most objectionable decisions are made unilaterally.

Barack Obama has often interpreted his mandate in unusual ways. A common refrain, echoed most recently during his Rose Garden vow to “fix as much of our immigration system as I can on my own without Congress,” is that the need to make policy supersedes the need to respect constitutional procedures for making it.

In the case of immigration, the President is tilling familiar ground. In 2012 he unilaterally declared a two-year amnesty (since extended to four) for the approximately 800,000 illegal immigrants who arrived in America as children. It was a move explicitly intended to compensate for Congress’ failure to pass the so-called Dream Act a year earlier, which promised similar legal relief for America’s inadvertent aliens. Where legislation failed, rule-by-fiat would succeed.

Selective enforcement of the law has likewise been the preferred Obama approach to drug policy. In 2009, Attorney General Holder declared the United States would not enforce federal drug legislation in states that had legalized marijuana for medicinal purposes, and in 2013 he expanded that blind spot to include states that legalized it for recreational use, too. The Justice Department has announced similar plans to stop prosecuting drug offenders when they deem the mandatory punishments excessively harsh. The underlying logic, apparently, is that laws should only be upheld to the extent they serve the President’s ideological ends.

Then there’s Obamacare, whose finer points were all implemented through executive action, most notably the imposition of the everyone-has-to-have-insurance-now deadline (Congress’ law said six months ago; the President says 2016), but also this whole business of forcing employers to cover morning-after birth control that the Supreme Court recently designated an unjust burden on corporate religious freedom.

In response to the administration’s handling of the Obamacare rollout in particular, Speaker John Boehner has announced he plans to sue the White House for unconstitutional behavior, namely a dereliction of the duties mandated by Article II, Section 3: “[The President] shall take Care that the Laws be faithfully executed…” Though what specific redresses the suit will seek have yet to be disclosed, an ideal ruling would presumably compel the administration to begin imposing the Obamacare insurance mandate right away — you know, like the law was supposed to.

Is this wise? The legal establishment seems skeptical. Asking the judicial branch to resolve a conflict between the executive and legislative branches has little precedent in American history, elevating, as it does, the courts to the status of supreme referee of intergovernmental jurisdictional disputes — itself a proposition of dubious constitutionality. On the other hand, the more constitutionally orthodox prescription for a Congressional problem with a president — impeachment — seems not only absurdly radical, but politically suicidal. But still, you gotta do something.

President Obama’s Republican predecessor, of course, faced constant abuse of power criticisms of his own, though it’s worth noting that much of the Bush-bashing involved disputes over what is and isn’t within the president’s prerogative as “commander-in-chief,” one of the constitution’s most disputed phrases. In the end, Congressional Democrats elected to do little more than obstruct, complain, and run out the clock — a technique Republicans may ultimately have no choice but to emulate.

Term limits have always been controversial, but they remain the only long-term defense against an executive restrained by little else.

The limits of liberalism

Over the last couple of decades, a dominant narrative of North American politics has been the dangers of drifting too far to the right. From Tim Hudak’s doomed bid for the premiership of Ontario to the surprise defeat of the Wildrose party in Alberta to self-destructive Tea Party campaigns across the United States, the explanation for why so many conservatives can’t get it together appears obvious to most. To paraphrase Margaret Thatcher, right-of-center candidates are placing too much emphasis on the adjective and not enough on the preposition.

Far less contemplated these days is whether there is any negative cost to be incurred from drifting too far to the left, particularly now that progressives increasingly define themselves through boastful acceptance of previously-stigmatized personal behaviour.

The aspiring candidacy of Jodie Emery, Vancouver’s so-called “Princess of Pot” and spouse of recently-released Canadian drug lord Marc Emery, may prove a revealing case study.

Mrs. Emery is currently seeking the Liberal nomination in the parliamentary riding of Vancouver East, and she’s hardly hidden the fact that her primary purpose in running is to advocate for the legalization of marijuana, the Emery family’s pet cause. Legalization of marijuana is a position favored by Liberal boss Justin Trudeau, but to suggest the two are on the “same side” of the issue is to betray its moral — and electoral — complexities.

Justin’s stance is essentially a utilitarian one: he sees legalization as a way to battle organized crime and liberate an overburdened criminal justice system. Yet he’s also described consumption of the drug as a “vice” with scant social virtue. His much-ballyhooed admission of prior use was heavily coached, and if not exactly remorseful, was certainly qualified and self-conscious. His legalization plan, though vague, has emphasized the importance of keeping cannabis far from children, and he’s bemoaned that in Harper’s Canada, it’s “easier for youth to access pot than alcohol or cigarettes.”

The Emerys have a slightly different perspective, to put it mildly. As editors of Cannabis Culture magazine, founders of “Pot TV,” proprietors of head shops and seed stores, and MCs of all manner of pot conventions and trade shows, the power couple lead a subculture that views marijuana not as an unavoidable social sin whose ills must be minimized and controlled with compassionate legislation, but as an undeniably positive product with virtues worth celebrating.

“Marijuana is so good! It does so much for so many!” Marc crows in a 2010 YouTube video (ironically devoted to blasting Justin Trudeau as a “f—cking hypocrite” for backing mandatory jail times for drug traffickers). “It’s brought us everything from music to technology to cutting-edge news services, major athletes, every form of entertainment and science and architecture…”

This stance, that pot consumption is harmless, and should be completely destigmatized — if not encouraged — is probably a great deal closer to the views of the Liberal base than their leader’s cautious policy of managing risk without endorsement. Yet Justin’s pragmatism is the result of having enough sense to appreciate that elections are not decided by base alone, but the electorate’s broad centre — those much-coveted middle class, suburban swing voters who remain unfashionably inclined to regard mind-altering substances as a destructive force poisoning the culture of their children and neighborhoods. Considering that Justin’s pot stance is already taking a tremendous drubbing in suburban-oriented Conservative attack ads, it’s unclear if there’s any electoral gain to be had by embracing a darling of the stoner set who lacks even a pretence of pragmatism. In a centralized parliamentary system like ours, it only takes a single rogue candidate to upset a leader’s carefully constructed nuance, and Justin — who’s already shown himself more than capable of torpedoing troublesome candidates — is surely asking himself if Jodie’s a risk worth taking.

A similar dilemma defines the Liberal relationship with legalized prostitution.

Mrs. Emery happily endorsed the idea on Sun News yesterday, declaring it a private, consensual business transaction not terribly different (of course) from the private, consensual business of buying pot. This, too, is the sort of proudly permissive position held by much of the Liberals’ ideological base, who prefer to conceptualize the buying and selling of sex as a libertarian thought experiment or abstract goal of sexual liberation. Canadians in the cautious middle, alas, inclined as they are to contemplate problems from a less academic angle, are more likely to fret about whether legalization of prostitution will simply increase its presence in their communities, as legalization of banned things is wont to do.

Though critical of the Harper Government’s John-battling prostitution bill, Trudeau’s party has not embraced the cause of complete legalization, preferring, instead, to hide behind the time-wasting excuse that more research and consideration is needed to reach an informed conclusion. It’s an even more delicate position than his stance on pot, and an even more revealing reflection of his party’s anxieties about being defined by cavalier social policies rather than practical economic ones.

Though it’s easy to dismiss his leadership as entirely frivolous, Justin Trudeau possesses great importance in defining the limits of left-wing social policy at a time when many progressives are inclined to regard “going too far” as the exclusive disorder of the right.

Assuming her candidacy is serious, Jodie Emery will be the canary in the mine.


Differences become starker in North American democracy

Since the differences between Canada and the United States are almost all political, we can learn much from the two countries’ recent deviations in the practice of democracy. Just as Canada’s rulers seem to be consolidating their privileges in an increasingly authoritarian parliamentary system, Americans have witnessed a number of inspiring episodes as of late highlighting the comparatively open nature of their republican institutions.

On June 10, Eric Cantor, the Republican House Majority Leader, was defeated in a primary election to continue representing his party in Virginia’s 7th district. It was the first time in American history a sitting House Majority Leader had lost a primary, and the greatest victory to date of Tea Party insurgents, who had never before unseated a politician of such standing.

Regardless of what one thinks of Cantor, or the right-wing arguments against his credibility as a conservative, the idea that a national party leader could be so easily overthrown simply through populist dissent in his own community says good things about the health of America’s representative democracy. Cantor, it was often said, harbored ambitions of being Speaker of the House someday, yet in the end it was his lack of respect for his present duties as a representative — namely, to represent his community — that ultimately torpedoed his career. He ran an aloof, condescending campaign (most glaringly personified by the fact that he wasn’t even in his state for most of election day) and took it for granted that his status as a national figure insulated him from domestic accountability. And he paid the price.

The opposite was true in Mississippi last week — though only barely. There, six-term Republican senator Thad Cochran kept his party’s loyalty by the narrowest of margins, winning the state GOP renomination by less than half a percent in the June 24 primary. Though his opponent, Tea Party-backed State Senator Chris McDaniel, has proven something of a sore loser, it’s clear Cochran won simply by playing the game better. In a state that’s nearly 40% African-American, Cochran appealed to the liberal sensibilities of black voters by playing up McDaniel’s harsher flavor of conservatism, and unapologetically embracing that which made him so loathed by Tea Party-types in the first place: his talents at ensuring Mississippians always amply benefitted from Washington contracts and subsidies. Or, as both backer and opponent alike were fond of putting it, his ability to “bring home the goodies.”

Neither of these stories would be possible in Canada. In this country, after all, party nominations are not accountable to voters at large, because Canadian political parties are not seen as public utilities within the nation’s democratic system, but privately-owned entities that operate independently within it — and tolerate only the barest minimum of public participation in their internal affairs.

In the United States, one becomes a party member — and thus an eligible primary voter — simply by declaring himself to be one. In Canada, the privilege must be purchased and continually renewed, and can be withdrawn by party elders at any point for misbehaviour. Despite the fact that most Americans are not generally interested in primary elections, over 65,000 Virginians voted in the Cantor race and 300,000 Mississippians in the Cochran one. By contrast, only 100,000 Canadians voted to make Justin Trudeau leader of a national political party (a participation rate of around 0.4% in a country with 24 million eligible voters). And that was an unprecedented high. Only 1% of Canadians are said to be registered members of political parties, but it’s impossible to know for sure, since the parties tend to be fairly cagey with their membership figures. That same caginess ensures we have no idea how many people are voting in MP nomination races, as Canadian political parties are not required by law to disclose such data to the media or anyone else.

Canadian party elites would no doubt find the fact that Senator Cochran was re-nominated, in part with the support of Democrats (as many of his black voters certainly were) thoroughly ghastly, but in “open primary” states like Mississippi, in which voters choose for themselves which primaries they want to vote in — regardless of their party registration — the principle is that politicians are accountable to the voting public as a whole, rather than one narrow faction of it. Democrats have a right to ensure Republicans don’t get too conservative, and Republicans have an equal right to ensure Democrats don’t get too liberal. If done correctly, the result can be a less polarized, centrist party system in which, even in periods of one-party dominance, the opposition can still exert some influence on outcomes.

That both the Cochran and Cantor results were shock upsets similarly highlights the degree of unpredictability in the American system, the very thing Justin Trudeau is currently waging his merry little jihad against as yesterday’s promises of “open nominations” decay into today’s practice of installing preferred candidates light bulb-like across the country. Doubtless much of the GOP national establishment did not want to see Congressman Cantor go down, yet because the American parties have no authoritarian bosses, there was no one available to pull a Trudeau and insulate him with the leader’s stamp of approval, as J-Tru did in anointing Adam Vaughan and Chrystia Freeland, his nominees of choice in successive Toronto by-elections.

Speaking of authoritarian bosses, last week was also notable for the US Supreme Court’s 9-0 smackdown of President Obama’s attempt to run ’round Congress and unilaterally appoint judges and federal board members without first seeking the Senate’s consent, as required by the constitution. Obama’s defense was that Article II gives him the right to install whomever he wants so long as the Senate’s in “recess,” and thus unable to convene to give his picks scrutiny, but as Justice Kagan said during arguments, in the age of jet travel “the Senate is always available.” Obtaining Congressional approval might be a slow and painful process, but so long as Congress claims it’s available to sit and consider presidential nominees — even if that availability consists of minute-long perfunctory sessions during breaks that exist only to signal their own availability — a president who claims there’s just no way to get an up-or-down vote for some guy he wants to stick somewhere is either ignorant or dishonest.

It was a fascinating clash of all three branches in which the Court upheld the legislature’s right to scrutinize executive branch appointments as one of the fundamental principles of American democracy — a principle once again entirely unknown in Canada, where prime ministers just happily install whomever they please. In the US, unqualified Supreme Court judges get vetoed by Congressional consideration hearings. In Canada, they get expelled by the Court itself — but only after being inaugurated and having collected several months’ pay, as was the case with Justice Nadon. In America, cabinet ministers are expected to be eminently qualified for their positions — because the Senate reads their resumes line-by-line. In Canada, you simply wake up one morning and find Peter MacKay is attorney general for some reason.

Whatever polite pretences Canadians are given for the absence of truly open nomination races and greater scrutiny of prime ministerial appointments — stability, protection from extremism, anti-American contrarianism — the Occam’s razor explanation is clear enough: the elites at the top of the Canadian political pyramid simply don’t want their absolute powers diluted by a lot of fussy checks and balances. Ours is a system in which bottom-up input on important decisions — either from the people’s representatives or the people directly — is a force to be feared and distrusted.

Over the last few weeks, Americans have been reminded that despite its many flaws, their constitutional system is still one that provides considerable safeguards to ensure the little guy can triumph over the big. In Canada, the big guys simply trample — and we’re supposed to be grateful for the privilege.


Opposed to What?


The rise of the fundamentalist Sunni terror group ISIS in Iraq over the last couple of weeks has provoked critics of the Iraq war to new heights of smugness. ISIS, of course, is the Taliban-like entity that split from Al-Qaeda in 2013, largely as the result of petty politicking between its leader, self-styled “caliph” and would-be global overlord Abu Bakr al-Baghdadi and Osama Bin Laden’s owlish successor, Ayman al-Zawahiri. According to the latest charts, ISIS now controls over a dozen Iraqi cities, including Mosul, the country’s second-largest, and we keep being warned their march on the capital is imminent.

Whether or not ISIS is actually a viable “government in waiting” remains far from clear — they have little experience holding territory, let alone running it — and sensationalistic media analogies framing the group as the North Vietnamese army descending on Saigon display a lazy misunderstanding of both conflicts. Yet either way, the sheer horror of these would-be caliphates — the mass-murders of captured soldiers, the nightmarishly medieval social codes, the unapologetically imperial ambitions to rule the entire Middle East within five years — has made them a powerful symbol for everything wrong with America’s 2003 intervention in the first place.

Such is the supposed vindication of the anti-war left, whose members have been endlessly applauding their own retrospective rightness as of late. Told you so — the war was a disaster. You should have listened to us! 

But really, should we have? Bad decisions have to be viewed in the contexts of the debates in which they were reached, and even in these dark days, it remains an open question whether anyone on the left was actually offering a viable alternative to war in 2003. Or, to put it more specifically, whether anyone on the left was offering a morally coherent, non-war strategy for dealing with the rogue regime of Saddam Hussein that America would have been wiser to pursue.

As someone who politically came of age during the 2003 lead-up to war, I remember well the left’s discomfort in dealing with the Saddam question — a particularly awkward position to be in, given the dangers and brutality of the Saddam regime were very much the central focus of the entire war conversation.

Anti-war liberals desperately wanted to maintain their credibility as the self-proclaimed defenders of human rights and democracy, and understandably so. Yet this meant the most logically coherent anti-war position (and certainly the position that would seem most justifiable in the current context) — that Saddam Hussein, bad as he was, was a force for Iraqi stability and secularism, and thus better than the alternative — was not merely avoided, but actively denounced. Indeed, if anything, a common liberal refrain at the time was to claim it was actually those right-wingers in the White House whose anti-Saddam credentials were most dubious.

Did not some of those former Reaganites have former careers as Saddam boosters during his war with Iran in the 1980s? Did we not have a photograph of Rumsfeld himself shaking the dictator’s hand? Was it not the Republicans who turned a blind eye when Saddam gassed the Kurds in 1988? Didn’t a Republican commerce secretary allow Saddam to import deadly dual-use chemicals for his WMD arsenal? Did not the president’s father make a conscious choice to allow Saddam to remain in power at the end of the first Gulf War — then brag about it in his memoirs?

Beginning every anti-war argument with a sort of perfunctory throat-clearing about Saddam’s obvious wickedness became a pronounced tic of the anti-war set in those days, yet their accompanying lack of strategy for confronting the evil they had just acknowledged was a large part of the reason they ultimately lost the war argument.

War proponents cried incoherence and hypocrisy, and they were not wrong. During the 1990s, after all, many of the same people who would oppose the 2003 invasion were also steadfast opponents of the UN’s post-Gulf War Iraqi sanctions, which they blamed (and not unjustifiably) for tremendous death and suffering. Yet this made the left-wing position on the war the very definition of an unwinnable paradox: Saddam should neither be removed forcibly, nor sanctioned, nor supported, nor ignored. Perhaps some felt he should have been overthrown internally (though I remember a lot of snide words about how the Bush administration was plotting to swap “one dictator for another” back in the days when the dissident Iraqi politician Ahmed Chalabi seemed to be a White House darling). Perhaps some felt Saddam had the capacity to democratize himself, a la Emperor Hirohito after World War II. But even counter-proposals as strained as these were not made. The cowardly have-your-cake-and-eat-it-too stance that there should be both no war and no Saddam was exactly the sort of moral bankruptcy that drove many principled left-of-center intellectuals like Christopher Hitchens and Michael Ignatieff to abandon the anti-war cause, and sleazy demagogues like George Galloway and Michael Moore to assume larger roles within it.

No, it was only the fringe anti-war right — your Pat Buchanans, your Lew Rockwells, your Robert Novaks — in all their isolationist xenophobia, human rights indifference and America uber alles supremacism, that offered a genuine alternative to the war to unseat Saddam, though it was an alternative so ugly and mean-spirited few bothered to take it seriously.

Hussein was a brutal dictator, they conceded, but Iraq was also a preposterous artificial country filled with hateful, warring savages that needed a strongman’s iron grip to keep everyone from lunging at each others’ throats. Islam was a cruel and violent religion thoroughly incompatible with democracy — give Iraqi Muslims the vote and they’ll simply elect fundamentalists to oppress themselves further — witness Hamas in Palestine. American foreign policy should never be about righting all the world’s wrongs, merely upholding whatever state of affairs keeps the United States rich and safe. Saddam Hussein’s murderous energies were reserved for his own people (or at worst his neighbors) and he was perfectly content to sell Americans oil. In fact, as noted, Saddam actually had some history as a man with whom the United States could do business, and he was admirably hostile to the region’s most ferociously anti-American regime — Iran.

Vindicated by recent events or not, it was a fringe opinion for a reason. Such coldly self-serving logic is not consistent with mainstream American morality, particularly the uniquely American notion that theirs is not a global hegemon like the nasty empires of yore, but a kind and empathetic republic whose foreign policy embodies the same neighbourly virtues of trust, sympathy and generosity its people practice in their day-to-day lives, and honours the same democratic principles abroad that are protected by its progressive constitution at home. As American anxieties over everything from the Rwandan genocide of 1994 to the Assad massacre of today have proven, the ethical lens through which we view the justness of American foreign policy continues to be of the either-or variety — the United States can either actively alleviate the suffering of others or be somehow complicit in it.

As the country burns, the most profound intellectual legacy of the Iraq war will be the degree to which a quintessentially American conception of geopolitical power as a force to be dictated by emotions of idealism, responsibility, and guilt — held since at least the Second World War — begins to break down, and Americans, especially Americans of the left, are able to accept that the higher goal of “peace” often requires an explicit, callous indifference to the loss of foreign life and foreign freedom in the name of stability.

Running offensively counter, as they do, to deeply-entrenched American values on all sides of the political spectrum, such arguments may not stick.

They certainly didn’t in 2003.


Constructed Canadians

I get a little exasperated sometimes with the customary avalanche of carefully-constructed Canadian Pride™ that gets unloaded upon this nation every July first. This year’s lead offering was a supposed “most Canadian music video ever” released by Commander Chris Hadfield, of International Space Station fame, and his lesser-loved brother David, who as far as I know, possesses no fame at all.

There are a lot of cloying, irritating things about In Canada, as the Hadfields’ song authoritatively calls itself. Chris Hadfield, for starters. How about taking a powder, Commander? The man was in outer space for a few months — he didn’t discover a new planet. Yet it seems every week he still manages to claw himself back into the headlines to bask in yet another round of adulation whipped up by a compliant press, who are now basically functioning as his full-time publicity agents. It’s probably only a matter of time before someone appoints him to the Senate. Possibly himself. A modest hero he’s decidedly not.

In any case, the Hadfields’ viral hit suffers from all the same deep-seated structural flaws that invariably turn any attempt at summarizing the Canadian experience into an exercise in disingenuous cornballism.

The base problem is that any Canadian seeking to encapsulate our collective essence always starts from the questionable assumption that nothing that “defines” our nation is also allowed to be present in America. Canadians are taught that national identity is a zero-sum game; nationalistic quirks must be owned, never shared. So we’re never told that the Canadian experience includes eating hot dogs or watching football or shopping at Wal-Mart or other mainstream rhythms of Canadian life because those are things the dreaded Americans do too, and we’re trying to be distinct here. The fact that football and hot dogs and Wal-Mart are precisely the kinds of things that make us distinct in the eyes of 96% of the planet matters little — the implicit audience for any well-curated manifesto of Canadian pride is always Americans, who are assumed to be interested, or other Canadians, who are assumed to be insufficiently aware of their own uniqueness and thus in need of endless lecturing on the matter.

The end result is that Canadian attempts to generalize their un-American distinctness invariably fall into three tendentious trends of stale rhetoric, all of which are on ample display in the Hadfields’ video.

Inspiringly patriotic anecdotes must either be exceedingly pointless and superficial (Chris sings about how “we love Nanaimo bars” and hoard “Canadian Tire money in at least one kitchen drawer”), exceedingly parochial (perhaps in Ontario they “wear Sorels in winter, while plugging in the car” but that’s certainly not the case in comparatively mild Vancouver), or insufferably dishonest and braggy (anyone who claims “you don’t butt in in Canada” has clearly not boarded a bus in this country).

It is not obvious at all what makes eating certain foods or shopping at certain stores a more fundamental part of the “Canadian” experience than buying or eating things that originate in America (or some other foreign place). If we’re looking to define the Canadian essence by Canadian consumerism, surely the standard should be the consumption of products that are actually popular, in a day-to-day sense (Hamburgers! Spaghetti! Ice cream!), as opposed to ultra-particular novelties like Nanaimo bars that we may come into contact with, what — once a year? Indeed, the reason we all have drawers of Canadian Tire money is because none of us shop there often enough to use it.

Nor can it be taken for granted in a country as enormous as Canada that any quirk of geography or weather — no matter how postcard perfect it may look on YouTube — can be generalized into a familiar experience without alienating large chunks of our widely-dispersed population. Not all Canadians experience winter in the snowy fashion so common back east, and the Rocky Mountains are known only in theory to the plains people of the prairies. To assert otherwise is to establish a hierarchy of experiences, in which colorful activities only a small sliver of the population will ever get to enjoy, like “paddling your canoe,” are given precedence over Canadian activities that are actually ubiquitous and unifying. Say, jogging.

Then there’s the cloying righteousness that comes with positing universal, aspirational virtues like politeness and respect as something Canadians unquestionably are, rather than imperfectly strive to be. I would have thought the antics and ongoing popularity of Rob Ford would have put a bit of a damper on those living in arrogant denial of the existence of rude, mean-spirited Canadians, but I suppose the wonderful thing about vanity is that it’s rarely weakened by fact.

What actually makes Canada an admirable country is our constitutionally-protected values — democratic self-rule, individual liberty, minority rights — that have kept our people safe and free, and our collective commitment to those civic virtues — labor, business, family, education, community, faith — that have made us wealthy and happy. These principles are not eccentric or splashy, and Americans have them too. But in the grand global scheme of things they remain tremendously rare, and that’s probably the more useful standard.

What makes Canada interesting, likewise, is the fact that we’re a British-founded, primarily European-settled, mainly English-speaking country located on the fertile, empty continent of North America. This, again, is a status we share with the United States, but it’s still a status 96% of the planet can’t claim, and thus still forms the essence of “what it means to be Canadian” in the most honest sense. “Canadian culture,” in turn — in the unglamorous sense of how we live, eat, work, talk, love, worship, and play — is far more tied to our common history and heritage with America than our few trivial deviations from it.

To posit otherwise in the name of patriotism, to fetishize random consumer goods or uncommon outdoor adventures or non-existent personality traits into the core of the Canadian identity, is to create a national narrative in which insecurity, arrogance, and dishonesty are in ample abundance, but genuine pride for who we actually are — not so much.


Salvation through symbolism: the Canadian elite’s new ideology of aboriginal policy

On Wednesday, a group of people claiming to be something called the “Vancouver City Council” passed a motion conceding that their quote-unquote “city” is actually located on “the unceded traditional territory of the Musqueam, Squamish and Tsleil-Waututh First Nations.” The self-styled “mayor” declared it a historic moment.

It has long been a marked affectation of the British Columbia elite — be they political, legal, academic, or cultural — to ostentatiously emphasize their awareness of grievances committed against First Nations in centuries past by systematically undermining the legitimacy of the “settler” society that’s been established since. The most pronounced tic is the habit of beginning virtually every public address — a politician’s stump speech, a university commencement, the call to order of the human resources subcommittee of the library board, etc. — with a sombre, yet smug reminder that whatever glories are about to unfold are occurring on the “traditional, unceded territories” of this-or-that native band. Often the ritualism goes even further: the Liberal Party’s 2013 Vancouver leadership debate, for instance, was kicked off with the party president handing a bag of tobacco to a local aboriginal leader as an anachronistic offer of concession.

Many political gatherings have similarly eschewed the settler tradition of opening proceedings with a vaguely Christian prayer in favor of explicitly aboriginal hosannahs, usually with lots of drumming and chanting. In 2011, Mayor Robertson chose to take a pass when it came to swearing his oath of office on a bible, but had little problem being blessed by holy men of the tribes whose land he sought to unjustly rule. Indeed, amid all the “whither the housing market” talk of their Wednesday declaration, it’s worth noting that city council ostensibly declared their own illegitimacy not to address any outstanding property dispute, but simply to get the ball rolling on organizing even more culturally-acceptable welcoming ceremonies than the ones they already hold.

Female B.C. politicians — afforded, as they are, more opportunity to make political statements with fashion — now routinely attend public events wearing scarves, blouses, jackets, dresses, and jewelry decorated with the distinctive art of West Coast aboriginals. The old Lieutenant Governor of the province, Iona Campagnolo, claimed one of the proudest achievements of her otherwise meaningless term was commissioning herself a beautiful new official uniform embroidered with intricate silver cross-stitching of whales and eagles depicted in traditional Coast Salish style. When the mayor of Kitimat was flash-mobbed by a gaggle of aboriginal activists protesting the proposed Northern Gateway Pipeline a few months ago, one of the great optical ironies was that the mayor herself was decked out in a floor-length coat emblazoned with aboriginal art at the time.

Then there’s the war on settler names, a cause that’s been quite enthusiastically championed by British Columbia’s current Liberal government. Ex-premier Gordon Campbell in particular went on quite the kick during his final years in office as he sought to restore all manner of “traditional” aboriginal names to prominent pieces of provincial geography. In 2010 the former Queen Charlotte Islands became “Haida Gwaii” (though not overnight — Premier Campbell grinned himself through a whole ceremony for that, too), and the Strait of Georgia was rebranded the “Salish Sea.” He wanted Stanley Park to go back to being called “Xwayxway,” too, until the feds who own the park (or I suppose we should say, “claim to own”) put a kibosh on that one.

All this sort of stuff is obviously motivated by a certain kind of white liberal guilt; the idea that the imperialistic horrors of one’s racial ancestors can be atoned through a kind of showy self-loathing, combined with a “better-late-than-never” embrace of the culture of the conquered. This, to use the trendy term, is a way of bringing “social justice” to the aggrieved party, which in today’s touchy-feely times is considered every bit as important as the real thing.

The awkward fact, alas, is that the anti-colonialist affectations of British Columbia’s white ruling class are little more than products of their own limited colonial imaginations, particularly deep-seated Christian notions of sin and guilt, which even in these post-Christian times, are proving remarkably entrenched in “settler” culture.

In the eyes of today’s anxious white overclass, after all, “traditional” aboriginal cultures and societies primarily function as a sort of secular Garden of Eden — an idyllic, if somewhat vague and ahistorical locale entirely free of want or worry. The unfashionable modern sins of racism, war, capitalism, old-world religion, and environmental degradation, meanwhile, are viewed as the poisons of Europe, and a burden of “original sin” carried by all the continent’s ethnic descendants. By extinguishing, undermining, or subordinating signifiers of European culture, one engages in an act of spiritual self-flagellation and takes a step towards restoration of the fallen utopia — and thus moral salvation.

The unfortunate thing about treating aboriginal policy as a path to spiritual betterment, of course, is that endless rituals of symbolic atonement do little to alleviate the unavoidably temporal sufferings of Canada’s aboriginal population. A new name for a park won’t provide clean water for a polluted reserve; a smudge ceremony at the school board won’t raise abysmal aboriginal graduation rates.

Worse still is the fact that the postmodern moral code of BC’s white overclass also insists on pushing the destigmatization and acceptance of dangerous social vices of which aboriginals are the disproportionate victims — chiefly drugs and prostitution — while simultaneously demonizing sectors of the economy which present the community’s surest path to greater prosperity and self-sufficiency — which is to say, natural resource development.

In other words, far from heralding the bold progressive break they imagine, there’s little reason for aboriginals to believe the latest animating ideology of Canada’s elite will offer much substantial improvement over previous self-serving ideologies of condescension and indifference that caused so much suffering in generations prior.


