Archive for May 2012

Media Bias and the Sin of Omission

I’ve become increasingly fascinated by media bias. It’s a charge with a long history on the Right, dating back to the Vietnam War, when Walter Cronkite set aside straight reporting on the Tet Offensive in February 1968 and instead voiced his opinion that the war was lost: “(I)t is increasingly clear to this reporter that the only rational way out then will be to negotiate, not as victors, but as an honorable people who lived up to their pledge to defend democracy, and did the best they could.”

Cronkite spoke at the beginning of an era of limited media outlets. The consolidation of Big City newspapers was in full swing, providing most Americans with an ever more limited number of perspectives in print. Television news was dominated by the Big Three, who spent vast sums of money on news desks around the world but limited perspectives to those that appealed to the small cadre of bosses at the top of the three organizations. It was also at this time that graduates of a leftist-dominated academia, armed with journalism degrees, began replacing those who had learned reporting as a trade before World War 2 or, in many cases, during it. Since only a quarter of Americans attended college at the time of the Vietnam War, the requirement of a journalism degree made journalists less representative of the population and more elitist. While the American population as a whole is considered center-right ideologically, journalists tend to lean left, in contrast to their audience.

The rise of the Internet ended this choke-hold on information by the liberal elite and put alternative viewpoints a mouse-click away. This explosion of media forced readers to approach the news in a way that their grandparents and great-grandparents would have understood. “Don’t trust everything you read in the newspapers” was a cliché over a century ago during William Randolph Hearst’s fight with Joseph Pulitzer, when Hearst employed tactics derided at the time as “yellow journalism.” Even though readers in the late 19th and early 20th century lacked the Internet, television and radio, they still had access to newspapers from across the political spectrum, often in foreign languages and from divergent perspectives that made the media of a century later look bland and monocultural by comparison. Today’s news readers can read an op-ed in the Wall Street Journal, view a contrasting opinion in the New York Times, find the European perspective on the topic in the Economist, see what the Germans think at Der Spiegel, read what liberals think about the subject at the Huffington Post, and get the libertarian viewpoint at Reason. A little over a century ago a news reader would have been able to do the same using local newspapers.

Bias isn’t bad. It’s unavoidable, and should be accepted as a human quality of the reporting and analysis of events. As a conservative-leaning libertarian I can still read the New York Times. Even Mother Jones offers useful information to a reader who is aware of its bias and questions what he or she reads. But what’s often as interesting as what an outlet prints is what it omits.

Walter Russell Mead is an avid reader of the New York Times, but he finds it telling that anyone who relied solely on the paper for their knowledge of the Scott Walker recall effort in Wisconsin would be completely unaware of the likelihood of Walker winning the recall. “But what Times readers will not learn from this piece is that (Walker) is winning. Walker is overwhelmingly favored to win on June 5, with polls consistently giving him a significant lead over his opponent. In seven pages of focused, detailed coverage of the politics of the Wisconsin race, the piece has no room for this simple yet somehow telling detail.” Mead points to other Times coverage that fails to explain why Walker is supported by the majority of Wisconsinites. “(W)e don’t learn anything at all, really, about why people support him — or why so many of them are furious with the unions and their supporters. In an article about the bitter political divisiveness consuming Wisconsin, we learn nothing about the actual nature of the divide.” Mead characterizes this failure as “foolish and self-defeating propaganda” worthy of an “anti-Pulitzer Prize for the worst journalism of the year,” and blames such pieces for why “liberals are so frequently surprised by events that other people saw coming.”

Omission shouldn’t surprise a critical reader. While I value the Drudge Report, I’ve learned that on Thursdays, when weekly unemployment figures are released, they are likely to appear on Drudge only when they are bad. If I log on Thursday morning and see no mention of the figures, I know that they likely came in better than expected, and that the current administration will take credit for them. It’s also nothing new. Both the German and Japanese governments reported great military successes in World War 2 right up until they surrendered, a strategy employed by the Iraqi government when the comical “Baghdad Bob,” Mohammed Saeed al-Sahaf, announced great victories against allied forces as US tanks lumbered into the shots behind him. The slant of a report might be more obvious than its complete omission, but a discerning reader should question what is not being reported regardless.

Libertarians and conservatives have lived under a liberal-dominated media regime for decades now, and are much more sensitive to bias than their liberal and leftist counterparts because of the constant assault on their beliefs. For example, I know that any article in the New Scientist, a leftist-leaning science publication out of the UK, will blast climate skeptics for questioning the “consensus” on climate change while, in the very same issue, publicizing the dangers of fracking based on anecdotal evidence that has been consistently refuted by scientific studies. Yet I still read the magazine.

Mead implies that liberals have had it easier, with their world views and ideologies confirmed by the mainstream media they consume, and that this media regularly misleads them until Reality crashes through and surprises them. Michael Barone notes, “Liberals can protect themselves better against assaults from outside their cocoon. They can stay out of megachurches and make sure their remote controls never click on Fox News. They can stay off the AM radio dial so they will never hear Rush Limbaugh…The problem is that this leaves them unprepared to make the best case for their side in public debate. They are too often not aware of holes in arguments that sound plausible when bandied between confreres entirely disposed to agree.” Conservatives and libertarians don’t have that option. Liberalism is everywhere and has been the dominant ideology in the media for the past two generations. But this exposure has deepened conservative and libertarian thought as beliefs are challenged on a daily basis, and the employment of critical thought has allowed them to read liberal media without succumbing to it. Liberals have not done the same. They don’t listen to Rush Limbaugh or watch Fox News; they avoid them, and worse, attack them in the hope of driving them off the air. Their attempts to compete with Limbaugh through liberal talk radio have failed. Fox News dominates cable news as the audiences of CNN and MSNBC wither. So instead of reviewing their positions and perhaps updating them to reflect the times, liberals run to the government to have the competition banned.

Remembering Douglas E. Sloan

On this Memorial Day I think it’s worth remembering one of our fallen, Major Douglas E. Sloan, whose headstone appeared on today’s Drudge Report. Major Sloan is interred in Arlington Cemetery.

Drudge Report on 5/28/2012 - Sloan Headstone

According to ArlingtonCemetery.net, Major Sloan was killed on October 31, 2006 near the Wygal Valley, Afghanistan, when the convoy he was riding in was hit by an IED. Sloan was assigned to the 1st Battalion, 32nd Infantry Regiment, 3rd Brigade Combat Team, 10th Mountain Division, Fort Drum, New York. Sloan, a father of four, was survived by his parents, Wendy and Emory Sloan, a retired VA administrator.

His 8-year-old son Dylan said of his father, “He always had a smile on his face. He liked playing with me.”

Major Douglas E. Sloan. Son. Father. Hero.

The Council Has Spoken: May 25, 2012

Congratulations to this week’s winners.

Council: Joshuapundit - What Does Peace Mean Anyway?

Noncouncil: The Other McCain - Brett Kimberlin Saga Takes a Bizarre Turn, Forcing Me to Leave Maryland

Full voting here.

BlogBurst Friday for Free Speech

Being a contrarian, I tend to avoid moving with the crowd. So when Lee Stranahan came up with the idea of “Everybody Blog About Brett Kimberlin Day,” and my mailbox filled with chatter from my blogger friends, I initially ignored it.

But even more than moving with the crowd I hate bullies, and in my humble opinion Brett Kimberlin is a bully – a convicted terrorist nicknamed “The Speedway Bomber” who was found civilly responsible for the suicide of one of his victims, as detailed in this International Business Times piece. Fast forward to the present, and now he’s head of a foundation that has received $1.8 million from George Judenrat Soros’s Tides Foundation. But once a terrorist always a terrorist, and Kimberlin, as reported by the International Business Times, has allegedly taken to threatening right wing bloggers, forcing Robert Stacy McCain to flee his house.

For more information click here.

If libertarian and right wing bloggers had a convicted terrorist among our ranks we would be pilloried, and rightly so. Yet somehow bombers like Kimberlin and Bill “Premature Detonation” Ayers get feted by the Left. Why? Because for the Left the ends always justify the means.

Geert Wilders: A 21st Century Canary in a Coal Mine

Mark Steyn had a good piece about Geert Wilders, a Dutch politician who lives under threat of death from various peace loving Muslims. Steyn pointed out something I’ve often noticed about coverage of anyone who dares to question Leftist orthodoxy: reporters’ reflexive use of adjectives such as “far” and “extreme” to describe them. Steyn noted, “the determination to place him beyond the pale is unceasing: “The far-right anti-immigration party of Geert Wilders” (The Financial Times) . . . “Far-right leader Geert Wilders” (The Guardian) . . . “Extreme right anti-Islam politician Geert Wilders” (Agence France-Presse) is “at the fringes of mainstream politics” (Time) . . . Mr. Wilders is so far out on the far-right extreme fringe that his party is the third biggest in parliament. Indeed, the present Dutch government governs only through the support of Wilders’ Party for Freedom. So he’s “extreme” and “far-right” and out on the “fringe,” but the seven parties that got far fewer votes than him are “mainstream”? That right there is a lot of what’s wrong with European political discourse and its media coverage: Maybe he only seems so “extreme” and “far-right” because they’re the ones out on the fringe.”

I’m a fan of Geert Wilders, as I was of another noted Dutch politician, Pim Fortuyn. Like Wilders, Fortuyn was tarred with the extremist label, probably the first and only openly gay man ever slandered by the Left as a far right anything. Fortuyn didn’t see himself that way, likening himself to center-left politicians of the day, and he was an ardent admirer of American President John F. Kennedy. Like Kennedy, Fortuyn paid the ultimate price for his views, gunned down in broad daylight by Volkert van der Graaf, a self-described environmental and animal rights activist who claimed to have acted in defense of Muslims and “weak members of society.” Wilders has yet to pay this price, but he has to move discreetly between safe houses to avoid it.

As Steyn notes, Europe’s multiculturalism, which has allowed Islam to thrive without any push back, has produced a society where gays are hunted by attackers who have no fear of prosecution, women and children are raped, and Jewish children are legitimate targets living on borrowed time. Muslims are free to exercise their intolerant views on everyone as they see fit, and those who dare fight back are labeled Islamophobes and far-right extremists by the very people under greatest threat. When the editor of DC’s gay newspaper the Washington Blade and his boyfriend get beaten up in Amsterdam by seven Moroccans, and Muslim apologists explain away the attack as the work of kids unsure of their own sexuality, you know something has gone terribly wrong in Holland.

Islamophobia is an irrational dislike of Islam. There is nothing irrational about refusing to tolerate a religion that views women as less than property, all other religions and political institutions as invalid and heretical, and homosexuality as an abomination punishable by death. There is also nothing irrational about despising a religion whose adherents have called for your death. Yet this labeling is exactly what happened to Fortuyn and is now happening to Wilders.

Throughout world history Europe has been a place where ideas, ideologies and civilizations mix and occasionally clash. Like all complex problems, there is more going on in Europe than just the spread of Islam.

Europe had a history of pogroms and persecution of Jews long before Adolf Hitler came to power and instituted the Final Solution. Deportations and massacres of Jews were common on the continent well before then, so in a sense Europe’s default state is anti-Semitism. The aftermath of World War 2 changed that briefly, as local Europeans were paraded through the concentration camps to see what their hatred had wrought, and the guilt caused by the Holocaust swung the elites behind the Jews and the nascent Jewish state of Israel. For decades after its founding Israel’s primary supporter was not the United States but France, and the ties went beyond the love of socialism that Jews share with Europeans; there was guilt as well. That changed when de Gaulle himself switched sides and backed Israel’s Arab enemies starting in 1967, setting a policy that has continued since. The return to Europe’s innate anti-Semitism was complete when French ambassador Daniel Bernard stooped to scapegoating the Jews for all evil in the world, saying in 2001, “All the current troubles in the world are because of that shitty little country Israel.” The problem with guilt is that it’s not static. It gets old and begins to change; one can only feel guilty for so long before the pain of guilt turns to jealousy toward those at whom the guilt is directed. It’s a short step from that emotion to hatred, and it’s a step that Europeans all over the continent have taken.

James Oberg, a NASA scientist and engineer, once quipped, “You must keep an open mind, but not so open your brains fall out.” The origins of multi-culturalism lie in cultural relativism, the belief that all cultures are equal. To achieve that equality, multi-culturalists downplay the successes and achievements of the dominant culture, criticizing its success as originating from the exploitation and domination of weaker cultures, while exaggerating the achievements of those weaker cultures. Multi-culturalism became possible only after one culture took a dominant position in the world, and after World War 2 that culture was Western civilization: Greco-Roman democratic ideals with Judeo-Christian morality, supported by Anglo-American capitalism. Multi-culturalism attacked all three of these aspects of Western culture in the post-war world. Having become entrenched in academia, and to a lesser but substantial degree in non-elected governmental bureaucracies, multi-culturalists pushed for an end to the assimilation of immigrants, viewing it as state-enforced cultural genocide. As the Western economies of Europe grew, they drew in millions of immigrants from around the Mediterranean, North Africa, and the Middle East. Because these immigrants were not forced, let alone encouraged, to assimilate, they found themselves at the fringes of their host societies, unable to speak the host nation’s language or participate in its civil life. Multi-culturalists quickly blamed racism for this failure, unable to understand that, contrary to their philosophy, there are significant differences between Western and Islamic culture, and that saying the two are alike shows an ignorance of both, in the same way that Emerson took issue with the fallacy that all men are the same: “The wise man shows his wisdom in separation, in gradation, and his scale of creatures and of merits is as wide as nature. The foolish have no range in their scale, but suppose every man is as every other man.” Multi-culturalists now find themselves trapped by their ideology, defending the gender inequality and intolerance of Islam while unleashing its fury on anyone who challenges it. They continually side with and condone the actions of wife beaters, gay bashers, and murderers at the expense of the very people they are supposed to champion, people who in many cases they themselves are. In short, their brains have fallen out.

These two changes in Europe, the return to its default anti-Semitism and the development of a multi-culturalism that prevented the assimilation of Muslim immigrants, would not by themselves have ended the liberal freedoms that come with Western culture. But the dollars spent by Western nations on cheap oil from the Middle East were recycled by the Saudis and other adherents of Wahhabi Islam around the Persian Gulf and used to fund mosques throughout Europe and North America. These mosques spread Wahhabi Islam, one of the strictest and least tolerant forms of the faith, across the West and throughout the Islamic world, replacing the moderate and liberal forms that had arisen in the centuries after Mohammad’s conquest of the Arabian peninsula and the nearby Levant. This replacement was often violent in places (e.g. Pakistan, Thailand, Egypt) where internecine strife broke out between Wahhabi Sunnis and followers of other Sunni sects or Shi’a, but it happened quietly in the West, where other forms of Islam simply couldn’t compete with Saudi money for converts.

It is this toxic combination that Geert Wilders and his supporters recognize as a threat to their freedom, and by choosing to make a stand against it, Wilders and those like him have found themselves condemned by the Left and hunted by Islamists. Their voices are few, but they sound an alarm that warns of the return to Europe of another of its default states: war.

When Pets Have Pets

A few years ago my young son dragged home a stray black cat. It had been tormented by the neighborhood kids, so my son caught it and brought it home, figuring that his parents could either care for it or find it a new home. It was a healthy, young female cat, not the friendliest at first, but she warmed up to us quickly, so after a few weeks I made arrangements to have her spayed. By the time of the appointment I noticed that her belly had gotten bigger, and on the ride to the low-cost spay/neuter clinic I realized that the growing belly and the sudden friendliness meant she had not gone into heat; she was pregnant. Ours is a pro-life house, and that applies even to kittens, so I turned the car around, returned home, canceled the appointment and waited. A few weeks later she gave birth to a litter of four kittens on my son’s bed.

I had never raised kittens from day one, but I was determined to raise these into perfect house cats. After a few days I started handling them and had the family spend time with them. As they grew I supplemented their mom’s milk with expensive soft food formulated just for kittens. We brought them into our living room, onto our sofas, and even onto our beds, trying to imprint them with an affection for humans.

Fast forward a few years, and all four kittens are now large cats in my house. One of them, the only non-black cat of the lot, is a friendly, well-behaved cat that spends time with us as we move about the house. The other three have to be captured to be petted; one grooms himself furiously after being petted, desperate to remove our scent from his fur. The other two will meow at us when they need something, like when their food dish or water bowl is empty, but otherwise I rarely see them.

In fact the only time I see them is when the dogs come in from outside. The cats greet the dogs, padding in and rubbing themselves against them. Sometimes the dogs chase them away, other times they tolerate it, but for all the effort I expended I realized I have succeeded: I have created the perfect house cat – for a dog.

The Council Has Spoken: May 18, 2012

Congratulations to this week’s winners.

Council: The Noisy Room - The Fascist Boogie Fever

Noncouncil: Mark Steyn - The Spirit of Geert Wilders

Full voting here.

Student Debt at Colleges and Universities 2004-2010

One of the best graphics I’ve seen regarding debt and cost of attendance. Animated to boot.

The Myth of an Unbiased Media

What is a double-blind, placebo-controlled study?

A double-blind, placebo-controlled study follows a specific set of procedures to ensure that the results obtained are dependable and free from subjective bias. It is considered the ‘gold standard’ of clinical research studies.

Until the study is complete, neither the study researchers nor the participants know who received the study test substance, and who received an identical dummy substance, called a placebo. This ‘blindness’ ensures that the personal beliefs and expectations of either the researchers or the study subjects do not undermine the objectivity of the results. [emphasis added] – Baylor College of Medicine

Recently I was on the phone with a colleague I work very closely with, a woman around my age who lives in Los Angeles. She’s witty, very intelligent, spirited, and a die-hard Jewish progressive who seemed surprised to hear that I watched Fox News. I reminded her that I was a libertarian, and that Fox News is the news channel most sympathetic to libertarians, but she said Fox News was so biased it was “right wing propaganda.”

Later that day on Facebook, a Japanese non-profit organization I follow posted a link to a Fox News article that mentioned the possible extinction of the Japanese people by the year 3011 due to declining birth rates. Japan’s declining birth rate and aging society are nothing new and have been predicted for decades. The causes are numerous: working women, a high cost of living, the high cost of raising children in Japan, easy access to abortion, and even porn. The source of the article was a report by Tohoku University in Japan that extrapolated the current birth rate forward and came up with the year 3011 for the demise of the Japanese people. Instead of discussing the article, however, several posters took issue with where the article appeared: Fox News. One commenter wrote, “... it’s Fox news. I can’t expect anything newsworthy from them…” Another wrote, “I would prefer some more solid evidence. And the extinction of Japanese is nothing more than laughable BS. Fox news, you need to do better than scaring peoples like that. Time to do homework.” Finally, another wrote, “LOL FOX news once again succeeding at rotting feeble american [sic] brains.”
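Whatever one thinks of the messenger, the arithmetic behind such a projection is nothing exotic: simple compound decline. Here is a minimal sketch in Python; the starting count and annual decline rate are made-up round numbers for illustration, not figures from the Tohoku report.

# Illustrative compound-decline projection. These inputs are
# hypothetical placeholders, NOT the Tohoku University numbers.
population = 16_500_000    # assumed count of Japanese children today
annual_decline = 0.017     # assumed 1.7% fewer children each year

year = 2012
while population >= 1:
    population *= 1 - annual_decline
    year += 1

print(f"At this rate the count drops below one around the year {year}")

Run forward, a decline of less than 2% a year empties a pool of 16.5 million in under a thousand years, which is how a headline like “extinct by 3011” can fall out of perfectly unremarkable math.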

Hating Fox News has become as central a plank for American progressives as unrestricted access to abortion, so central that progressives don’t even question the belief anymore. They support billionaire George Judenrat Soros’s effort to shut down the 100% privately funded news station while encouraging government support of NPR, a publicly funded organization which, as even New York Times reporter David Carr and the Columbia Journalism Review admit, leans to the Left. One could argue that it should be possible for any intelligent person to separate ideology and personal belief from the job and report in an unbiased fashion, and one would be wrong for the very reason that double-blind studies are the gold standard in research.

Double-blind studies were developed after analyses of clinical trials found that those administering a test were statistically likely to influence the results simply by knowing whether they were dispensing the placebo or the actual drug, even without consciously intending to. Such bias operates at the unconscious level, and a double-blind design, in which the person administering the drug does not know whether she is giving the test compound or the placebo, is the only way to rule it out.
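The effect is easy to demonstrate with a toy Monte Carlo simulation, a sketch of my own with invented numbers rather than data from any real trial. Both groups below share the same true outcome distribution; the only difference is a small unconscious nudge an unblinded rater adds to the drug group’s scores.

import random

random.seed(42)

def apparent_effect(blinded: bool, n: int = 10_000) -> float:
    """Mean drug-minus-placebo score difference as seen by the rater."""
    nudge = 0.0 if blinded else 1.5   # unconscious bias, in score points
    drug = [random.gauss(50, 10) + nudge for _ in range(n)]
    placebo = [random.gauss(50, 10) for _ in range(n)]
    return sum(drug) / n - sum(placebo) / n

# Identical true distributions, so any nonzero "effect" is pure rater bias.
print(f"Unblinded apparent effect: {apparent_effect(blinded=False):+.2f}")
print(f"Blinded apparent effect:   {apparent_effect(blinded=True):+.2f}")

The unblinded run reports a healthy-looking effect of about +1.5 points where none exists; blinding, by removing what the rater knows, removes the phantom effect with it.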

So why should we expect journalists, liberal or conservative, to report completely objectively about news stories and events?

We shouldn’t, and more importantly we should give up entirely on the concept of unbiased news reporting, a concept that spread only with the consolidation of newspapers by large corporations and the requirement of a journalism degree from an accredited journalism school to be hired as a reporter. For example, in 1886 my hometown of St. Louis had 18 active newspapers. In 1986 it became a one-newspaper town when the St. Louis Globe Democrat ceased publication. Towards the end of its life it shared the market with the St. Louis Post Dispatch, and while the latter appealed to Democrats, the Globe was read widely by the city’s Republicans. When the Globe went out of business, the owners of the Post Dispatch promised that the Post would moderate its stances and broaden its appeal to former Globe readers, and for the first few years after the Globe’s demise the Post attempted to do so by adding columnists and reporters from the Globe. But it wasn’t long before the Post resumed its leftward slant, leaving St. Louis without a local conservative newspaper.

Had this happened a century before, it would have been a disaster for the city. However, it came at a time when people had begun getting their news from a variety of sources including TV, national editions of newspapers such as the New York Times and Washington Post, national broadsheets like USA Today, magazines, and talk radio. A few years later the Internet came online, and everything changed. The explosion of web sites catering to every ideology, no matter how fragmented or extreme, could not have been imagined by the Founding Fathers 220 years before. Suddenly Republicans in St. Louis, or anywhere for that matter, had access to the Washington Times, New York Daily News, National Review Online, and aggregation sites such as Drudge Report, RedState, FreeRepublic, Powerline, and of course, Instapundit. It didn’t matter if the St. Louis Post Dispatch leaned left when its bias could be countered by facts easily found in other sources.

That doesn’t mean these other sources can be trusted 100% just because they agree with your opinion, though. I might watch Fox News, but I watch it critically, just as I read the New York Times aware of its liberal bias. At a time when we are bombarded by facts from an assortment of sources, each with its own bias and agenda, it is more important than ever to read, listen and watch critically. While it may take some effort up front, it also exposes one to new ideas and perspectives that could be missed by staying in a cocoon of opinions that perfectly match one’s own. Such diversity of opinion and perspective allows a critical thinker to construct a philosophy or world view that is much more complex, robust, and ultimately accurate than any contrived through a single ideological filter, whether liberal or conservative.

Bias isn’t a notion to be fought; it’s one to be recognized. Survivalists know that in the wild a trekker will favor one foot over the other, making it impossible to walk a straight line. To account for the bias, one first recognizes that it exists, then deals with it by carrying a compass or by sighting on a marker, or a succession of markers, to keep moving in a consistent direction. In the intellectual sphere there are no such markers, but there is the compass of critical thought, which works just as well to help us evaluate the source of information and discover its biases, hidden or explicit.

Bias exists, and anyone who believes that their favorite newspaper or website isn’t biased is just as deluded as those they accuse of reading or watching a “biased” news source.


Did Elizabeth Warren Steal from a Minority?

Every white person below the age of 50 or so understands why Elizabeth Warren lied about being a minority. Every college applicant forced to tick the “Caucasian/White” box on an entrance application, and every job applicant forced to do the same, knows the cost of that mark: how the color of your skin determines whether your application is accepted or you get the interview. It’s hard to fight affirmative action quotas designed to redress racial discrimination that occurred decades or even centuries before you were born, so if you can’t beat the system, why not play it?

And that’s exactly what Warren did. Warren became a minority because it benefited her career. It made her stand out from all the other white faculty members at the University of Pennsylvania. More importantly, it carried a certain cachet for a monied liberal like herself working with minorities in Philadelphia: a little street cred with her students that perhaps encouraged them to relate more to her. Sure, she may have been playing fast and loose with the System, but that’s what everybody does, right? Besides, it was for a worthy cause: the minority students whom she would mentor and lead to better lives.

What Warren never considered was how playing the race card at UPENN may have kept a real minority out of the position. Faculty positions are limited, so she beat someone out of the job. Faced with a real minority and a fake one like Warren, it’s possible the university chose her precisely because she was blonde and blue-eyed and a minority as well. To some university administrators it’s the best of both worlds: a woman they can feel comfortable with, who shares their culture and upbringing, yet a minority they can add to their diversity statistics.

A lot of white people have stretched the truth in the cutthroat world of academia. The only difference is that most of them don’t run for the US Senate a quarter century later and get caught.

The Council Has Spoken: May 11, 2012

Congratulations to this week’s winners.

Council: Joshuapundit - Inside Vladimir Putin

Noncouncil: Le·gal In·sur·rec·tion - Cruel irony in Elizabeth Warren’s Cherokee saga

Full voting here.

Bret Stephens’ Advice to the Class of 2012

Wall Street Journal columnist Bret Stephens provides free advice to the Class of 2012:

Many of you have been reared on the cliché that the purpose of education isn’t to stuff your head with facts but to teach you how to think. Wrong. I routinely interview college students, mostly from top schools, and I notice that their brains are like old maps, with lots of blank spaces for the uncharted terrain. It’s not that they lack for motivation or IQ. It’s that they can’t connect the dots when they don’t know where the dots are in the first place.

Sometimes the best advice is the advice you don’t want to hear. If that’s the case, then the Class of 2012 – and future classes and their parents – should read the entire thing here.

Of Goats and Politics

Over the weekend I attended an open house at an organic farm specializing in making goat cheese. Since I live on a large inactive farm I’m interested in learning about all aspects of small scale farming, and having grown up in the St. Louis suburbs there’s much to study. As I have learned more about growing things, I’ve come to appreciate organic methods that minimize or eliminate chemicals and work with the forces present in nature in order to grow food. Don’t get me wrong: Mother Nature will starve you to death and dine on your bones if you let her, but there are strategies worth pursuing for a hobby farmer such as myself, like avoiding monoculture plantings and shunning pesticides that whack beneficial insects along with the pests. Additionally I’m becoming more aware of the sourcing of my food, recognizing that we have completely lost the ability to eat what’s in season when the local supermarket sells strawberries in November and whole ear corn in January. I live among farmers, and I have seen the gradual creep of large agribusiness and the depopulation of rural America. Neither is a good omen for our nation’s future, and though both may be inevitable, I’ll be damned if I contribute to the process. So I’m gradually buying more locally, and the trip to the farm open house was a way to get some ideas for my new lifestyle.

When we arrived the place was hopping, with young men directing people to park on a newly-mowed hay field. We parked, and as I walked past the cars I automatically scanned the bumper stickers, a bit of a habit of mine. The first one I saw, as expected, was an Obama ‘08 sticker, but the next one surprised me: a Gadsden flag of the Tea Party alongside a sticker that read “God Bless Our Military, Especially Our Snipers.” North Carolina is much bluer than I expected when I moved down here, and I’ve learned that while I might live in a predominantly conservative part of the state, it is full to the brim with people of all political philosophies and walks of life.

All were represented at the organic farm. There were gay couples and old hippies, as well as clean-cut military men and their families, their kids petting goats and chasing free range chickens. A man dressed in a checked shirt beneath blue overalls stood alongside a young woman with more piercings than a rural stop sign, listening to one of the founders of the farm talk about its history and how it has grown over the years. Hispanics mingled with blacks, who in turn stood in line with monied white suburbanites and their kids to take a turn at the pottery wheel and throw their own pot. Smiles were everywhere, and the place seemed as alive as the show hive of bees that stood on sawhorses in the middle of a vegetable patch.

I was an odd child growing up. Some of my first memories are not of clowns or birthdays but of political events. I watched Nixon’s visit to Beijing broadcast on network TV in 1972. Two years later I rushed home from school and flipped on the Watergate hearings instead of game shows or cartoons. I grew up living and loving politics, and had I been born with a more gregarious personality I would have pursued a career in it. Instead I was socially inept, perhaps even autistic, so politics could never be more than a spectator sport for me, but that didn’t stop me from enjoying it.

But I’ve lost that joy. It has been years since I felt something other than doom and dread about politics, and the organic farm reminded me why.

We are divided, almost atomized these days. It has been years since we felt unity, the last time being the unity of grief after the 9-11 attacks. Since then our leaders have failed us. President Bush famously promised to be a “uniter not a divider,” but then did what he wanted in Iraq and, in the biggest failure of his administration, presided over an explosion of government and spending. The Department of Homeland Security wasn’t a Clinton creation, after all; it was a Bush one. While I agreed with his policies in Iraq at the time, Bush failed to defend his actions at home against his critics. He just did what he wanted because he knew it was right, and didn’t even try to convince anyone otherwise.

Obama hasn’t even attempted to unite us. He took office reminding Republicans that he had won and has governed accordingly, ramming through his signature health care legislation without a single Republican vote. Later that year Americans clipped his power by taking away the House from the Democrats and ending their filibuster-proof majority in the Senate, but Obama didn’t miss a beat. Instead of moving to the center and working with the opposition to get legislation passed, he went to the extreme and decided to wait things out until the next election, blaming the GOP and his Republican predecessor for the fruits of his own failure to lead.

Leadership in a democracy requires skill in the art of compromise. It’s hard to imagine now, but Ronald Reagan, whom even Obama himself has claimed for his own, never had a friendly majority in the House during his 8 years, yet he managed to pass budgets and legislation with bipartisan support alongside no less a political mastermind than Speaker of the House Tip O’Neill. We have yet to see a single budget passed under the current president, even during the 2 years his own party held both the House and Senate.

In fairness to Obama, he never was much of a leader. His career reflects the Peter Principle more than the exercise of leadership skills to make it to the top; he always had a mentor in a higher position who could push him further up the political ladder. Unfortunately Obama now finds himself at the top with no mentor other than his usual billionaire friends like George Judenrat Soros and Warren Buffett. While these men may support him with their financial acumen and deep pockets, there is no one above Obama who can protect him anymore, so he must rely on his own skills. The problem is that the process that led to his ascension to the highest office in the land never cultivated those skills.

George W. Bush had a similar rise through the ranks, although based on his name rather than mentors. Samuel P. Bush, George W.’s great-grandfather, built a successful career as an industrialist and dabbled in politics during World War I. His son Prescott continued the path of mixing success in business with politics, a path that led to George H. W. Bush’s ascendance to the presidency in 1988. While George W. Bush showed a leader’s ability to make difficult decisions, such as attacking Afghanistan in 2001 and Iraq in 2003, an upbringing in which his name alone opened doors and persuaded people made it unlikely that he would develop the other leadership skills: the ability to convince others and charm one’s opponents.

The last president who had such leadership skills? Bill Clinton. Clinton is a self-made man who rose through the political ranks solely on his wit and charm. During his 8 years in office Clinton was able to pass budgets and bipartisan legislation with die-hard partisans such as Newt Gingrich. Clinton understood how to work with Congress, and his domestic policy record proves it (his foreign policy record, on the other hand, was in retrospect a disaster; he consciously ignored the threat posed by al Qaeda even though numerous terrorist attacks occurred on his watch).

We have gone 11 years with weak leadership and our nation has suffered. You can’t compromise with someone you call a racist. You can’t cut deals with a party you demonize as misogynistic and homophobic. Leadership doesn’t pit one group of people against another; it fuses them together in a shared purpose.

A true leader does more than call his opponents names and make grand promises in eloquently delivered speeches from teleprompters. He inspires but also delivers on his promises. He doesn’t hold grudges but also makes it clear that he will not be played the fool. He understands the responsibility that comes with his position and serves all the people, not just those who voted for him. Most importantly he appreciates and respects the ideals that bind us together as a people and a nation, recognizing that while we might disagree vehemently on issues big and small, we are all bound by the love of freedom and hope for a better future for our children and our country.

While it is clear that that leader is not Obama, neither is it clear that it is Romney. But I do wish that both men could have taken a moment from their politicking to talk to the farmer selling hand-raised beef, watch the Montagnard women weaving brightly colored fabrics, and taste the red pepper goat cheese. Perhaps they would have understood that if we can put aside our differences at a goat farm founded by a woman who drove around with two goats in the front seat of her Toyota looking for a farm in North Carolina, we are a people ready to be led, and a people who deserve a good leader.

The Council Has Spoken: May 4, 2012

Congratulations to this week’s winners.

Council: The Noisy Room - Pathological Politics – Predatory Partners and Persecuting Patriots

Noncouncil: City Journal/Joel Kotkin - The New Class Warfare

Full voting here.

Marriage and the State: It’s Time For a Divorce

After a lifetime of fighting debilitating shyness and social anxiety I have found a life that permits me to avoid human contact except on the rare occasions when I initiate it. Modern technology is perfect for people like me. I can be social without actually being social, leaving me free to focus on what people are doing or saying without worrying or thinking about myself. Facebook has become a useful tool for keeping tabs on memes floating around groups I otherwise no longer associate with. Since most of my friends are leftists of various stripes, I watch as they share posts that are supposed to change the world. Most of the time I let these slide without comment, since I understand that they lack a blog like this one to share their political thoughts and so are limited to Facebook posts.

Sometimes I slip.

A very good friend of mine shared a post that read, “Like, if you are a supporter of same-sex marriage. Share if you aren’t afraid to admit it.”

As a libertarian I have been a consistent supporter of the so-called “gay agenda” for decades, because I’ve been around gays most of my adult life and I simply don’t see how one can support small government yet demand that it poke its bureaucratic head into the bedroom. Honestly I want to see the government completely out of the marriage business, leaving the sacrament up to religions to administer as they see fit.

But is this really necessary?

Changing people’s minds requires more work than sharing political messages among friends on Facebook. If I opposed gay marriage, it is highly unlikely a Facebook post would change my mind. In fact, sharing a photo or message does very little because it preaches to the choir: how many of the friends of anyone posting this entreaty really AREN’T supporters of same sex marriage?

On May 8th North Carolina is voting to amend the state constitution to ban civil unions, domestic partnerships and other types of domestic legal unions, specifying marriage as the sole legal union between a man and a woman. I think this is stupid on so many levels that it makes me spit. Not only does it discriminate against gays and lesbians, it discriminates against straight, non-religious people who are committed to each other but view marriage as a religious vow. So instead of sharing the photo, I’m going to drive 15 minutes to the polling station, wait in line for probably another 15-30 minutes, and cast my vote on this one issue AGAINST a stupid law. That’s an hour of my time I’d rather spend doing something else, but instead I’m going to vote. Given the opposition to gay marriage in my community, where Baptist churches outnumber gas stations and fast food restaurants, there’s a good chance my vote will be one of very few opposing the measure.

I am a secularist. It’s a word that’s often misunderstood and abused, both by the religious-minded and by those who hate them. A secularist is not anti-religion. He or she is someone who believes there is a line between the sacred and the profane: on one side there is religion, on the other politics. A secularist cares just as much when a religion is forced by the state to obey a law that undermines its core beliefs as when a religion attempts to force its beliefs on the state. A secularist believes that both entities have their spheres in modern life, and that trouble comes when they rub together.

The First Amendment of the US Constitution states, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.” This has come to be interpreted as the separation of Church and State put forth by Thomas Jefferson in a letter to the Danbury Baptist Association in 1802 in which Jefferson wrote, “I contemplate with sovereign reverence that act of the whole American people which declared that their legislature should ‘make no law respecting an establishment of religion, or prohibiting the free exercise thereof,’ thus building a wall of separation between Church & State.”

Secularists can trace this doctrine back even further, to Jesus Christ’s answer to the Pharisees seeking to entrap him. “Then went the Pharisees, and took counsel how they might entangle him in his talk. And they sent out unto him their disciples with the Herodians, saying, Master, we know that thou art true, and teachest the way of God in truth, neither carest thou for any man: for thou regardest not the person of men. Tell us therefore, What thinkest thou? Is it lawful to give tribute unto Caesar, or not? But Jesus perceived their wickedness, and said, Why tempt ye me, ye hypocrites? Shew me the tribute money. And they brought unto him a penny. And he saith unto them, Whose is this image and superscription? They say unto him, Caesar’s. Then saith he unto them, Render therefore unto Caesar the things which are Caesar’s; and unto God the things that are God’s. When they had heard these words, they marvelled, and left him, and went their way.” (Matthew 22:15-22) This doctrine was expanded upon by St. Augustine, writing four centuries later, who noted the differences between an “earthly city” and the “City of God.” Martin Luther took St. Augustine’s ideas even further in his Doctrine of the Two Kingdoms, which postulated that God worked his will through secular institutions as well as through divine acts. Luther also promoted secularism in his book “On Secular Authority,” writing that a government could not force spiritual beliefs on someone because such beliefs would be held insincerely and would therefore be invalid in God’s eyes. Luther’s ideas were then picked up by John Calvin and other Protestant reformers, and later by James Madison and Thomas Jefferson in the United States.

Even with a relatively clear and consistent philosophical lineage, the United States has struggled with the concept of separation of Church and State almost since its inception. For the first hundred years of the Republic the First Amendment was viewed as applying specifically to the federal government; states were free to establish official religions, and Massachusetts supported Congregationalism until 1833. States continued supporting religion by enacting Blue Laws, abiding by religious holidays and providing other public concessions to religious groups. The Supreme Court eventually began to weigh in on the issue, ruling in Reynolds v. United States (1878) that state laws prohibiting bigamy trumped religious laws (Mormon, in that case) that allowed it, and banning organized prayer in public schools in Engel v. Vitale (1962) and Abington School District v. Schempp (1963). Since then the Supreme Court has delineated a distinct line between religion and secular society. Nevertheless that line continues to be redrawn by lawsuits challenging the legality of public religious displays and the wearing of religious head coverings on the job, and the rise of gay rights will require it to be defined further.

Marriage has been a component of government since Ancient Greece, when Solon wrote a series of laws covering all aspects of daily life, including marriage. Since marriage between men and women resulted in children, and children were necessary for the continuation of the State, the State took an early interest in marriage, an interest that has continued through the centuries to the present. For most of history God and the State were one and the same, and the idea of separating the two made little sense. It wasn’t until the modern era that the concept of marriage without the State could be imagined, and even today in states across the country one must acquire a marriage license from the state and have a religious ceremony conducted to make the contract binding. There is no other civil agreement that requires a cleric’s signature.

From a civic standpoint, marriage makes sense. It legitimizes property ownership and distribution. It tames young men and lays the foundation for the means to support children. It pools wealth. Studies continue to show that children from an intact marriage do better in school, and that on average a pair of married people are wealthier than two singles. But these benefits to society will not go away if the state gets out of the marriage business.

America continues to be a country of the religious. According to a 2007 Pew study, only 16% of Americans claimed no religious affiliation. Marriage will not disappear. Instead it will fall under the complete control of religious authorities, who can marry whomever they wish as they see fit. If a Protestant sect sanctions gay weddings, fine – but Baptists, Catholics and Muslims can forbid such vows without fear of persecution by the state. Separation of church and state cuts both ways, after all, and leaving marriage to the religions creates a barrier that prevents state meddling in religious beliefs.

What about the distribution of property? There’s already a document for that: a will. Plenty of other existing legal instruments, such as powers of attorney and articles of incorporation, can handle the other situations usually covered in blanket fashion by a marriage certificate. These documents can protect a pair (or more) of people regardless of sex, and they treat the parties as equals before the law, something that existing marriage law does not.

Disentangling marriage from the state and undoing 2,500 years of custom will not happen overnight, nor will sharing posts supporting the idea on Facebook change anything. But it is worth considering as the ultimate solution to the gay marriage issue, and one that would defuse the war on Christianity that some gay marriage supporters unleash in response to opposition.