(04/27/05 12:00pm)
There are essentially two paths to success in the music world. The first entails repeating a winning formula until audiences tire of it. The second requires breaking new ground in hopes that innovation will catch listeners' attention. The latter path is more challenging but, for some bands, is also more rewarding.
Queens of the Stone Age is one such band - a relentlessly creative outfit that revels in ignoring popular trends and creating its own. Born from the remains of the metal band Kyuss, Queens of the Stone Age has yet to take a conventional approach to any aspect of music.
Each of the band's three previous releases featured a different sonic theme and a rotating gallery of members. From the Western-tinged self-titled debut album, to the druggy, mellow "R," to the blissfully fuzzy, gimmick-laden "Songs for the Deaf," the only constant has been vocalist/guitarist/songwriter Josh Homme.
On the band's latest release, "Lullabies to Paralyze," Homme reinvented Queens of the Stone Age once again. The most recent incarnation follows the departures of bassist/founding member Nick Oliveri and drummer Dave Grohl (of Foo Fighters fame). Losses such as these would bring a normal band to the brink of ruination, but the talented Homme has found a way to persevere.
If anything, these departures have allowed Homme to take on a greater creative role. The result is that "Lullabies to Paralyze" is a more polished album than its predecessors. Granted, there are plenty of bizarre flourishes, but the sheer contempt for the audience shown on previous releases is gone. Homme, along with new bandmates Joey Castillo (drums) and Troy Van Leeuwen (guitar), has managed to mate the tripped-out melodies of "R" with the fuzziness of "Songs for the Deaf" and the metallic aggression of Kyuss. The result is a sound that is at once dark, visceral, beautifully haunting and entirely new.
"Lullabies to Paralyze" is not elevator music - it actively engages the listener. The opening track, "This Lullaby" (featuring vocals from guest contributor Mark Lanegan), creates a lulling sense of faux placidity. By the middle of the next track, "Medication," this illusion is completely blown away as Homme and crew pound out art rock with energy to spare.
These contrasts persist throughout the album. The thudding bass lines of "Burn the Witch" are offset by the laidback, carefree ethos of "Tangled Up in Plaid." The complex, catchy rhythm of "Little Sister" melts into the slower, understated accusations of "I Never Came." All the elements come together on "Someone's In The Wolf," a seven-minute stomp that revels in repetition. An instrumental hidden track evokes images of grandeur and brings the album to a fitting conclusion.
"Lullabies" is by no means a perfect album. Its sheer sonic diversity is likely to alienate/confuse/scare those not already familiar with Queens of the Stone Age. Even the band's faithful fans will likely find the latter half lagging - it moves at a much slower pace than the first handful of tracks.
Nevertheless, there are enough rewards to reap for those who are willing to endure the album in its mesmerizing entirety. Homme's guitar is as sharp as ever, his voice has improved and the lyrical content is wonderfully warped and pleasantly psychedelic.
By the album's conclusion, the listener is left all but paralyzed, with an unerring sense of "Whoa."
Queens of the Stone Age has taken a gamble here. "Songs For the Deaf" brought them perilously close to mainstream success. Rather than embrace it, the band has chosen to self-destruct and build anew. Inevitably, this has left some to conclude years of casual drug abuse have finally turned Homme's brain to mush. But if the past (to say nothing of the new album's repertoire) is any indication, he has another unlikely winner on his hands.
(04/27/05 12:00pm)
In "I Want to Conquer the World," veteran punk-rockers Bad Religion ask "is your fecundity a trammel or a treasure?" In my penultimate column, I shall attempt to answer this question with regard to producing written works.
Writing ultimately affords each of us a rope. Whether we use that rope to climb a mountain or to hang ourselves is largely within our discretion.
Those of us who choose to climb, however, would probably do well to realize that there are a few principles that should be heeded if we hope to reach the top.
Writing and thinking go hand in hand. If you aren't prepared to analyze, criticize, reinvent or revisit your work, then by all means do not write. The notion that writing mystically springs "from the heart" is a malicious fabrication. Even that which is heartfelt and emotional in origin still undergoes an editing/filtering process between first draft and publication.
Often, this is done to save us from our own stupidity. Anyone can make an impassioned rant in the heat of the moment. Said rant is unlikely to survive scrutiny, however, after the moment has passed.
Next, writing is not speaking. There are a number of visual and auditory cues that we can utilize to convey meaning in casual conversation. These cues become lost when we make the translation to text. Writing therefore requires a tone that reflects the author's true intentions.
This level of precision is not always easy to master. I took some criticism for last semester's "Hookerific" column because I did not make it clear which parts of the piece were satirical and which parts were meant to be taken seriously.
As such, a misunderstanding ensued and the end result was not what I intended.
Had I been paying more attention, I'd have realized that writing does not allow us to clear our throats for emphasis or say certain things with a sly smile.
Lastly, writing requires us to actually want to write.
Often in academia, writing is born of necessity rather than desire. No one of sane mind and body actually relishes the thought of putting together a 12-page paper.
Even still, we should at least be willing to commit ourselves to getting the job done. When writing is approached with a hostile or apathetic attitude, that hostility or apathy manifests itself in the work and diminishes its quality.
We might dislike writing papers, but we dislike getting lousy grades on papers we've written even more.
Depending on how we approach it, writing can either be a trammel to hold us back or a treasure that allows us to reach our fullest potential.
The most capable among us can easily be reduced to mounds of blabbering idiocy if they lack the fundamentals of effective written communication.
Similarly, the charlatans, the frauds and the know-nothings can con the best of us if they acquit themselves well in print.
Regardless of what field we enter or how often we actually need to do it, we should all know how to write.
Listening to Bad Religion doesn't hurt either.
(04/27/05 12:00pm)
I came to the College in Fall 2001 knowing, more or less, that I could write and seeking a degree that affirmed as much. In a few short weeks, I shall be receiving that degree. If this was the sum of my college experiences, I'd be satisfied. However, it turns out that I've gained quite a bit more.
High school was, by and large, four years of mediocrity (three of which were spent in varying stages of depression). From the monotony of the daily grind to the needlessly cruel and complex social hierarchy (damn cliques), its only saving grace seemed to be that it ended and paved the way for things to come.
College, on the other hand, came close to restoring my faith in humanity. Whereas high school was restrictive, college afforded me a great deal of independence. Whereas high school was confrontational, college allowed me to simply ignore people I didn't like.
And, whereas high school left a distinctly bitter taste in my mouth, I can at least say there were a few memories of college worth keeping.
During my time here, I've transitioned from shell-shocked obscurity to a certain level of visibility (all while sacrificing none of my quirkiness). I'm currently a member of four campus organizations and am in a leadership/executive board position within all four.
College affords everyone chances such as these. If I can capitalize on them, then there is no reason why anyone else (on-campus students especially) can't.
College also provides us with an opportunity for self-discovery. Usually, this takes on a far more subtle form than the life-altering clichés some of us have come to expect and fear.
Nevertheless, it should be said that this is the time for experimentation. Even if you like the identity you have, there is nothing to be lost by putting it to the test every once in a while.
Finally, college is usually the first place you can expect to begin doing what you want to do for life in a meaningful way. The first steps on career paths are taken here and lifelong journeys/odysseys/obsessions are begun.
For me, this expression revolves around writing. During my time here, I've not only improved as a fiction writer (one of my stated goals), but have also taken a liking to journalism. There were some doubts at first, but sticking with it has proven to be quite rewarding.
Conversely, I know quite a few biology and chemistry majors who are grateful they switched to fields that they ultimately found more satisfying.
I've done a lot of antagonizing during my tenure as opinions editor. I can take pride in the fact that I've pissed off conservatives, progressives, radical feminists and fundamentalist Christians alike.
If I can get that many people to agree on something (a distaste for my views), I must be doing something right.
In that vein, I would like to thank my columnists for their contributions.
Even when they are dead wrong and their arguments are spurious (at best), they have succeeded in breaking down the barrier of ennui and sparking a debate.
Besides, it's often more challenging to voice an unpopular view than it is to go along with the status quo.
Thus, on that note, I adjourn and bid farewell to the following: The Signal's hideous office walls (and not-so-hideous editors), The Siren's incredible shrinking budget and staff, Bob Cole's Southern witticisms, housing lottery controversies, Sodexho scandals, ink, Sigma Tau Delta, walking to Bliss Hall in the rain, reading at "the goods," Direct Connect, never-ending construction and everyone and everything else that has made this worthwhile.
And remember: I may no longer be writing on these pages, but that's no reason for you to cease reading. Question, think, react, respond or else quit wasting precious oxygen.
(04/20/05 12:00pm)
Those zany left wing polemicists say the darndest things. Just as Noam Chomsky once famously declared that Richard Nixon was "the last liberal president," Michael Moore wrote that Bill Clinton was "a good Republican president."
Both claims contain a surprising amount of validity. Nixon was a moderate conservative in an excessively liberal era; ergo, his policies can be considered liberal in light of the shift toward conservatism that followed.
Similarly, Clinton defined himself vis-à-vis his Republican predecessors, namely Ronald Reagan.
Merely including Clinton and Reagan in the same sentence may seem blasphemous to some. Prior to the ascension of George W. Bush, these two were the most divisive presidents of our lifetime (both progressives and conservatives were lukewarm on Bush's father).
Depending on one's political affiliation, one man represented all that was right with politics and the other all that was wrong.
Yet these highly partisan reactions ignore the fundamental similarity between the two and the hard-to-miss observation that Clinton never would have been able to do the things he did had Reagan not cleared the path for him.
It is important to note that both Reagan and Clinton came to power during tough times. Both men were the embodiment of change in their respective eras.
As such, both were credited with improvements that took place during their terms, whether they deserved the credit or not.
Circa 1980, America was not in good shape. The presidencies of Lyndon Johnson and Richard Nixon had all but eliminated public faith in the federal government.
Failed domestic policies (largely Democratic in origin) brought the economy to the point of stagnation (or, more accurately, stagflation). Militant Iranians were holding 52 Americans hostage and President Jimmy Carter was unable to bring them home safely.
In light of these circumstances, Reagan campaigned on change. He became the poster boy for cutting taxes, reducing the size of government and restoring lost confidence. He capitalized on dissatisfaction with the present and used it to earn a decisive victory over Carter.
We now know, of course, that very few of the ideas that Reagan embraced and took credit for were actually his own. The supply side economic theories that have become synonymous with his name were developed in the 1970s by economists like Arthur Laffer and Robert Mundell.
Tax reduction and reform, some other Reagan "initiatives," were actually labors of love for Rep. Jack Kemp (R-N.Y.), who advocated them for years before sponsoring the Economic Recovery Tax Act of 1981.
Further, the Tax Reform Act of 1986 would have never gotten off the ground had it not been for the sponsorship of Democrats Bill Bradley and Dick Gephardt.
Despite having little to do with creating economic policy, Reagan was able to take credit for it because he supported it vocally and vociferously. This would set a precedent for Clinton to follow.
Flash forward to 1992. America was better off than it was in 1980, but not by a wide margin. Citizens were tired of huge deficits, stock scandals and the Washington insider culture that produced Iran-Contra. Into this period of discontent stepped William Jefferson Clinton, an "outsider" from Arkansas who campaigned on deficit reduction and middle class values.
Just as Reagan capitalized upon dissatisfaction with the incumbent, Clinton used America's distaste for - and distrust of - George H.W. Bush to secure the presidency.
Like Reagan, Clinton was able to take credit for much of the economic prosperity of his era, despite not being the cause of most of it.
While he showed dedication to reducing the deficit, he did so at the expense of a planned middle class tax cut. He also was elected at a time in which the economy was already in the early stages of a period of growth that would extend through several years of his administration.
Lastly, the spending cuts and government reduction (welfare reform et al.) that made his presidency economically prosperous didn't occur until after Republicans took control of the House of Representatives in 1994.
However, because Clinton followed Reagan's lead in championing whatever policy was likely to make him look good, he was able to absorb praise for the economic gains of his tenure.
Reagan also showed Clinton the importance of charisma. Dubbed "The Great Communicator," Reagan was an extremely gifted speaker. A former actor, he often came across as witty, knowledgeable and humane, even when he was lying through his teeth or didn't have a clue what he was talking about.
Rhetorical gems, such as his speech following the Challenger shuttle disaster, easily overcame gaffes like "Facts are stupid things." It is this charismatic ability that allowed Reagan to conquer his critics in the eyes of the public.
Clinton, a former law school professor, also learned the value of charismatic appeals. A talented orator, he was able to sound convincing regardless of what lies escaped his lips.
Blunders such as "I did not inhale" were swept aside by winning words like, "There is nothing wrong with America that cannot be cured by what is right with America." Clinton, like Reagan, became impervious to attack based on rhetorical strength alone.
Much of Clinton's controversial foreign policy had its roots in Reagan's foreign policy approach as well. Both presidents exhibited a remarkable short-sightedness in foreign policy vision that left a mess for future administrations to clean up.
The fiction of the Reagan administration is that the president defended liberty abroad and single-handedly brought down the evil Soviet Union. While Reagan often spoke about liberty, his foreign policy ran counter to it.
From arming Islamist terrorists in Afghanistan to backing Saddam Hussein in Iraq to supporting murderous Contras in Nicaragua and apartheid in South Africa, Reagan's foreign policy approach was nothing short of appalling. Even more spectacularly, he used tax dollars to subsidize a genocide in El Salvador then denied it (the infamous El Mozote massacre)!
Yet because his administration bullied the press, shrouded itself in secrecy and maintained a well-oiled propaganda machine, Reagan himself emerged with few scrapes and bruises.
Clinton, for his part, utilized an eerie sort of post-Reaganism in his foreign policy approach, simultaneously rejecting and espousing Reagan's "victory at all costs" ideology. Clinton's ill-fated foray into Somalia represents a rejection of Reaganism. In contrast to the brutal self-interest shown during Iran-Contra, the United States had nothing to gain by intervening in Somalia (save for repairing its reputation).
However, a lack of insight, planning and dedication on Clinton's part doomed this "humanitarian" military action.
More traditional Reaganist tactics came into play with Clinton's handling of the war in Kosovo. As was the case with Reagan, he made a conscious effort to shut out the media.
And, just as Reagan attempted to paint the Contras as good guys (he once drew an inexplicable connection between them and our Founding Fathers), Clinton engaged in some heinous spin when he tried to turn fewer than 3,000 deaths into a 100,000-casualty genocide.
Because Clinton utilized both Reaganist and anti-Reaganist approaches, he achieved few foreign policy gains. Whereas Reagan was merely malicious, Clinton was both malicious and inept, sealing his legacy as one of the most ineffective commanders-in-chief in recent history.
The final area in which Reagan set a precedent for Clinton to follow was damage control. Throughout his presidency (and perhaps even beforehand), Reagan was rocked by rumors of scandal.
In addition to the passé Iran-Contra affair, Reagan took heat for the looting of the Department of Housing and Urban Development (HUD) and the appointment of James Watt. Among the more outrageous claims made were that he delayed the release of hostages in Iran for political gain (the October Surprise theory), had John Tower and John Heinz killed and raped Selene Walters in her home in 1952.
The Reagan administration's strategy for handling these claims was to use their viciousness as a point of attack against political opponents.
Reagan's designated spinners, Pat Buchanan and Larry Speakes, posited the idea that the very scandalous nature of these claims was only more evidence that the hostile media was out to get the president.
It was a tactic that was well received, as Reagan had a career average approval rating of 57 percent.
Even more so than Reagan, Clinton found himself embroiled in scandal. Whitewater and Monicagate were part of a long line of shenanigans that got Clinton in trouble.
Among the others were illegal campaign donations, last-minute pardons and yet another HUD scandal. Like Reagan, Clinton was a frequent target for conspiracy theorists. Claims surfaced that he had Ron Brown and Vincent Foster killed and raped Juanita Broaddrick.
Clinton, like Reagan, deflected these accusations by turning them against the accusers. The more Republicans accused Clinton of doing, the more "proof" it was that they were out to get him for purely partisan motives (hence, the vast rightwing conspiracy).
Also like Reagan, Clinton used this tactic to great success - he left office with one of the highest job approval ratings in recent memory.
Given that Clinton emulated Reagan in many ways, the Republican backlash against him is suspicious. As Matt Esposito alluded to last week, however, the way we interpret similar actions might be very different depending on the letter (D or R) that accompanies a person's name.
Moreover, it's possible that Republicans place a greater emphasis on character than their Democratic counterparts (though if this were true, how does one explain Tom DeLay or Newt Gingrich?).
Reagan led a mostly virtuous private life, serving in the military and excelling as a lifeguard before taking up acting. Clinton, in contrast, led a life that revolved around dalliances with women not named Hillary and shady business deals.
Thus, an argument could be made that if Clinton was "a good guy" like Reagan, he too would have been forgiven for his many trespasses.
Between the two of them, one thing is clear: we should exercise greater scrutiny in selecting our leaders. What we tolerate now, we cannot object to later.
Democrats have learned this lesson with the advent of the Wilsonesque Bush doctrine, and, one day, Republicans may find themselves faced with a Democratic president who acts a lot like a certain grammatically challenged Texan.
(04/13/05 12:00pm)
Some people believe that comedy and craft go hand in hand. There is a notion that comedy requires thought, timing and precision. I, on the other hand, am of the opinion that if you keep your eyes open long enough, something funny will simply come along.
This latter approach netted me a wealth of laughs as I stumbled across the Web site of The College of New Jersey Republicans. The site contains a list of Republican principles, supposedly held by the organization's members and the leaders they bow down to ... er ... admire.
Here's the punch line - the nation's Republican leadership has significantly violated nearly every one.
For instance, the site proudly tells us, "We believe that government operates most effectively when it is closest to the people."
Yet anyone who has been keeping an eye on the Republican leadership lately knows this isn't true in practice. How is attempting to federalize the Schiavo case keeping the government close to the people? Did I miss something or was the Republican attempt to take gay marriage (a state issue) to the federal level a move away from the people rather than toward them?
The site goes on to explain that Republicans "believe in accountability, flexibility and local control for public schools."
Of course, if this were the case, the No Child Left Behind program would have never seen the light of day. The program weakens rather than strengthens local control by making schools responsible for meeting federal, rather than local, standards.
Libertarian Presidential candidate Michael Badnarik put it best when he said, "No matter how I read (the Constitution) - forward, backward, upside down or with my Captain Liberty Secret Decoder Ring - I can't find anything in it that empowers the federal government to be involved in education."
Perhaps the Republicans are using special glasses that enable them to read what the rest of us believe isn't there.
The most snicker-inducing claim the site has to offer is the idea that Republicans believe in "equal rights and opportunities for all."
Debunking this assertion would require a column all its own, so for the sake of expediency, I'll limit myself to one issue.
While many egalitarian Republicans do exist, it is clear that the neoconservative leadership does not believe in equal rights for those they suspect of having connections to terrorism.
This belief - exemplified in the Heritage Foundation's support for strengthening and expanding the USA PATRIOT Act - basically says that the only people who have anything to fear from the war on terror are terrorists, in which case they don't deserve the protection of the law to begin with.
Again for the sake of expediency, I shall refrain from pointing out all the people who have falsely been accused of being terrorists since this "war" began. I will, however, point out a glaring inconsistency between Republican rhetoric and Republican practice.
The rhetoric would have us believe all people are equal under the law. The same law that allows us to try criminals and punish them for their crimes affords them certain rights and protections, which cannot simply be waived because the crime in question is an act of terrorism (as opposed to a less onerous offense).
Furthermore, the law also operates under a presumption of innocence until guilt is proven. This means no one is a terrorist - and hence deserving of having his or her rights taken away - until he or she is convicted of an act of terrorism (after which point you can impose the death sentence with a smile for all I care).
"Equal rights and opportunities for all" means precisely that - it doesn't matter if John Q. Enron, upstanding citizen and churchgoer, is on trial for illegal campaign contributions or Mohammed al Hussein bin Mohammed is on trial for blowing up a building and killing 23 people.
The same rights and protections apply to both.
Comedy thrives in the wide gap that exists between practice and principle. The next time you want a good laugh, listen to a Republican try to quote Russell Kirk or Ronald Reagan with a straight face. Listen to Democrats attempt to evoke John F. Kennedy or Franklin Delano Roosevelt. Listen and ask yourself: can these people really be that oblivious?
(04/06/05 12:00pm)
Life is full of unexpected turns. For instance, I had considered writing about the Schiavo case this week, but it occurred to me that the Republican leadership's crass and hollow attempts to violate Florida law, threaten the separation of powers, conjure up phony medical "evidence," exploit the Schindlers' pain for their political gain and smear Michael Schiavo at all costs were so egregious and blatantly stomach-turning that they required no further elaboration on my part.
Instead, I find myself writing about a topic that has thus far been entirely ignored - the identity trap. Identity, be it race, gender, religion, sexual preference or political affiliation, has always been an ingrained part of our culture.
The rise of identity-specific works, however, seems to be a product of modernity. The Black Arts movement, the popularization of feminism and the rise of social constructivism (understanding traditional academic disciplines in a social context) all occurred within the past 50 years or so.
The explosion of identity-based approaches to the arts and sciences is a double-edged sword. On the one hand, the emergence of traditionally stifled perspectives provides us the opportunity to increase our understanding of the world around us.
Yet, at the same time, there is an inherent danger in that the "identity" element can easily overwhelm the work itself.
This latter phenomenon is certainly the case with rap music. Rap has legions of fans and supporters of all races and cultural backgrounds. No less a scholar than Cornel West has referred to it as "the last form of transcendence available to young black ghetto dwellers."
It has, in popular conception, become virtually synonymous with black urban culture, to the extent that a criticism of it is often interpreted as a criticism of the culture itself.
Because of this overwhelmingly strong identification, little attention is paid to rap as a musical, rather than a cultural, product. The genre's musical flaws - lack of originality, simplistic repetitiveness, poor vocal quality - are all but ignored. And yet rap artists are continually awarded Grammys and other accolades for their musical, rather than their cultural, contributions.
When Chuck D (of Public Enemy fame) visited the College two years ago, he conceded this point. "Rap is not music," he said. "It is vocalization on top of music."
With this in mind, there is no shame in hating rap (or, at the very least, current mainstream rap) simply because it isn't good music. This is not racist nor is it a reflection of racist musical standards. The same standards that denigrate rap celebrate the contributions of Jimi Hendrix, Robert Johnson and James Brown.
If rappers want to continue to be considered musicians, they should start making better music. Otherwise, they should brace themselves for the day when the world tires of its cultural fascination and realizes how artistically lacking the genre is.
The identity trap befalls not only artists, but critics as well. I can't count the number of CDs, books and movies that have been unjustly dismissed because the reviewer didn't agree with any message contained therein.
We are all entitled to our preferences, but anyone who goes about the serious study and evaluation of literature/music/film should look at it first as literature/music/film. Concerns regarding ideology and identity should be secondary.
To put it another way, can you imagine what the world would be like if no one read "Romeo and Juliet" because it is "sexually suggestive" or watched "The Godfather" because of its alleged "negative Italian stereotypes"?
To some extent, arguing over the merit of music and movies is fruitless because so much of it is purely subjective. The same cannot be said for the sciences, on the other hand.
Scientific disciplines are generally guided by rules and principles that are far more concrete than literary theories. In this light, the triumph of identity over quality and merit in the sciences is inexcusable.
More than a decade ago, Charles Murray and Richard Herrnstein wrote a book entitled "The Bell Curve." Due largely to the claims it made regarding race and intelligence, it caused quite a furor.
The rightwing punditry then went on to suggest the book was received with such hostility because liberal readers could not handle its scientific truth. In actuality, the inverse proved to be true - conservative pundits easily overlooked the book's scientific flaws because they identified with its conservative, politically incorrect message.
Thomas Sowell, himself an intellectual conservative, made the following criticism: "Perhaps the most intellectually troubling aspect of 'The Bell Curve' is the authors' uncritical approach to statistical correlations. One of the first things taught in introductory statistics is that correlation is not causation."
Identity-centered works are not, by their very nature, inferior or unworthy of being taken seriously. Ralph Ellison's "Invisible Man" was very much about the early Civil Rights Era black experience (and was in that sense propelled by a black identity). It succeeds because of Ellison's skill as a writer. Radical leftwing populism drove Rage Against the Machine's recordings, yet Tom Morello's guitar prowess ensured the band's musical credibility.
It is only when identity overwhelms craft that the identity trap claims its victims.
(03/30/05 12:00pm)
About the only thing more insufferable than a tyrant is a well-intentioned fool. It pains me to see people who have noble goals (and the skills needed to reach them) shoot themselves in the feet by relying on faulty mechanics.
This pain has been felt very acutely these past four years for obvious reasons, but I digress. The well-intentioned fool has become a virtual mascot for do-gooders on the anti-capitalist left.
Anti-capitalists, be they social democrats, Naderite consumer advocates or unabashed Marxists, have made the demise of capitalism a goal in their quest for social justice. In doing so, they have forsaken capitalism's vast promise as an agent of social change and embraced dangerous alternate ideologies that contradict their stated purpose.
I've read enough leftist literature - from Karl Marx to Stokely Carmichael to Michael Moore - to realize that their central complaint is far from untrue. People in capitalist countries are, or at least have been, getting screwed.
Where these authors err, however, is in their assumption that the problems associated with capitalist societies are symptomatic of capitalism itself and that anti-capitalist ideologies are necessarily preferable.
One claim that has been made by both black authors such as Carmichael and white authors like Noam Chomsky is that racism is either an inevitable product of capitalism or is inseparable from it in result.
To bolster their assertion, these critics have pointed to instances of racial discrimination in capitalist countries as well as to wars waged by "white America" against "brown-skinned peoples" across the world.
Alas, these atrocities are not the byproduct of capitalism but of an unjust legal system.
Truth be told, capitalism is ideologically incompatible with racism. Racist business practices, such as hiring discrimination and redlining, lie in direct contradiction to capitalism because they diminish competition.
Race is a non-factor in capitalism; the only color that matters is green.
Similarly, the allegation that capitalism invites oppressive governments (such as the Pinochet regime in Chile) can be falsified by examining basic capitalist principles. Real capitalism requires a free market. A free market demands that individuals make economic decisions without government coercion, manipulation or control.
Furthermore, a capitalist society is one in which the government respects property rights and intervenes as little as possible.
Some of the worst economic aspects of the Pinochet regime, including its hand in money laundering and financially supporting private sector monopolies at the expense of the working poor, represent anti-capitalist (corporatist, think Enron) rather than capitalist thinking.
It is interesting to note that while capitalist societies bar oppressive government policies, anti-capitalist societies often encourage them.
In order to "even the playing field," socialist governments have enacted "land reforms" (a euphemism for forcibly seizing and redistributing property), nationalization of industries (by which a government forcibly places assets in state control) and other forms of blatant economic terrorism.
Even in mixed economies, government oppression is propagated in the form of punitive tax brackets needed to sustain a cumbersome public infrastructure (in other words, tax the rich and save Amtrak).
The oppressive nature of taxation and regulation is often overlooked by social justice advocates, however, because they have no sympathy for the victims (many of whom are wealthy). And yet conventional notions of justice say a wrong is a wrong regardless of who commits it and who it is committed against.
Lastly, anti-capitalists have advanced the claim that capitalism is dehumanizing. Activists such as Moore have pointed to sizeable divisions between rich and poor, low wages and a diminished standard of living as being endemic to the American capitalist system.
He is at least partially correct. Capitalism does create class divisions and the need for rich and poor. What Moore does not mention, however, is that these classes are not fixed.
One can become rich and one can become poor. Being born into wealth might give a person an edge, but it is an edge he will soon lose if he lacks the skills to sustain it.
Conversely, as Stephen King and Bill Clinton have demonstrated, being born into poverty does not preclude the possibility of escaping it.
Ironically, while Marxists criticize capitalism for its class distinctions, Marxist societies have proven to be among the most classist in history.
Whereas capitalism at least provides the possibility for transition, Marxist class barriers tend to be far more rigid. There are the bureaucratic elites and their agents and then there is everyone else.
Just because Fidel Castro (to use a popular example) dresses like an average Cuban doesn't change the fact he has many times the wealth (net worth of $550 million according to Forbes magazine) and influence of an average Cuban.
Capitalism is hardly without its faults, but it is just inasmuch as it allows every individual the opportunity to pursue his or her economic destiny.
It does not, however, guarantee equality of result, a concept that is fundamentally unjust to begin with (see the article on Fred Feldman in last week's Signal).
Because they have falsely associated various faults and injustices with capitalism, the well-intentioned fools on the anti-capitalist left have long regarded it with cynicism and suspicion while at the same time holding an overly optimistic and idealized view of its alternatives.
Suppose, for an instant, that the two were flipped and social justice advocates worked toward Galt's Gulch - the golden meritocracy of Ayn Rand's "Atlas Shrugged" - instead of Marx's equally unrealistic classless society. If the energies spent tearing down capitalism were diverted into building it up, many of its flaws could be worked out or overcome.
Capitalism is not the enemy, but rather a device that may be used to free men and women from racism, prejudice and government abuse. If anything, we need more of it in this country, not less. The sooner the well-intentioned fools realize that, the better off we'll all be.
(03/23/05 12:00pm)
It's been over a month since famed playwright Arthur Miller passed away and his eye for social commentary is already missed. A witch hunt, not unlike the one portrayed in "The Crucible," is taking place under our very noses.
Back in December, Alabama state Rep. Gerald Allen (R-Tuscaloosa) introduced a bill that would make it a crime to use state funds to purchase any book that has "positive depictions of homosexuality." His prescription for classics such as "Cat on a Hot Tin Roof" and "The Color Purple"? Dig a hole and dump them. His rationale? "Values are under attack."
Allen is right, of course. Our values are under attack ... from people such as himself. Not long after Allen's censorship bid, the harridans of talk radio lashed out against Clint Eastwood for making a film that even so much as mentioned euthanasia in anything other than a one-dimensionally negative light (while talk hosts have a right to their opinions, Michael Medved, as a film critic, was duty-bound to examine the merits of the film, not the merits of any message contained therein).
In a nation that has always placed a premium on freedom and liberty, this kind of behavior should be alarming. Instead, it is becoming accepted as the norm.
It is no surprise that this increase in oppression closely corresponds to the political ascendancy of social conservatives. Social conservatives occupy positions of prominence in all three branches of government and have numerous operatives in grassroots organizations.
They have succeeded, in no small part, by conning supporters into believing they stand for "traditional values." A closer look, however, reveals that there is very little that is traditional or endearing about their methods and goals.
Offense or defense?
A common claim made by social conservatives is that their voices are not being heard. They point to a pervasive liberal bias in academia and insist they are seeking to merely level, rather than dominate, the playing field. Chanting the mantra of "academic freedom," they are able to rally sympathizers to their cause.
While it is true that a disproportionate number of college professors lean left, these claims are largely without merit. Following Allen's logic (if one can call it that), the bill he sponsored was designed to remove the pro-homosexual influence found in works such as "Angels in America."
But wait - the last I checked, I could go to a library and read the Bible or Ann Coulter's latest screed or "Mein Kampf." Or, I can elect to simply not read at all.
Right now, there exists a balance of ideas (even if the stewards of those ideas are biased). Works that cast homosexuality in both a positive and a negative light are available and we are free to experience either, neither or both.
What Allen wants is not balance, but monopoly. He isn't defending "traditional values" as much as he is attacking nontraditional ones.
It would appear then that in the warped mind of a social conservative, the two are one and the same. This "us or them" mentality, exemplified by Pat Buchanan's cry for a "culture war" in the mid-1990s, again has its basis in fantasy rather than reality.
Consider the following example. I don't drink. As a nondrinker, am I allowed to argue that the drinking of others "threatens" my ability not to drink? Am I - though probably in the minority - then allowed to attempt to ban drinking on those grounds? I should certainly hope not! I, like the social conservatives, would essentially be punishing others for my own insecurity.
Indeed, social conservatives would be better off if they looked toward their economic counterparts and thought of ideas and values in terms of a market atmosphere. That which cannot survive healthy competition doesn't deserve to survive at all.
Grasping at straws
It goes without saying that social conservatives seek a moral sanction for their agenda. By codifying personal moral convictions (such as those that govern sexual behavior), they believe they will be able to inject morality into law. This belief ignores the fact that law is neither moral nor immoral, but rather amoral.
Laws are designed to protect individuals and their property, not create "good" or "bad" people. Those who are of a righteous disposition do not need laws to tell them what they should or should not be doing.
Similarly, those who are of a malevolent disposition will not let laws get in their way. Therefore, the concept of law as an agent of moral change or authority is flawed to say the least.
As many social conservatives are also lawmakers, they are undoubtedly acquainted with this fallacy. In order to overcome it, their rhetoric often takes on an alarmist tone. "We can't allow X to happen," the emboldened social conservative declares. "Our entire civilization will crumble if it does!"
This is, at its core, a slippery slope fallacy. In opposing gay marriage, for example, social conservatives have argued that incest and polygamy will follow despite a lack of a logical connection between the phenomena. The former does not naturally result in the latter.
By attempting to paint an apocalyptic view of the consequences of abortion, gay rights, relaxed drug laws, legalized prostitution and other contentious issues, social conservatives are playing upon our deepest fears and paranoia.
The fact of the matter is we do not - and cannot - know with real certainty what will come of many of these actions in the long term because the long term has not happened yet. Any attempt to suggest otherwise is merely grasping at straws.
At odds with the past
What's ironic about social conservatives is that they aren't half as in sync with tradition as they believe themselves to be. Rugged individualism, not social cohesion, has been our greatest historical asset.
Many of our most renowned leaders engaged in behavior that would nowadays be seen as either morally unchaste or socially unacceptable.
From Benjamin Franklin's opium use to Thomas Jefferson's indiscretions with slave girls, it is evident that these men valued privacy and liberty. The effort to assault either in the name of tradition is an affront to their legacy.
Even more ironic is the extent to which social conservatives are out of touch with the history of their own movement. Barry Goldwater, one of the forebears of the modern conservative movement - a man who attracted socially conservative white voters en masse - was pro-choice and supported the right of gays to serve in the military.
The late Ronald Reagan, a god among conservatives who was renowned for his traditional values, had the following to say against marijuana prohibition: "If adults want to take such chances, that is their business."
How can it be that modern social conservatives are so out of touch with their idols? The answer is that Reagan and Goldwater were political conservatives first and social second. Political conservatives value small government. Social conservatives value an increased government role in promoting social cohesion. Therein lies the schism.
Borrowing from the enemy
For all the griping social conservatives do about progressives and leftists, there are remarkable similarities between them. Eastwood, a libertarian Republican, made light of this in response to his most recent criticism. "When you go far enough to the right you meet the same idiots coming around from the left," he said in a Time interview.
Whether they are cognizant of it or not, social conservatives have essentially embraced the leftist concept of political correctness and made it their own.
They have censored that which is critical of traditional Christian values with the same zeal progressives have demonstrated in blocking critiques of egalitarianism.
Furthermore, social conservatives have placed a cancerous burden upon the governmental system. Just as the American Civil Liberties Union (ACLU) has taken heat for launching frivolous lawsuits, the Parents Television Council (PTC) should be lambasted for making ridiculous complaints.
According to the FCC, 99.8 percent of the nearly 240,000 complaints the agency received in 2003 were courtesy of the PTC.
Perhaps it can best be said that social conservatives are to society what socialists are to the economy. They believe that more government regulation is better, individual rights should be sacrificed for the abstract notion of a greater good and anyone who disagrees can only be evil or stupid (if not both).
When stripped of their subterfuge, it is clear that what social conservatives are advocating is not preservation and equality, but a vicious power grab.
Think about it: in a liberal/libertarian framework, any one person can still be a social conservative. Individuals can abstain from sex, not have abortions, believe homosexuality is wrong, refrain from drug use, practice Christianity and salute the flag with pride. Their rights are preserved even if they disagree with the system that preserves them.
A socially conservative framework, however, would not extend the same freedoms to those who disagree and would, in true tyrannical fashion, punish them for daring to be divergent.
President Bush has often spoken of establishing Iraq as "a beacon of freedom." This is not going to happen if freedom is ignored on the home front.
We must acknowledge and respect our differences rather than try to force conformity, stand up to fear and intimidation and stop the gradual Talibanization of America before it becomes any more unbearable.
Footnote
For the purposes of this article, "social conservatives" are persons and groups who advocate a socially conservative political agenda. The piece is not meant to criticize individuals who merely happen to have socially conservative views or those views themselves.
(03/16/05 12:00pm)
As acerbic comedian Lewis Black famously noted, we are faced today with two dominant forces in politics: the "party of bad ideas" and the "party of no ideas." Competition between the two - especially with regard to the recent Social Security debate - has not brought out the best in each, but rather the worst. As a result, our nation's future is in peril.
Throughout the Bush administration, the Republican Party has proudly taken up the mantle as the party of bad ideas. As I noted in a previous column, the knock on Bush isn't so much that his initiatives are ill-conceived; it is that the mechanics behind those initiatives are disturbingly faulty.
Social Security reform is no exception to this downward trend of futility. It is true that Social Security will soon be facing a funding crisis. It is also true that, in the long run, the American people are probably better off managing their own money in private accounts.
These two truths, however, are no reason to jump for joy over the Bush proposal. As his opponents have noted, the short-term cost of transitioning to these accounts would be devastating.
Inasmuch as we are already saddled with enormous deficits, the timing could not have been worse.
Furthermore, there are some legitimate concerns about the long-term solvency of the Bush plan.
Given that our economy swings like a pendulum and our stock market has been known to nosedive, many Americans are predictably hesitant to let go of their precious Social Security safety blanket.
Theoretically, these gaping holes in the Bush policy should leave the Democrats in a good position to come through with a plan of their own. True to its status as the party of no ideas, however, the Democrats have been unwilling or unable to step up.
Their failure to take action is made even more alarming by historical precedent. Contrast the Social Security crisis with the welfare headaches of yesteryear.
Republicans had long been looking to axe the program, but centrist Democrats got out ahead of them. They made substantial cuts and changes without scrapping the program entirely and were able to take the bulk of the credit.
It was a risky move given the party's big-government, handout-friendly base, but the Democratic Leadership Council took a gamble and it paid off.
Unfortunately, Democrats seem unwilling to take such a gamble again.
Whereas Vice President Cheney has invited a bipartisan solution, Democratic leaders are refusing to even negotiate unless privatization is abandoned.
Sen. Ted Kennedy (D-Mass.) has made it clear no dialogue will take place unless the president backs off his threat to "kill" Social Security.
Perhaps this is an indication that the Democratic Party truly has drifted hopelessly leftward.
After all, a decade ago many of the privatization plans came from Democrats themselves. Sen. Bob Kerrey (D-Neb.) championed tax-deferred KidSave accounts, an idea which had broad bipartisan support.
Later in his second term, President Clinton emerged as an advocate for flawed-but-noble Universal Savings Accounts.
Even now, a select few Democrats - such as Gov. Ed Rendell (Penn.) - have broken with party ranks to endorse the Bush plan.
If Democrats had an ounce of intelligence, they would be endorsing it in droves. Agreeing to the idea in principle would bring them into the debate. The specifics could then be mulled over until a compromise is reached.
Furthermore, participation would do a lot to remove the "obstructionist" label with which Republicans have been quick to stick them.
Given that Republicans have the upper hand in Congress, it is not likely that the Bush plan will see much alteration. If passed, we will be stuck with it, costs and all. And yet, a flawed plan is still preferable to no plan at all.
Unless the Democrats come to their senses, one of two things is going to happen. Either the Republicans will get the plan passed over their objections and their minority status will be even further diminished, or they will succeed in blocking it, to the detriment of America.
The "me-first" partisanship and shortsighted buffoonery that Democrats have embraced as their credo might give entertainers like Black great material, but the only thing it is likely to give the rest of us is headaches 20 years from now when Social Security benefits run out.
(03/16/05 12:00pm)
Politics isn't a matter of making friends, but rather picking which enemies you can stomach the least.
Case in point: I've come to detest much of the modern radical feminist lobby. Many of its members claim victimization at every turn and constant carping about "the patriarchal hegemony" is often little more than convenient cover for gratuitous male-bashing.
Despite these repugnant characteristics, the feministas have a useful role to play. They keep the anti-abortion fanatics at bay. In the grand scheme of things, it is the latter group - not the former - that is the greatest threat to individual liberty and governmental restraint.
Perhaps the greatest fraud perpetrated by abortion foes is the claim that they act in defense of life.
According to the Centers for Disease Control and Prevention, there were 525 pregnancy-related deaths reported in 1999. This figure is likely to increase substantially if "pro-lifers" get their way and abortion is criminalized.
Think they give a damn about those lives lost? Think again.
Indeed, abortion opponents have not earned the right to be called "pro-life."
True pro-lifers, such as the Dalai Lama, actor Martin Sheen and columnist Nat Hentoff, don't merely oppose abortion, but euthanasia, war and capital punishment as well.
Inasmuch as certain abortion opponents have been known to favor one or more of these other elements, their "defense of life" argument can be deemed shallow at best, hypocritical and morally subjective at worst.
Just as opposition to abortion and pro-life are not synonymous, being pro-choice does not necessarily mean being pro-abortion.
There are many who feel that abortion - as a matter of personal opinion - is wrong, but who are nevertheless unwilling to sanction an invasive and Draconian government ban on the procedure.
Pioneering feminist/anarchist Emma Goldman, for one, characterized the high rate of abortion as "appalling," but she defended the right of women to have abortions nonetheless.
There is no contradiction here. Quite simply, two wrongs do not make a right.
To draw a parallel, there are plenty of people (such as the aforementioned Hentoff) who will defend free speech to the hilt, even if they disagree with what is being said, because they view censorship as a wrong in and of itself.
The same principle applies to abortion: pro-choicers may find it disagreeable, but no more disagreeable than legislation that aims to control a woman's body.
Abortion opponents either fail to recognize or deliberately downplay this important distinction. In their warped and demented view, there are only two sides: those who oppose abortion and are hence "pro-life" and those who don't oppose abortion and are therefore "anti-life."
By dumbing down a complex issue and erecting a false dichotomy, they are able to impose a dubious moral imperative that favors their cause.
As we shall soon see, however, the tactics employed by the anti-abortion lobby are anything but moral and semantic oversimplification is the least of their many transgressions.
And so it was written
In a desperate attempt to elevate the validity of their argument, abortion opponents would like us all to believe that God is on their side. Appealing to a higher power is, of course, a stock rhetorical fallacy. Nevertheless, for religious folk, the "God condemns abortion" argument is a weighty one.
Unfortunately, it is also a false conclusion ill supported by either scripture or subsequent teachings. The Bible lays forth several points at which life may begin, none of which are at the point of conception. Psalm 139:13 refers to life beginning in the womb, but it does not say that life begins with pregnancy itself.
Those who believe life begins at the point of viability can feel comfortable knowing the Bible is on their side, as a viable fetus is still "in the womb" as the Psalm says.
To complicate matters further, other parts of the Bible lay down different standards for personhood. In Genesis 2:23, Adam only becomes "a living being" when God breathes life into his nostrils. Therefore, any pre-birth organism that has not yet developed nostrils cannot be considered alive.
Another standard is laid down by Leviticus 17:11, which says, "the life of the flesh is in the blood." Ergo, until a circulatory system is developed, a being is not alive in the Judeo-Christian sense of the word.
Subsequent writings also helped establish an oft-ignored pro-choice position in religion. The Jewish belief has always been that life begins at birth and rabbinical texts have affirmed this.
Early Christian writers, such as Thomas Aquinas, St. Augustine and Pope Innocent III, believed that life began when a fetus became "animated." This view closely corresponds to the modern concept of viability and casts serious doubt on the notion that the pro-choice position is incompatible with faith.
Those who cling desperately to a "God forbids" argument are not only being dishonest, but they are also being dishonest while using the Lord's name for cover.
Back in the day
Another deceptive device employed by the anti-abortion lobby is the idea that opposition to abortion is the historically traditional position and pro-choice legislation represented a radical departure from it.
This ruse was pulled off so convincingly that I myself fell victim to it before I realized the weight of historical evidence simply does not bear it out.
Our legal system is based on the English Common Law that dates back hundreds of years. The Common Law held that an "unquickened" child was not a person and its destruction was not equivalent to murder.
When our Founding Fathers set up shop, they most likely agreed with this view, for they put nothing in the Constitution to the contrary.
Legalized abortion was in fact a part of our country's heritage until the late 19th century, at which point the greedy medical lobby finally succeeded in pressuring lawmakers to impose restrictions.
To their credit, some abortion opponents do realize they are flying in the face of history and tradition.
To justify their radicalism, they paint themselves as the ideological heirs of radicals of a previous era: the anti-slavery abolitionists. The attempt to equate abortion to slavery has been used many a time by anti-abortion advocates to demonize their opponents.
During the Illinois Senate race, Alan Keyes insisted Barack Obama held "the slave-holder's position" on abortion, proving that conservatives are every bit as adept at pandering and playing the race card as the liberals they despise.
Keyes' argument, however, ignores one of the key tenets of abolition - every person has a right to his or her own body. Slavery was wrong inasmuch as it deprived individuals of that right and placed control in the hands of the slave masters.
Likewise, the anti-abortion movement seeks to transfer control from the individual to the government.
If anything, Keyes is closer to holding the slave owner's position than is the man who defeated him.
The chicken and the egg
Without a doubt, the most aggravating aspect of abortion opponents is their ability to recognize the futility of their argument and still refuse to concede the point.
A popular pro-choice argument is that a pre-viability zygote/embryo/fetus is not a living, breathing person, much in the same way an egg is not a chicken.
Abortion opponents grasp this important distinction. They just prefer to pretend it does not exist.
Sometimes, however, they let their guard down. In the dissenting opinion of Planned Parenthood v. Casey, Chief Justice Rehnquist wrote, "Abortion involves the purposeful termination of potential life."
That he used "potential life" instead of "life" itself shows that he is cognizant of the difference.
Sen. Orrin Hatch (R-Utah), an ardent abortion opponent, furthers this point in his defense of stem cell research. "I just cannot equate a child living in the womb, with moving toes and fingers and a beating heart, with a frozen embryo sitting in a lab somewhere," he said.
If he really bought the old anti-abortion line about life beginning at conception, he would have no choice but to believe that that frozen embryo is a person, no different than you or I.
Hatch, however, sees a distinction. Despite this, he and others have stuck by their tired "abortion is murder" rhetoric even though legal, historical, biblical and scientific evidence says otherwise.
These are but a few weapons in the arsenal of lies, distortions, mistruths and half-truths used by abortion opponents to further the cause of criminalizing abortion.
The irony here is that those who oppose abortion and bemoan the loss of life should be working to keep it legal.
Criminalizing abortion will not stop it from happening, but it will cause women to seek crude and dangerous procedures.
In third world countries where abortions are criminalized, these procedures have resulted in the deaths of thousands of pregnant women per year, with high infant mortality rates to match.
At the heart of this matter is an ethical tug of war. Ethics requires us to make informed decisions that take into consideration the needs of all parties involved and the potential consequences of every action.
Abortion opponents detest ethics. Rather than subject their rhetoric to this kind of careful evaluation, they want the answer to the abortion question in every instance to be a universal, unflinching "no!"
Footnote
I'm hardly what one would call a zealot on this issue. I believe Roe v. Wade was wrongly decided and I support a number of state-level abortion restrictions. It took the overwhelming deceit and hypocrisy of the anti-abortion movement to prompt me to write this piece. If abortion opponents are offended by it, they have only themselves to blame.
(03/02/05 12:00pm)
"The first casualty when war comes is truth." These words were famously coined by Hiram Johnson, a progressive Republican, more than 80 years ago. They have since been restated as "the first casualty of war is truth" and used by Democrats, Socialists and all others who oppose combat. The distortion, although a minor one, only goes to prove Johnson's point. When at war, adherence to objective factuality goes right out the proverbial window.
To attempt to make a case for or against the war in Iraq at this point would be futile. Not only are most people's minds made up, but as the war is on its downswing, declarations of support or opposition would likely seem too much like Monday morning quarterbacking.
The recent Iraqi elections, however, ensure that it will remain a contentious issue for some time to come. Defenders of the war view the election as the triumph of democracy and a crowning achievement of American effort, while detractors see it as a prelude to the rise of radical Islamism in an already fractured country.
Either way, the carousel of lies and hyperbole readies itself to spin around again.
The myths of the anti-war lobby are easier to pick apart, as they are more transparently hollow than those of the pro-war brigade.
Let's start by asking a few questions of the peaceniks and protestors. If this is a war for oil, why are gas prices so high? If the United States acted "unilaterally," why did more than 30 nations join us in sending troops? If this is a war against Islam, why do we have Islamic allies and why did we appoint an Islamic interim prime minister?
Of course, not all anti-war rhetoric is as pathetically easy to dismiss. One of the more incisive points made by the anti-war crowd is that the Bush administration has acted inconsistently in invading Iraq to assuage concerns about brutal dictators and weapons of mass destruction (WMD) while failing to invade North Korea or Uzbekistan for similar reasons.
Indeed, this brings to light a problematic application of the Bush Doctrine, but it is not a case against war in and of itself. Saddam Hussein was still a tyrant, even if he wasn't the only tyrant.
Many of these points have been posited by the pro-war crowd in a snickering, self-righteous fashion that overlooks the fact that war supporters have been no more truthful or forthright in their conduct.
I shall overlook the obvious - nonexistent WMD and phony al-Qaeda connections - in favor of some of the more subversive deceptions.
First and foremost, the pro-war crowd would have us believe that all of its mistakes were honest ones and that the faulty intelligence the Bush administration relied on was the same faulty intelligence everyone else relied on.
This is an overly naive assumption in that it ignores the fact that Bush had regular access to former CIA director George Tenet and the National Security Council whereas those who weren't in the upper echelons of government did not.
In lieu of insider information, members of Congress were given the National Intelligence Estimate, a lengthy, confusing and often contradictory report.
To pretend that they were every bit as informed as the president is disingenuous at best.
Next, war supporters have attempted to discredit their opponents by claiming that the war has been opposed for entirely political motives. While it is easy to think of a partisan hack like Ted Kennedy criticizing the war to score points, the fact remains that this is not a left-right issue.
Prominent war supporters include Democratic presidential candidates Dick Gephardt and Joseph Lieberman and leftist pundit Christopher Hitchens.
The war's opponents include Rep. Ron Paul (R-Tex.), archconservative commentator Pat Buchanan and neoconservative political economist Francis Fukuyama. Other Republicans, such as Sens. McCain, Hagel and Lugar and former National Security Advisor Brent Scowcroft, have been highly critical of the war effort to date.
Finally, the pro-war lobby has been deceptive in its portrayal of the war in Iraq as a boon to the war on terror. CIA director Porter Goss recently admitted the Iraq conflict, "has become a cause for extremists. ... Those jihadists who survive will leave Iraq experienced and focused on acts of urban terrorism." Of course, when war opponents made the same points months ago, they were denounced as appeasers.
The sad truth in all of this is the extent to which each side demands accountability from the other while refusing to admit its own culpability in spreading lies and propaganda.
For me, this war has been horribly bungled - perhaps unforgivably so - by the Bush administration, but it is still justified by whatever good (a quasi-democratic, Saddam-free Iraq) can come out of it. I hold no delusions about it being a matter of ultimate good or ultimate evil. The sooner the myths of war are abandoned in favor of truths, the sooner we will be able to gauge our success or failure.
(02/23/05 12:00pm)
We are a society that loves apostasy (the act of leaving one's faith or party). When Denzel Washington stopped playing do-gooders and took up the mantle of corrupt cop Alonzo Harris in "Training Day," he won an Oscar.
When Georgia Democrat Zell Miller, a one-time client of Paul Begala and James Carville, spoke at the Republican National Convention in support of President Bush, conservatives cheered. And when former Republican strategist Kevin Phillips wrote a book blasting the Bush administration, Democrats couldn't have been happier. Yet beneath all this cheerleading there is - or at least should be - healthy skepticism that selling out isn't all it's cracked up to be.
A commonly held misconception is that an apostate, by way of an enlightening or life-altering experience, moves from a "bad" ideology to a "good" one. A more accurate assessment is that apostates move from bad ideologies to ideologies that, while radically different, might be every bit as bad as the ones they abandoned.
Let us not forget that Benito Mussolini was a former Communist who developed his model for Fascism out of a rejection of Marxism.
There is also considerable evidence that apostasy exists solely for the sake of self-gratification. A man who reaps the benefits of youthful indiscretion (namely, marathon feats of sexual activity and alcoholic imbibing) and then denounces hedonism after "finding God" is able to have the best of both worlds. He gets to have his fun and still appear holy by preventing others from having theirs.
Apostasy often entails an evasion of personal responsibility. Rather than merely saying "I was wrong," an apostate is likely to say his entire former movement was wrong and, as such, he was corrupted by it.
This allows the apostate to dodge accountability on his part while simultaneously painting a large group of people with a broad and often inaccurate brush.
Lastly, apostasy overlooks the fact that it takes a deft combination of naiveté and poor judgment to embrace a "wrong" movement in the first place. Given that many apostates are former members of extremist factions, we should be wary of their present and future affiliations.
Alas, the shortcomings of apostasy are far more than theoretical. St. Augustine is a prime example, having moved from a position of indulgence to one of extreme intolerance.
So great was his desire to purge himself of his lustful adolescence that he wound up denouncing all sexual pleasure, advocating chastity over even intramarital sex and laying the groundwork for clerical celibacy in the Catholic Church.
Upon converting to Christianity, he felt the need to repudiate his previous non-Christian beliefs and did so by attacking Judaism (a number of his writings are deeply anti-Semitic).
The effects of Augustine's overreaching are felt to this day, as the celibacy doctrine is a contributing factor to the abuse of young boys by Catholic priests and some Jews and Christians still regard one another with marked suspicion.
Apostasy also manifests itself in neoconservatism. Irving Kristol, the movement's founding figure and a proud former Trotskyite, defines a neoconservative as "a liberal who got mugged by reality."
Indeed, many neoconservative operatives, from Richard Perle to Jeane Kirkpatrick, are former leftists.
In Kristol's case, using his experiences in the Fourth International to attack Democrats is a facetious proposition, akin to using criticisms of David Duke or Joseph McCarthy to brand all Republicans.
Furthermore, going from extreme naiveté in foreign policy matters to extreme aggression isn't waking up - it's merely going to sleep on the other side of the bed.
The paleoconservative author Justin Raimondo believes that neocons aren't apostates at all. He speculates that they are sticking to the same Trotskyite game plan, substituting the establishment of global, pro-Western puppet "democracies" for global socialist uprisings.
Whether he is correct or not is a point of contention, but there is no mistaking the clear - to say nothing of ironic - link between neoconservatives and the radical left.
The case of the two Davids illustrates that old habits held by apostates sometimes die hard.
As one of the founding figures of the New Left, David Horowitz edited Ramparts, an early '60s antiwar magazine that made ridiculous claims and advocated conspiracy theories (most of which can be boiled down to paranoia regarding the CIA).
After renouncing his Marxist leanings and embracing conservatism, Horowitz took up writing for NewsMax, a right-wing Web site that makes ridiculous claims and advocates conspiracy theories (most of which can be boiled down to paranoia regarding Bill Clinton).
In the early 1990s, David Brock became famous for penning hatchet jobs on prominent liberals, including Anita Hill and Hillary Clinton. Though he later recanted and joined the leftist media, he never abandoned his sleazy tactics.
The only difference is his targets are now prominent conservatives, including Horowitz among others.
The central flaw of apostasy is its tendency to throw the baby out along with the bathwater.
More times than not, an apostate rejects his former creed wholesale, without bothering to separate the good from the bad.
Those who make more gradual, well-reasoned transformations and/or avoid making "wrong" affiliations in the first place are less likely to become duped by ideological snake oil salesmen.
When I made the switch from progressivism to libertarianism, I kept the useful parts (respect for personal freedoms), while jettisoning the weaknesses (reliance on ineffective bureaucracy to solve problems).
Then again, if more people made these kinds of sound, rational decisions, Michael Savage and Arianna Huffington would be unemployed, Tom DeLay and George W. Bush would be bar hopping in Texas and a generation of angry ex-hippies would still be medicating itself with false hope and peyote.
What a world that would be.
(02/16/05 12:00pm)
We have a saying around these parts: if you are going to come late, don't come at all. After months of standing behind a Republican-controlled Congress that has given us record deficits, President Bush has finally expressed his desire to cut spending. While this is a step in the right direction, it can only be described as too little, too late.
Bush's budget proposal for fiscal year 2006 tops out at a whopping $2.57 trillion, with a deficit projection of $390 billion. Neither of these estimates takes into account a Social Security overhaul or spending in Iraq and Afghanistan.
All told, the amount of taxpayer money the federal government will spend is enough to make even Bill Gates cry.
Despite this, the proposal has come under fire, not for being too costly but for not being costly enough. It features deep cuts in environmental protection, agriculture and housing and urban development.
While one might expect that Democrats would be peeved by cuts to these big government programs, key Republicans have shown their true colors by griping as well. Sen. Saxby Chambliss (R-Ga.) is miffed at cuts to farm subsidies and House leaders have indicated they will work up a proposal of their own (translation: they aim to spend more). So much for "the party of fiscal responsibility."
The flaws of the Bush budget extend far beyond how much is spent and how much is cut. Where the money goes or doesn't go is an issue of some contention as well.
Fiscal experts as diverse as New York Times columnist Paul Krugman and the Cato Institute's Stephen Slivinski are in agreement that the domestic spending cuts make up a relatively small portion of the total budget. Small as they may be, those cuts are likely to affect people who rely on agencies whose budgets are slashed.
Still, it can be argued that these cuts are necessary to reduce waste and fight pork barrel spending. But if that is the case, this philosophy should be spread across the governmental spectrum.
Spending on defense, homeland security and foreign aid is expected to increase substantially, with few objections to how the money is being spent.
It isn't difficult to see what is going on here - Bush is attempting to replace bloated, ineffective social welfare with bloated, ineffective military welfare that is far more expensive than its domestic counterpart.
Rather than making a serious effort to rein in spending, he is sticking a Band-Aid on the problem while letting the wound fester beneath it.
Asserting true fiscal responsibility requires looking beyond pet programs and political ideology and addressing cold, hard costs. While he is hardly alone in his inadequacy, this has been something Bush has been loath to do.
His faith-based initiatives, for instance, eat up more than $1.17 billion. In lieu of curing poverty, they transform religious institutions into pressure groups that ask for more federal spending. Subsidizing religion is not a legitimate function of our government nor should it become one.
Ending the war on drugs would also contribute greatly to deficit relief. Like Lyndon Johnson's War on Poverty, the War on Drugs has been a costly failure. Upwards of $11.2 billion a year is spent on what is essentially a health issue. Decriminalization of controlled substances would eliminate the costs of investigating, incarcerating and treating drug users.
Of course, it would bring about inane grumbling about "moral decay" from the Religious Right, but since when is their seal of approval worth $11 billion?
Cutting defense spending, while necessary, presents a greater challenge in the sense that no one wants to leave soldiers unprotected in times of war.
A lot of money earmarked for defense, however, does not go to protecting our soldiers. It goes to building shipyards for ships the Department of Defense never even requested - a half-a-billion-dollar expenditure, courtesy of Sen. Trent Lott (R-Miss.) and his buddies in the shipping industry. It goes to building a multibillion-dollar National Missile Defense system that many scientists are convinced won't even work.
It goes to fulfilling handshake deals with shady contractors and capitalizing on American fears to turn "security" into a cash cow.
Scaling back farm subsidies, one of the budget proposal's few strengths, is likewise an unpopular measure.
Inevitably, it beckons the image of a poor farmer struggling to make ends meet. Poor farmers, however, are not the ones who benefit most from farm subsidies.
Instead, subsidies are granted to large agribusiness firms that help keep poor farmers poor and consumer prices high. The government should stop doling out millions to these pig-raising pigs and force them to compete on the open market.
These cuts could save billions of dollars a year and put us on the path to a balanced budget. But because Republicans lack the courage to stand up to self-serving moralists and deep-pocketed lobbyists, they are unlikely to see the light of day.
Whereas Democrats are at least open in their support of big government, Republicans will protest it while simultaneously doing everything they can to enable it. As the Bush budget proposal goes to show, the statist, spendaholic G.O.P. is as soft as the cotton it subsidizes.
(02/16/05 12:00pm)
During a conversation a few months back, a friend sent me a link to a Washington Times editorial in an effort to prove me wrong with "solid facts."
I was alarmed by this and not just because the Washington Times is a patently dishonest, ideologically driven excuse for a publication.
No, it alarmed me that my friend could not distinguish between news and propaganda.
Sadly, this probably speaks to a larger trend in America. Depending on our own ideas and preferences, we are likely to accept or reject the information we receive with little or no critical evaluation.
The result is an ignorant, uninformed populace prone to making stupid decisions (elections being a prime example).
Fortunately, the path to knowledge is not blocked by fallen trees or impassable boulders, but only by the laziness of the individual who lacks the motivation to know.
The first step down this path is to consider where information comes from. Be on the lookout for conflicts of interest and be aware that many organizations will toot their own horns.
If Citizens for a Quail-Free America gives you some disturbing "facts" about quails, you would do best to see what others have to say on the matter.
Alas, even reliable sources do not always provide reliable content. To overcome this potential pitfall, one should always be on the lookout for certain ominous words or phrases.
A good reporter will be specific. He or she will name names, share numbers and give you everything you need to know.
A propagandist, on the other hand, will be vague.
You'll often hear a propagandist tell you "sources say" that "many reasons" exist for such and such.
At this point, you should be asking yourself, "Which sources?" and "What reasons?"
Often, something as simple as the choice of one word can change how we feel about a story.
A good reporter will opt for neutral terminology. When one side of a conflict sees a group as "terrorists" and another side sees the group as "freedom fighters," it is the reporter's responsibility not to show a preference for either side.
A propagandist, on the other hand, has no such qualms and will opt for whatever language best suits his cause.
Presumably, everyone reading this has taken some kind of rhetoric or academic writing class (but, if you are like me, you don't remember it) and was taught a list of logical and rhetorical fallacies.
If you committed these to memory, you are ahead of the curve.
Propagandists love to make use of the slippery slope ("If we allow this to happen, it will lead to crime/death/war/the apocalypse"), the ad hominem attack ("Ed Masposito says all liberals are gay") and the confusion of correlation with causation ("Because a lot of black people are in prison, the justice system must be racist").
When all else fails, they will invoke Orwell, the Bible, Nazism, Communism, McCarthyism or any other potentially debilitating -ism that has stained human history.
To be certain, propaganda is not without its usefulness (if it were, this entire section would be defunct and I would be out of a very low-paying job).
Propagandists have the ability to amuse, inspire and enrage. They give strong voices to oft-ignored issues and allow every yokel to have his day.
Propagandists, however, should not be mistaken for journalists. The purpose of journalism is to inform.
A good journalist doesn't want to make you his friend or his enemy. He doesn't care how you feel about what he tells you; he just wants you to know about it.
A propagandist, on the other hand, aims to persuade. "The facts" are a boon when they further the cause of persuasion and a meddlesome barrier when they undercut it.
If you have been paying attention, you'll realize by now that this is a piece of propaganda about propaganda, as I am persuading all of you to inform yourselves.
And if you don't, the Nazis will come for your ass.
(02/09/05 12:00pm)
It's been a longstanding tradition of mine to write something nasty about Valentine's Day every February. I've targeted its excessive commercialism, its bastardization of history (it was once a religious celebration after all) and even conventional notions of romantic love itself. This year, I'm going to try something different. I am going to be nice.
Valentine's Day is laudable, if for no other reason than it brings out the cynics in many of us. We single folk especially tire of the pervasive - not to mention sappy - sentimentality and the endless giving of candy, cards and flowers.
We snicker as lovestruck fools empty their pockets and we salivate at the amount of money companies such as Hallmark and Russell Stover must make. In short, we (the single, the jilted, the lonely) really put a damper on things and that alone is cause for celebration.
As a year-round cynic, it troubles me that the virtues of cynicism are often overlooked. Its destructive aspects are harped on incessantly, but little is said about its ability to inspire, create and promote.
The truth is that we owe much of our understanding of the world to "negative thinkers," such as Sigmund Freud and Arthur Schopenhauer. Even when ultimately disproved, their ideas challenged conventional wisdom and opened the door for further inquiry and exploration.
Perhaps the biggest virtue of cynicism is its ability to overcome disappointment.
As cynics have little to no faith in humanity, they expect things to constantly go wrong. When this does happen, they are not devastated by it because their expectations were so low from the start. In the event something goes right for a change, the cynic is pleasantly surprised.
Cynicism also promotes determination and good work habits. Whereas optimists expect to succeed by default, negative thinkers expect not to succeed. As a result, they must labor twice as hard to overcome those expectations and achieve their goals. This ensures that cynics always give their best possible effort and never become complacent.
Lastly, cynicism allows for greater knowledge and personal freedom. Because cynics are doubtful of human sincerity, they tend to examine ideas and institutions with a higher degree of skepticism than most. This enables them to see things that more trusting members of society usually miss.
Furthermore, inasmuch as cynics are doubters and naysayers, they are virtually immune from the pitfalls of peer pressure and groupthink. If cynics were running the country, most of the bad policies of the past century would have never come into being because said ideas would have been dismissed beforehand.
Harkening back to Valentine's Day, even those who doubt its sincerity and tire of its faux rosy specter may find themselves hesitant to denounce it. After all, it is supposed to be a joyous occasion, even when there is no real joy to be felt.
Those who are hesitant may find all the courage they need by simply embracing their inner cynics. Denounce, decry and defame. It is your right to find fault with anything you wish and let no one tell you otherwise.
(02/02/05 12:00pm)
As an English-oriented person, I become annoyed when words are used improperly. I feel like smacking anyone who says "more unique" or "better then you" - it's "than," people!
Similarly, I'm dismayed by the gargantuan number of misnomers that are put into play in politics. John Kerry has been called "the most liberal member of the Senate."
Talk show gadfly Dennis Prager has made a career out of misrepresenting leftists in his "Are You a Liberal?" column.
Many of President Bush's critics are quick to label him "a right-wing extremist." The one thing they all seem to have in common is an ignorance of the terminology that they espouse.
Beneath all the exaggerations and straw man assessments, political labels have very concrete meanings.
Stripped of all its connotations, a conservative is simply one who seeks to protect the status quo. Conservatives defend existing traditions, values, institutions and norms and oppose change that threatens them.
In contrast to this, progressives (often mistakenly referred to as liberals) seek change and constant improvement. This change may come at the expense of existing traditions, values, institutions and norms.
At its ideological core, the battle between right and left is nothing more than a conflict between old and new.
Liberalism adds another dimension to the conflict. A liberal, as the name implies, is one who values individual liberty.
Liberals do not view either change or traditions as ends in and of themselves, but rather the means of meeting the aims of personal and economic freedom.
Liberalism is opposed by authoritarianism, which exists in both rightist (Fascism) and leftist (Communism) incarnations. In this sense, true liberalism (aka classical liberalism or libertarianism) falls outside the left-right spectrum.
Another point often missed by quick-to-label commentators is that these definitions are relative to place and time. We like to think of Republicans as being rightist and Democrats as being leftist, but this is true only in America.
Jacques Chirac, a conservative by French standards, has advocated pro-social economic policies that would put him to the left of many Democrats.
Similarly, while Hamid Karzai is on the left side of Afghan politics, his pro-corporate, faith-based views place him on the American right.
Perhaps the most egregious fraud perpetrated by the ignorant are the various attempts by members of one side to take credit for that side's historical achievements. This assumes that political labels are continuous, that an 1860s progressive shares the same views and ideas as a 1960s progressive.
One needn't take more than a cursory glance at American history to realize this is a false assumption.
For example, at the time of the American Revolution, obedience to the British Crown was the status quo and colonial independence represented a radical change. Our Founding Fathers were the progressives of their day and the Loyalists were that era's conservatives.
At the time of the Civil War, progressives sought the abolition of slavery whereas conservatives aimed to preserve the plantation system and the Southern way of life.
Modern conservatives, however, are proud to have African Americans such as Condoleezza Rice as their leaders - something that would have distressed their ideological predecessors to no end.
Similarly, modern progressives generally seem to favor high taxation of the wealthy, an idea that would have alarmed the likes of James Madison. Ideological links between the conservatives and progressives of different eras are virtually nonexistent.
With this in mind, it is easy to see how faulty a lot of today's political labeling really is.
Given the true meaning of the word, John Kerry can hardly be called a liberal, let alone "the most liberal member of the Senate."
Kerry, who voted to overhaul welfare and supports the death penalty for terrorists, can't even be considered the most progressive member of the Senate (a dubious distinction which would more likely befall Barbara Boxer or Hillary Clinton).
Those who would dub Bush a "right-wing extremist" would also do well to rethink their choice of words. Bush's "compassionate conservatism" is a buzzword for increased government spending, which flies in the face of fiscal conservatism.
Correspondingly, Bush's policy of spreading democracy abroad borrows from Woodrow Wilson's progressive idealism, while his use of an international coalition to fight terrorism is a paraphrase of John F. Kennedy's Alliance for Progress against Communism.
Not only is Bush not an extremist; by strictly conservative standards, he's also a colossal disappointment.
I raise these points regarding the misuse of labels because the practice not only offends my semantic sensibility, but my political sensibility as well.
We should all endeavor to define ourselves and not let ourselves be defined by our most ardent opponents.
Conservatives should not be content to let themselves be called racists or religious nuts nor should progressives allow themselves to be pigeonholed as Communists or anti-Americans.
Lastly, I long for the day when I will be able to say that I am a liberal (in the classical rather than the modern sense) and have people think Thomas Jefferson and not Walter freakin' Mondale.
(01/26/05 12:00pm)
In the underrated film "Cop Land," Sheriff Freddy Heflin is reluctant to turn his benefactor, crooked cop Ray Donlan, over to Moe Tilden of Internal Affairs. Heflin eventually has a crisis of conscience and changes his mind, but by that point the case against Donlan has disintegrated. "I gave you a chance to be a cop," Tilden admonishes, "and you blew it!" Were Congressional Republicans to come to their senses about Rep. Tom DeLay (R-Texas), I would offer a strikingly similar objurgation.
DeLay, the boisterous House Majority Leader nicknamed "The Hammer," has been a magnet for controversy for quite some time now.
Known for his extreme partisanship and hawkish views, he has already been reprimanded several times by the House Committee on Standards of Official Conduct for offenses ranging from the attempted bribery of a colleague to accepting illegal donations on behalf of an energy company. He also has prominent ties to Texans for a Republican Majority, a group whose members have been indicted on charges of money laundering and accepting illegal campaign contributions.
All he seems to be missing is a PayPal account for the convenient collection of bribes and a tattoo that reads "Property of the Oil Industry."
While this behavior, in and of itself, is appalling, it is hardly extraordinary. Politics is full of bad seeds (many of whom have names like Clinton and Bush and Kennedy) and far too much weight is given to non-issues these days.
However, when applied to the broader spectrum of American politics, the tactics employed by DeLay and his cohorts point to a disturbing trend - the erosion of standards. DeLay is the embodiment of the current GOP ethos that says "it's only bad when a Democrat does it."
This kind of slimy approach is unacceptable for any party, let alone one that prides itself on values.
Despite a lack of condemnation from those on the right, one would have to be blind to miss DeLay's overt duplicity. For example, he - a non-veteran who vociferously supports military intervention abroad - has seen fit to question the patriotism of those who served their country in uniform.
Say what you want about John Kerry's politics, but when his country needed him in Vietnam, he at least showed up. By many accounts, at roughly the same point in time - a time in which we were fighting a war and needed all the help we could get - DeLay was partying like a frat boy (his nickname, prior to becoming an Evangelical, was Hot Tub Tommy).
DeLay and others in the Republican leadership have also been sharply critical of recently departed Sen. Tom Daschle (D-S.D.), citing his objection to Bush administration goals and initiatives as the telltale mark of "an obstructionist."
Yet it was conservative Republicans who attempted to block intelligence reform, a bipartisan measure with White House support. Similarly, when President Bush attempted to extend his tax cuts to the lower class, DeLay blocked the move, stating, "it ain't gonna happen." If these shenanigans don't make DeLay and his associates "obstructionists," I don't know what would.
Another salvo fired by DeLay and Co. against Democrats was that they would put the interests of the United Nations before the interests of the United States. This is an apt point ... until one realizes that DeLay has essentially replaced "Israel" with "the United Nations" in the equation. A staunch Christian Zionist, DeLay has shown a willingness to put Israel's best interests at the forefront of all his important decision making.
Given that Israel is an independent nation with its own interests and priorities this is a questionable move at best (and I say this as an admirer and defender of Israel in my own right, but also as someone who favors national sovereignty and is unwilling to pretend the Lavon Affair never happened).
DeLay's pernicious hypocrisy has even spread upward to the Senate, where conservative Republicans bitched and moaned ad nauseam about Democrats who held up judicial or cabinet appointments of qualified candidates for political reasons.
Yet many of the same complainers attempted to stall the confirmation of Arlen Specter (R-Pa.) as chair of the Judiciary Committee. Specter is not only qualified (he worked with the Warren Commission), but he also earned his conservative stripes during his advocacy of Clarence Thomas's Supreme Court nomination. The attempt by pro-life firebrands to block his selection after chewing Democrats out for similar reasons reeks of two-faced cowardice.
What is perhaps most puzzling about the ongoing DeLay saga is why the majority of Republicans have continued to bend over backwards for him.
Were I a card-carrying member of the GOP, I would be embarrassed to have DeLay as my majority leader. His effectiveness at maintaining party discipline and whipping votes notwithstanding, his unethical behavior and loud-mouthed grandstanding (he once infamously declared, "I am the federal government!") are downright embarrassing at times.
However, rather than reprimand him, the Republican leadership has covered for him time and time again. Whether they are busy bringing charges against DeLay's accuser or attempting to alter ethics rules that would enable him to stay in power, Republican higher-ups have shown repeatedly that they lack the intestinal fortitude to do the right thing when it isn't politically expedient.
Instead, they prefer to allege that the charges against DeLay are "politically motivated." That's funny - I don't recall political motivation being a concern when Republicans were bringing ethics charges against Democrats Jim Wright and Barney Frank.
The argument holds even less water when one considers that Judicial Watch, a conservative watchdog group, is suing DeLay.
Sadly, this fiasco-in-progress represents more than just DeLay's failure to conduct himself properly; it is a failure on all fronts.
It is the failure of Speaker of the House Dennis Hastert to adequately rein him in and assert proper decorum. It is the failure of Bush to establish himself as the head of the party and not lose ground to DeLay and others on the far right.
It is also the moral failure of any conservative who decried improper conduct in the past and yet remains silent now.
Interestingly enough, this represents a major failure for Democrats as well. While DeLay is very much a demagogue, the strategy of throwing everything but the kitchen sink at him and hoping it sticks is wholly inappropriate. This calls their motives into question and makes DeLay look as if he is being martyred.
Furthermore, how can Democrats expect their ethical complaints to be taken seriously when they continue to count criminals such as Corrine Brown (D-Fla.) among their active ranks?
By the end of "Cop Land," the sheriff attempts to produce a witness who will take Donlan and his entire crew down. When Donlan gets in his way, Heflin shoots him.
Republican leaders needn't go to such extremes in dealing with their majority leader, but the message to them is loud and clear: there should be no further delay in ousting DeLay.
(12/08/04 12:00pm)
"We're not monsters. We've just made bad choices." These are the words of Robin Easterling, a former inmate at Edna Mahan Correctional Facility for Women in Clinton. Easterling, along with fellow inmates Melvina McClain and Midge DeLuca, is the subject of "Freedom Road," a documentary produced by Lorna Johnson, assistant professor of communications studies at the College.
Nearly 100 people filled the Don Evans Black Box Theater at 8 p.m. Friday for a screening of the film. "Freedom Road" chronicles the progress of the "Woman is the Word" program. Led by Michele Tarter, associate professor of English, "Woman is the Word" is a memoir-writing workshop taught annually in the maximum-security wing of the prison. Tarter and two of her students help inmates heal by writing and sharing their life experiences.
The program, which Tarter started in 1998 while in Illinois, begins by introducing inmates to women's autobiographies. Many of the women found themselves inspired by Harriet Jacobs' "Incidents in the Life of a Slave Girl." Juda Bennett, guest speaker and associate professor of English, raised parallels between the hardships faced by Jacobs and those in the women's own lives.
Next, the women were encouraged to confront their pain and write about what they had been through.
In the film, DeLuca, a former teacher convicted of vehicular homicide, expressed regret that she would never be able to teach again.
"I was undergoing chemo (for breast cancer) when it happened," she said.
McClain spoke openly about the abuse she had to endure throughout her life. "A relationship is prison," she said.
All of the women involved pointed to the program as a positive experience and one that changed their lives.
"A load has been lifted," McClain said. "I'm able to help and encourage others."
"Education is the one thing that has been proven to reduce recidivism," Tarter said in an interview featured in the film.
Elaine Easterling, Robin's mother, agreed. She observed that those who are better educated are better able to help themselves in the world.
"Freedom Road" concluded with DeLuca's release from prison - riding along the path that gives the film its title - and Robin Easterling's vow to continue her education and live a better life. Tarter later revealed that Easterling earned her associate's degree as well as her release from prison and is currently living in a halfway house.
After the screening, a panel discussion was held featuring Tarter, Johnson, John Krimmel, chair of Criminology and Justice Studies, and alumnae Crystal Walker and Monique Hankerson, who participated in the program with Tarter. Jessica Gill, vice president of Sigma Tau Delta (STD), the English Honor Society, and organizer of the event, moderated the discussion.
Krimmel, who helped Johnson receive permission to film in the prison, spoke of the sociological effects of children being raised by their grandmothers while their mothers were incarcerated. During the film, an inmate expresses her reluctance for her son to see her behind bars.
Johnson discussed the making of the film from a technical standpoint.
"This is a lot more linear than my usual style," she said. She described "Freedom Road" as being very different from anything she had filmed before.
Walker and Hankerson described the ways the Woman is the Word program impacted them.
"It made me more aware of my privilege," Walker said.
"A lot of these women came from the same kinds of places I did," a teary-eyed Hankerson said. She described not being able to eat after a visit to the prison and encountering some of the women on the outside.
Tarter and Bennett spoke of some of the challenges they faced in teaching the class inside the prison.
"These women were not used to being praised," Bennett said.
"A lot of them are very passionate," Tarter said, describing a fight that erupted in the classroom over religion.
She added that while prison officials were very accommodating, she was often wary of leading things in a direction that would offend the warden.
"They wouldn't let us show the film inside the prison," Tarter said.
Tarter also expressed her enthusiasm about continuing the Woman is the Word program into the future. She already has two students lined up to assist her with an eight- to 10-week session in Spring 2005.
"It takes a community to build a program," she said, pointing to the support she has received at the College.
The screening of "Freedom Road" was co-sponsored by STD, Women in Learning and Leadership (W.I.L.L.), Krimmel and the Department of English.
(12/08/04 12:00pm)
The month of December is now in full effect and with it comes a series of virtual certainties. For example, it is a good bet that a number of people will be getting smashed during or after finals week in an effort to forget about all those grueling hours of studying and test-taking. Some will also spend New Year's Eve in a haze (be it literal or figurative) while pressing against warm bodies at a party or a club. And then there are others, such as myself, who will simply have none of it.
When it comes to hedonism - that is, copious amounts of sex, drugs and booze and other such debauchery - my attitude can be summed up in two words: no thanks.
Given some of the ideological stances I hold (I've argued for legalizing prostitution and against traditional notions of love), this might come as a real shocker. Why would someone who values freedom not partake in its many benefits?
In order to answer this, it is important to realize that I am not slamming hedonism from a traditionalist standpoint. Traditionalists, such as my colleagues McCaffery and Esposito, usually mean well, but offer arguments against hedonistic behavior that fall hopelessly flat.
The traditionalist argument often errs by presupposing nonexistent moral authority. "You shouldn't be doing this" comes off sounding obnoxiously paternalistic more often than not.
If we are all consenting adults, then not one of us gets to make the determination as to what the others should and should not be doing.
Traditionalist arguments also confuse paranoia and possibility with probability and fact. Can promiscuous sex lead to developing a sexually transmitted disease? Yes. Does it happen in all, or even most, cases? Don't bet on it.
I prefer to examine and, subsequently, criticize hedonism in a different light.
Consider, for example, that hedonism is inherently instinctual. "Do what feels right" is its driving principle and gratification of impulse is its goal.
Now consider this: we are all college students. We are here, presumably, to learn and apply, to think and evaluate and, most importantly, to get ahead in the world. If our instincts would guide us down the right path regardless, none of this would be necessary. And yet, it is.
Conclusion: doing what feels right won't necessarily yield a result that is satisfactory and short-term gratification can easily translate to long-term angst.
My critique of hedonism also rests with the concept of personal choice and accountability. As far as I am concerned, any person who is of age has the right to do as he or she pleases with his or her body, provided two conditions are met.
First, that person must take responsibility for his or her actions.
Second, those actions cannot negate the rights of others.
Hedonism is far too often treated as legitimized escapism. How many times has "I was drunk" been used to explain a mishap or snafu and how many times have a series of heads nodded in understanding?
Bullshit! "I was drunk" is not a valid excuse inasmuch as the drinker makes the choice to drink in the first place. As such, he or she is still responsible for whatever comes of that choice.
Furthermore, allow me to be blunt (pun intended): getting wasted or stoned often results in behavior that is loud, annoying and infinitely idiotic. The likelihood that this kind of behavior will disturb others is very high and the minute that it does become disruptive, the right to engage in it is null and void.
Or, to put it another way: any asshole whose drunken chatter annoys me should take it elsewhere lest, for the sake of reciprocity, I blast some Metallica at high volume while said asshole is trying to sleep.
Having choice requires us to be conscientious, even if it sometimes kills the fun.
Next, I am going to posit (to a chorus of boos and hisses, no doubt) that hooking up at random can indeed be disastrous. I believe the act itself is harmless, but the context in which that act is understood (or, more likely, misunderstood) can have potentially devastating repercussions.
All human relationships, be they friendships, business partnerships or romantic couplings, are inherently transactional. Something is given and something is received.
In order for a relationship to be functional, that which is given should approximately equal that which is received.
The value of sex is very hard to measure inasmuch as it might mean different things to different people. Thus, the safest bet is merely to exchange sex for itself.
The only kind of hook-up that can function properly is one in which sexual gratification is the goal of all parties involved.
Unfortunately, this isn't the case with all hook-ups. Sex is sometimes used to reward attention or flattery. It may also be used as a means of inspiring jealousy or elevating social status.
In some cases, the physical act is even presented as "proof" of an emotional attitude.
Because this violates the exchange principle, it is dreadfully problematic. That which is given may not equal that which is received in the eyes of both participants.
As such, someone can very easily believe he or she has been used or manipulated and discord ensues. Someone simply looking to get laid could wind up with far more (or less) than he or she bargained for.
Ergo, people should either have a very clear understanding of what they are getting themselves into ... or not get involved at all.
Finally, I am going to close this piece with an admission. I lead a fairly boring, mundane life. I don't go on great adventures or drive from town to town in a van solving mysteries.
It is probably good that I realize this. In realizing it, I also realize that downing shots of vodka, toking up or throwing myself at a miniskirt-wearing (preferably blonde) vixen isn't going to make my life any more spectacular.
These escapades might help me forget how boring it is for a short while, but eventually I will return to reality and I'll probably have a headache or a hangover or some lingering doubt or regret when I do.
And all of that static, quite simply, is something I can do without.
(12/01/04 12:00pm)
Beware the pending storm. The nomination of Alberto Gonzales for U.S. Attorney General is likely to produce a whole heap of controversy. Gonzales' detractors will point to his somewhat shaky civil liberties record whereas his supporters will claim that he is a defender of constitutional rights.
In either case, the motives are purely political. Neither liberals nor conservatives respect the sanctity of rights. Instead, they merely prefer to use rights talk as a convenient cover for the cause of the hour.
In a way, the politicization of rights represents education in reverse. As schoolchildren, we are taught the Bill of Rights and we come to understand that it is something sacred. Somewhere between then and adulthood, that understanding is lost.
The First Amendment is a classic point of contention. Liberals will dutifully defend the rights of communists and shock artists to express their views, but seem to recoil when confronted with hate speech or the mere mention of religion.
The same provision that allows for individuals to call the president a racist, for example, allows a racist (such as David Duke) to run for president.
Furthermore, separation of church and state does not circumvent the fact that religious expression is constitutionally protected. As long as it's entirely voluntary, there is absolutely no reason why children can't pray in public schools if they so desire.
Conservatives fare no better with regard to free speech claims. They will rightly denounce political correctness as an unjust abridgement of free speech, only to turn around and become censors themselves when Howard Stern, Eminem or Michael Moore says something that isn't to their liking.
The same right that allows Michael Savage to spew inflammatory lies over the airwaves allows Janeane Garofalo or Al Franken to strike back.
Additionally, the door of political correctness happens to swing both ways. If conservatives truly wish to abolish it, then they will have no recourse against me calling them Bible-thumping Jesus freaks and giving a big "fuck you" to anyone who doesn't like it.
Sadly, the rest of the Bill of Rights fares no better. Chanting the mantra of "gun control," liberals have begun to wage war against the Second Amendment.
Legitimate gun control means keeping guns out of the hands of children and convicted criminals (i.e. those without a guaranteed right to bear arms). What passes for gun control nowadays seems to mean keeping guns out of the hands of adults as well.
As a law-abiding citizen, I have the right to arm myself as I see fit. Denying me that right because I might use a weapon to commit a violent act is essentially punishing me for a crime I have not committed.
Besides, if I really were inclined to shoot someone to death, all of Sen. Lautenberg's gun control legislation wouldn't be able to stop me from going into Newark and getting a gun illegally.
Substitute "law enforcement" for "gun control" and the Fourth Amendment for the Second, and conservatives seem to have the exact same problem. They seem overly enthusiastic about tap-dancing around the protections against searches and seizures in an effort to lock up more criminals, arguing that drug dealers/murderers/random bad guys don't have a right to be secure against warrantless searches. They then have the nerve to refer to this as "law enforcement."
First, until a person is convicted of a crime, he or she is not a criminal but a citizen, and a citizen does have that right. Secondly, law enforcement refers to upholding and enforcing the law, not breaking it.
Since the Fourth Amendment is the law, attempts by the authorities to flout it are inexcusable. Undoubtedly, policing is a tough job, but I refuse to believe that it can't be done without violating the Constitution.
Perhaps even more jarring than the disregard liberals and conservatives show to actual rights is their tendency to advocate fictitious ones.
For liberals, rights seem to arise from a state of perpetual victimization. In their attempt to correct historic wrongs and promote a nonexistent equity, liberals have taken to advancing privileges and demanding they be called rights.
Gay marriage exemplifies this rationale perfectly. Same-sex couples across the country are routinely (and unjustly) screwed out of benefits shared by their heterosexual counterparts. As such, liberals have tried to address the issue by insisting gays have a right to marry.
As I mentioned in a previous column, marriage is not a right but a privilege. Any two consenting adults have the right to live as a married couple, but marriage requires government recognition of that union and the granting of benefits in accordance with it.
The government (via the will of the people) defines marriage as being between a man and a woman. Ergo, attempts to declare gay marriage as a right are hopelessly facile.
Liberals also labor under the delusion that there is an explicit constitutional guarantee to privacy. While it can be argued that such a guarantee should exist (and I believe it should) or that the Founding Fathers intended for it to exist, the fact of the matter is that it does not. Any law that emanates from this invisible protection - including the Roe v. Wade decision - is thereby inherently flawed.
As a trained lawyer, Sen. Kerry should have known better than to claim during the presidential debates that abortion was a constitutionally protected right.
The fabrication of rights does not end there. Kerry has also claimed that health care is a right. I wonder if he'd mind telling me which part of the Constitution specifies this guarantee.
By the time a final tally is taken, a second Constitution might be needed just to keep track of all the rights liberals have invented for themselves.
Needless to say, conservatives are no less likely to make up rights as they go along. Their strategy is an uneasy blend of tradition, populism and religious scruples.
They far too often mistake personal (or even popular) preference as some kind of legal guarantee and seem outraged that these preferences aren't treated as legitimate law.
This is wholly evident in their continued advocacy for nonsensical "rights of the unborn." Just as liberals will try to change the definition of "marriage" to meet their ends, conservatives will try to change the definition of "person" to meet theirs.
A two-celled zygote with no consciousness is not a person any more than a mere seed is a 250-foot sequoia.
Ergo, the zygote is not eligible to receive the same rights as a person (rights that become available to that person upon birth).
Any attempts to ban abortion at a federal level are especially malodorous considering that the issue is beyond the constitutional purview.
Conservatives have also attempted to lay claim to so-called "family rights" and "community rights" to purge society of that which they deem offensive.
This back-door prelude to fascism has been used by groups like the Moral Majority, Empower America and the Parents Music Resource Center to attack the entertainment industry, pornography and recreational drug use.
Not only are such "rights" extra-constitutional, but they also ignore the fact that both families and communities are composed of individuals.
As an individual, it is up to me to determine what I find offensive. If I wish to indulge in porn (which I don't) and watch violent, profane movies (which I do), such is my right. If another person finds that objectionable, that person needn't have any part of it. However, that person has no right - constitutional or otherwise - to block or impede the manufacture of such material or prevent me from receiving it.
The philosopher John Stuart Mill once wrote, "That the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. Over himself, over his own body and mind, the individual is sovereign."
This view became the precursor for the concept of negative rights, a concept that was embraced by our Founding Fathers. In other words, whatever the government doesn't tell you that you can't do, you can do and whatever the Constitution doesn't tell the government it can do, it can't do.
As simple as it sounds, liberals and conservatives have bungled, manipulated and utterly savaged this principle beyond repair. They attempt to turn bona fide "cans" into "can'ts" and create new "cans" from thin air.
Throughout all the confusion and obfuscation and legal maneuvering, one thing has become clear: I have the right to be outraged about this and so do you.