Tuesday, April 30, 2013

Classic Albums: Traffic, John Barleycorn Must Die


Ever since I acquired a turntable a few years back, I have been exposed to music I'd overlooked in the past.  With used records sometimes costing just a couple of bucks, I've paid for some pretty fantastic wax for less than the price of a cappuccino, and developed new musical loyalties to boot.  One such discovery has been the band Traffic, which for years I'd only thought of as that band Steve Winwood was in before he hit the big time as a solo artist in the 1980s ("Roll With It," "Valerie," "Higher Love," etc.).

My discovery of said band in many ways constitutes the best argument in favor of the continuing existence of record stores as opposed to our ephemeral digital music world.  I was in the Princeton Record Exchange, just checking the place out, when I noticed that the record being played by the store clerks was really, really cool.  I discovered it was a Traffic live album, and my interest piqued, I started delving into the band's oeuvre.

Very quickly John Barleycorn Must Die wormed its way into heavy rotation in my house, even if my favorite Traffic song was the title track to another record, The Low Spark of High Heeled Boys.  I loved Barleycorn because it is a record that defies categorization and jumps over all kinds of genre boundaries.  It's a deceptively simple album, with only six tracks and a cover designed to make it look like a burlap sack.  You expect something folky, and the first song turns out to be "Glad," an up-tempo jazzy rocker driven by piano and saxophone, the guitars practically in hiding.  It's an instrumental track, something very rare for a rock band to put on record, much less as the opening song.  The next track, "Freedom Rider," has lyrics, but retains the same loose, improvisational feel.  This time around, there's a flute in addition to the saxophone, giving the song a more whimsical vibe.

The following track, "Empty Pages," isn't as purely jazzy, but it's got a real funky swing to it.  It also might illustrate why Traffic's music hasn't had the staying power it deserves.  This record came out in 1970, at a time when mainstream rock music was either heading in a hard, heavy direction (Led Zeppelin, Black Sabbath, Humble Pie), into the prog rock stratosphere (Pink Floyd, Yes), or toward a folksy, rootsy vibe (The Eagles, Grateful Dead, Neil Young).  Perhaps Traffic's records aren't sought after or treasured because the band's combination of jazz improvisation, psychedelia, and funky R&B ventured too far off the paths being forged at the time.

After starting the listener off with some killer, jazz-inspired improvisation on side one, the record shifts gears on side two with a more conventional, easy-rolling rocker, "Stranger to Himself."  The driving piano is still at the heart of it, though, and the song still allows space for the musicians to stretch out and jam a little.

The following song, the title track, takes things in a much folksier direction with an actual folk song.  The song itself is a beautiful, elegant mesh of acoustic guitar picking and haunting flute.  Though you might think it's about murder, the whole thing is an elaborate metaphor for the harvest, when all the barley will be cut down by the scythe.  The song strikes an appropriately mournful tone, since we, like stalks of barley, too will be cut down in our time.

The album ends with "Every Mother's Son," which melds some high, lonesome singing with psychedelic guitar and hippy-dippy lyrics.  It's a pretty, majestic end to a quietly stunning record, and one that more people ought to have in their collections.

Sunday, April 28, 2013

Nostalgia for Living the Low Life

Now that I am married with kids and have a job teaching impressionable teenagers, I have largely abandoned my rough and rowdy ways.  It's been a while since I have been up late enough to see the sun.  I can't remember the last time that I woke up feeling like a troll used my head for an ashtray.  Hell, I can't even remember the last time I woke up after 7:30!

Back in my old days, especially when I lived in Michigan, I often found myself alone at diners and bars having conversations with complete strangers.  They were my surrogate family after I left my old grad school clan behind to take a job in the Wolverine State, and before I met a couple of great friends there.  However, these friends had wives and fiancées, which meant many Friday and Saturday nights all by my lonesome.  Around this same time I first picked up a Charles Bukowski novel (Factotum, to be exact) and fell in love with the idea of being a permanent, lifelong bohemian bachelor.  I also happened to enter a major Tom Waits phase, so I interpreted my late night conversations with a bartender who looked like Bob Seger's long lost cousin as a kind of romantic rebellion.  That romance included eating twice a week at a place that specialized in coney dogs, where I was always sure to hear funny jokes told at the counter by weary working-class philosophers.

In the midst of my Bukowski-Waits phase, I started to romance my now-wife, even though she was out in New Jersey.  This meant I no longer searched for love in all of the wrong places that my Rust Belt city had to offer, but there were still plenty of lonely nights to be drowned in whiskey and beer.  Those nights all came flooding back to me two weeks ago, when I was running some errands and my wife was nice enough to let me take a detour to a craft beer bar in the area.  Lo and behold, there was a range of interesting characters with their feet on the rail and stories to be told.  I heard tales of decadent clubs in Newark in the 80s and commiserated over how bad the Devils have sucked this season.

As much fun as it was to remember my old life, it felt good to leave after an hour and come home to my family.  It's easy to romanticize shooting the breeze over a pail of suds with articulate strangers, less so coming home to an empty apartment and the need to pass out lest the pains of solitude intrude.  I miss living the low life sometimes, but it can't beat the love of a wonderful woman or the wide-eyed looks I get from my daughters when I walk into the room.  Whenever I get nostalgic for my rough and rowdy days, I just remind myself what going to bed is like now, compared to the feelings of dread and loneliness that used to grip my soul back then.

Friday, April 26, 2013

Track of the Week: The Fall, "How I Wrote Elastic Man"



Around this time two years ago, I came clandestinely from East Texas to the Big Apple to interview for my current job.  My life had become completely intolerable, so much so that I was totally willing to take the plunge and vacate my once cherished place on the tenure track at a university in order to teach at a private high school.  I was 1,500 miles away, stuck in a wretched, isolated town and working in a department that contained some of the worst bullying assholes I've ever had the displeasure to know.  Each day I came to work afraid of the unpleasant surprises that might be waiting for me.

April and May of 2011 also happened to coincide with my rediscovery of The Fall, a band I really hadn't listened to much since the 1990s.  I was listening to their music on a near constant basis, and blasted some of their louder songs on my way to work in the morning as a way to steel my nerves.  After all, I might have to see the full professor who would literally turn his back on me when I entered the room (because I was friends with someone he disliked), talk to the medusa assistant prof who was machinating to get one of my friends (who was not tenure track) fired because her competence threatened the mediocre medusa's position, or be forced to make conversation with a chair who had twisted the arms of colleagues to reveal that I was unhappy in my job, and who then ambushed me in my office, sputtering with rage, when he heard that I was displeased with having to teach a majority of classes outside of my field.  I also may well have had to see the guy who ratted me out, and who had also sent an email to several of my colleagues mocking me and my wife in an insulting fashion.

The Fall's controlled chaos seemed fit for what I was going through.  Lead singer and impresario Mark E. Smith's sneering, poetically profane putdowns gave me strength, and put me in the right frame of mind for dealing with the absurd.  "How I Wrote Elastic Man" is especially interesting, since Smith sings it from the point of view of a popular novelist who is losing inspiration.  I had lost inspiration of a different sort, and could gleefully sing along to lines like "fuck it, let the beard grow."  The song surges forward with a kind of crazy intensity, which well reflected my fever-pitch anxiety over leaving academia without knowing where I'd end up.

That intense fever broke in early May, when I received the job offer.  I spent a handful of precious days walking around campus with the knowledge that I was about to fly the coop, and barely able to contain that fact.  Since graduation was at the end of the week, I did not want to go public just yet, lest I have to deal with some highly awkward situations at the graduation ceremony.  That morning I had a Bloody Mary with my breakfast, then hopped in my car wearing my ill-fitting doctoral robes for the last time, blasting "How I Wrote Elastic Man" out of the window, feeling freer than I had in years.

Thursday, April 25, 2013

A Short History of the Bush Administration

Today George W. Bush's presidential library was opened, and there has been a predictable attempt by his allies to whitewash his record and forget what a colossal, unmitigated disaster he was as president.  I will not sit idly by and let this Orwellian storm of obfuscation and bullshit rage on without putting up a fight.  What follows is something I wrote on my old blog shortly after Bush left office, and ought to remind us of all the things that Bush and his minions would like to have us forget.

*****
On Tuesday George W. Bush finally left the White House, going away to Texas on his helicopter after getting a stern rebuke from President Obama in his inauguration speech and a real Bronx cheer from the assembled masses. Almost better than him leaving was the fact that he had to sit there while his successor lambasted the Bush administration's policies in front of adoring millions that ate up his every word.

So what of the Bush administration? The man himself has been fond of saying that "history will judge" his actions. Historically speaking, only the most reckless and lawless leaders ever say such a thing, usually to justify actions acknowledged to be beyond the pale in the present. Now that his administration is in the past, let's start judging now. I think his administration falls into five distinct periods, each with its own outrages.

Part I: Before 9/11
Bush came into the White House under a cloud of suspicion after getting the presidency awarded to him by the Supreme Court after losing the popular vote. In the eight months between his inauguration and the terror attacks, he did little to instill confidence in the American people. Instead of acknowledging the meager nature of his mandate, Bush immediately set about pushing a hard-Right agenda. After having inherited a massive budget surplus, he squandered it via tax cuts that went mostly to the wealthy. (That money might have been handy later to pay for his wars and rebuilding New Orleans.) The public wasn't actually clamoring for tax cuts at that time, and by the summer of 2001 his popularity had fallen. Apart from fiscal irresponsibility, his Attorney General, the Puritanical John Ashcroft, expended valuable resources fighting head shops and naked breasts on statues rather than the terrorists who a memo had told our president were "determined to strike in the US." Although he tried hard to distance himself from the man in every way, on September 10, 2001 Bush looked like he would be just like his pappy in at least one respect: he would be a one-termer.

Part II: 9/11 to November 2002
The terror attacks saved Bush's political life, despite his less than gallant immediate response to them. The public understandably rallied around its leaders in the aftermath of 9/11, and Bush exploited the situation masterfully. The American public was scared shitless, and Bush soon learned how to manipulate a public too high strung and paranoid to dig beneath the facade. It also must be said that Bush's common touch with the workers at Ground Zero actually made him look like a competent leader. (I was never fooled, but I will admit that he exuded genuine empathy and spirit at a tough time.) Underneath the surface, an almost unreported war on civil liberties began, with thousands of foreigners detained without just cause. Then, right before the election, the war drums on Iraq began to beat, allowing Republicans to catch the Dems flatfooted and portray them as dangerously unfit to prosecute what was already being called "The Global War on Terror." That narrative led to the most execrable political ad since Willie Horton, the "Osama" ad targeting Vietnam vet and Democratic Senator Max Cleland. All of this distracted the public from the fact that Bush failed to hunt down bin Laden, despite his overblown rhetoric.

Part III: November 2002 to Katrina
This period was the reign of the "war president," a time of collective national insanity and the most damaging of Bush's term in office. The Congressional elections upheld Bush's mandate, and as the push for war in Iraq got more intense, the mainstream press got even more compliant. Things got so crazy that those who publicly dissented, like the Dixie Chicks, got broken on the wheel. (In retrospect, because of their respective opposition and support for the war, this is also the moment where Barack Obama started his rise to the presidency, and where Hillary Clinton lost it. Whoever said virtue never gets rewarded over opportunism?) Every cable news station without exception had an American flag flying in the corner of the screen after the Iraq invasion, and just about every single news outlet failed to ask tough questions on WMD. In fact, the supposedly liberal New York Times' Judith Miller became one of the biggest purveyors of propaganda. In the face of this onslaught his political opposition ran scared, even though it was obvious to yours truly and many others at the time that an invasion of Iraq would create more problems than it would solve, even if Saddam had WMD, which was sketchy in the first place.

While the American military did a splendid job defeating the Iraqi army, it was apparent from the start that the architects of the war had no clue what they were doing in the long term. (Those who did, like Eric Shinseki, got kicked to the curb for daring to challenge Rumsfeld's crew.) The massive looting did incalculable damage, and the response that "freedom is untidy" showed blithe indifference in the face of a total lack of authority on the streets of Baghdad. That's not "freedom," that's anarchy, which was just as harmful as the tyranny it replaced, if not more so. In the short term, however, Bush staged the pièce de résistance of his administration, the moment that future students of history will snicker at with gusto: his landing on the aircraft carrier in front of the "Mission Accomplished" banner. Ironically, while he was never higher in prestige in real time than at that moment, in the nation's historical memory, he will never be lower.

The reality of the war soon erased his moment of triumph. Iraq devolved into a deadly guerrilla war that killed and wounded more and more Americans. The UN, which the US expected to handle the civilian postwar situation, had to leave after a horrific terror bombing. Bombings soon became a depressing daily occurrence, and the WMD did not materialize. Saddam was captured, but his trial soon became messy and a showcase of the new Iraqi government's dysfunction. To top things off, in 2004 the photos of Abu Ghraib hit the international scene like a bolt of lightning. In the last election and even today Shrub's dwindling band of dead enders like to point to the Surge and the Anbar Awakening, but these fixes came only after years of pigheaded denial of reality and wrong-footed incompetence that cost thousands of lives and soiled our nation's reputation. Many of our soldiers died because they were not given the proper protection, and yet when questioned about why this was the case by one of those very soldiers who was putting his life on the line, Rummy told him "you go to war with the army you have."

Yet, despite the deterioration in Iraq, Bush managed to win re-election. John Kerry's ineffectual performance had something to do with this, as well as the fact that casualties in Iraq really shot up after the Samarra attacks in early 2006, well after the election. Bush certainly reminded all and sundry that he was a "war president," winning moderate voters afraid of switching leaders at a crucial juncture. While picking off some of these voters, Karl Rove nakedly appealed to the lowest instincts of the religious right by making gay marriage the second-biggest issue of the campaign. Not only had Bush led America into a bloody, needless war, his surrogates spread scurrilous lies about John Kerry's military record and shamelessly exploited homophobic bigotry. Bush somehow managed to make himself seem above it all, though he benefited from the smears and hatred deployed in his name. (Let's also not forget that the GOP did a passable job of making their 2004 convention look diverse despite the party delegates being practically all white, and that Bush's moderation of immigration rhetoric also gained Latino voters.)

The media again came to the rescue, in the form of Dan Rather's maladroit investigation of Bush's own less-than-sterling military record. We seem to have let this moment go down the memory hole, but Rather had the chance to expose what everyone and their pet goat knows to be true: Bush had been placed in the Texas Air National Guard due to his father's connections, and while in the Guard had not properly fulfilled his duties. The contrast between the chicken-hawk "war president" and his decorated opponent could not have been more stark. However, Rather did his job sloppily, and instead he, not the president, was disgraced. This didn't change the fact that the charges against Bush were basically correct, but the whole episode made him out to be a victim and protected what should have been an Achilles' heel. At that point, his electoral victory was a sad inevitability.

Part IV: Katrina to Rumsfeld's Firing
Bush had already started to slip early in his second term, misinterpreting his close win against an inept challenger as the acquisition of "political capital." Because of this he heedlessly pushed Social Security privatization, something that no American outside of the Heritage Foundation and Cato Institute actually wanted to see. The whole Terri Schiavo drama (another forgotten moment) also made his heavy ties to the Bible thumping crowd look dangerous. At this point Bush was headed back to his political situation in the summer of 2001: an amiable buffoon whom the public did not revile, yet certainly did not support with any enthusiasm.

Then came Katrina. His ineffectual response while a great American city drowned and thousands of American citizens were left to fend for themselves in horrific circumstances disgusted just about anyone with an ounce of humanity in their hearts. The revelations about "Brownie" made it clear to a great many people how badly the president ran the government. This may even have begun a wider discrediting of the Reaganite anti-government neoliberal philosophy espoused by the Bush administration.

In the face of such a failure, Bush upped the ante by attempting to have Harriet Miers, his personal attorney, made into a Supreme Court justice. The Left and the middle attacked him for his crass cronyism, and the Right recoiled at Miers' lack of credentials on abortion. At this crucial juncture many of his natural allies began to realize that Bush was political kryptonite, and started distancing themselves as fast as they could before the 2006 election. Part of that distancing and lack of party discipline meant a hard tack to the Right on immigration by Congressional Republicans. This, by the way, eliminated one of the few bright spots in Bush's tenure: an honest attempt at a sensible immigration policy. (On top of all of this, the Libby scandal made public a culture of arrogance and law breaking, and the Jack Abramoff trial unearthed oodles of Republican corruption.)

The public didn't take the bait, and the GOP got slammed in the 2006 election, despite uninspiring leadership from Reid, Pelosi, and company. After that election Bush made his first real bow to the public anger at his administration: he forced Donald Rumsfeld to resign.

Part V: The 2006 Election to Obamageddon
From that point onward Bush settled into a long, torturous two-year lame duck era, perhaps the longest in presidential history. The revelations of wrongdoing kept coming fast and furious, too many for me to relate here. (The worst is probably the firing of US attorneys on ideological grounds.) Oh yeah, the vice-president also shot a guy in the face and seemed pretty unapologetic about it. This happened around the time that his chief of staff, "Scooter" Libby, finally got taken down (at least until Bush commuted his sentence), and cemented Cheney's status as the most reviled and distrusted politician in America. The economy went into recession, a fact that Bush tried to deny right up until the financial meltdown caused in part by his lax regulatory policies.

Conclusion
I guess this wasn't such a short history. Sorry for that, but there was just so much malfeasance to cover that it couldn't adequately be done briefly. The final verdict of this historian is that George W. Bush stands as one of the most disastrous presidents in our nation's history, if not the most disastrous. (Only Buchanan, Nixon, and Harding are really close, and they didn't stay in for two full terms.) He has flouted the rule of law, violated human rights, started endless wars, consistently mangled the English language, exploited bigotry for political gain, rewarded ideological zeal over competence in his administration, ignored global warming, waged a jihad on science, fiddled while our economy burned, tilted the tax code in favor of the wealthy, and embarrassed and tainted America in the eyes of the world. Good-bye, good riddance, and do us all a favor and just go away.

Tuesday, April 23, 2013

Hanging Up My Academic Spikes

I wrote a while back about the similarities between careers in academia and professional baseball, and I keep finding more and more parallels.  This was especially on my mind last weekend, when two old grad school friends who teach at a small college in eastern Pennsylvania came for a visit.  It was mostly for fun, but also for business.

One of my friends didn't walk for his doctoral graduation, and now that he is contractually obligated to attend his current institution's ceremonies, needed the requisite robe, hood, and tam.  He's a tall man like me, and I hate to see someone shell out the kind of dough I had to on such frivolities, so I was glad to give him my doctoral duds on permanent loan.  Like most assistant professors, he's not paid enough to drop some serious money on a medieval costume.

I must say, I felt an odd twinge inside me as I handed over the robe and my friend put it in his car.  As I walked back to my apartment after saying good-bye to my chums, it hit me.  I had, to borrow a phrase from baseball, hung up my spikes.  It's the term we use when a player has definitively retired, with no hope or desire to get back into the game, or with the knowledge that age and decline have made it impossible to keep going.  Almost two years after physically leaving the university, I had symbolically and psychically cut my ties with the academic profession.

It's telling that the twinge I felt on Saturday has not lasted.  I used to be much more bitter, angry, and sad when contemplating my departure from academia, considering that I gave up financial solvency and my youth to pursue the impossible scholarly dream.  I thought of how I postponed necessary dental work (which could very well have led to life-threatening abscesses), lived in three states in the space of five years, and lived 1,500 miles away from the love of my life for the bulk of three years.

Nowadays, I just feel fortunate.  I work at a job I love with fantastic students who actually care about learning.  My employer appreciates my hard work, and gave me more positive reinforcement in my first two months on the job than I received in my five years as a professor.  Best of all, I am living where I want to live, not in some benighted backwoods East Texas burg.  Instead of driving five minutes through strip mall hell to my job, I take a train into the Big Apple, and luxuriate in getting to experience the nation's great cultural nerve center on a daily basis.  I am no longer a scholar, but I am much happier than I was on the tenure-track, so much so that I no longer think I am missing out on anything.

It pains me that so many of my friends in the profession are struggling.  Those with tenure or near it at podunk schools with low pay and high workloads are the lucky ones, since they at least have steady gigs.  Today I found out that two of my friends who are doing visiting jobs this year are scrambling to find jobs for next August, despite their status as (literally) award-winning teachers and authors of fascinating and relevant scholarship.  Many folks out there want to pretend that academia is a "meritocracy," but it's hard to conceptualize any meritocracy where people who are fantastic at their jobs must labor in penury without security or recognition.  It's a completely broken system, but it survives because of the allure of the scholarly dream.  Once I escaped the profession's cult-like clutches,  however, its utter ridiculousness and exploitative nature became even clearer to me.

I am much happier and fulfilled not trying to live that dream anymore, and living a life I never dreamed of, but am very satisfied with.  I know that job opportunities for former academics are not easy to come by, but I implore my brethren in the academic world to consider getting out and hanging up their robes while they still can.  As the old saying goes about other forms of dangerous, self-destructive behavior, the life you save may be your own.

Sunday, April 21, 2013

Track of the Week: Boston, "Hitch a Ride"


I have many rules of thumb, one of them being that bands named after geographic places are really lame.  Don't believe me?  Well, witness this: Europe, Asia, Chicago, and Kansas.  For the longest time I would have included Boston on that list.  Brad Delp's vocals were ridiculously high, the lyrics paint-by-numbers, and the records produced within an inch of their lives.  The band was basically undone by the studio perfectionism of guitar savant Tom Scholz, who took eight years between 1978 and 1986 to complete their third record.  This is not the kind of spontaneous, joyful noise rock and roll was intended to be, but an engineering project.  It's only fitting that their album covers featured spaceships, rather than human beings.

That all might be true, but there are some moments of real beauty to be found in Boston's sound.  Back when I lived in Michigan, I had a half-hour car commute where I usually listened to the local classic rock station.  It was during that time that Brad Delp died, and the station played a lot of Boston songs in tribute, including many I'd never heard before in my years of listening to classic rock radio.  One that really put its hooks into me was "Hitch a Ride," from the debut album.  It starts with a beautiful, ringing acoustic guitar line, and while it builds up to an electric opera, it does so with less of the over-the-top bombast that marks songs like "More Than a Feeling."  It takes its time getting there, luxuriating in the mellow, sun-drenched sound of 70s studio rock.  The solo at the end, backed by an armada of multi-tracked guitars, is pretty mind-blowing, but it gets its power from the fact that it's not unleashed until the last moment.  It's a lesson in the use of crescendo and dynamics that most hard rock bands in the 70s seemed to have missed.

Friday, April 19, 2013

An Elegy for a Friend

Note:  My friend David died rather suddenly and completely unexpectedly last December.  I still feel aftershocks from that event, and I experienced a pretty rough one this week when his widow posted an old picture of him to Facebook.  I am beginning to think that the pain of losing a close friend never really goes away.  I am beginning to understand why my mother has always acted as if the death of her best friend in high school had happened yesterday.  In any case, I wrote the following to be read at Dave's memorial service in Milwaukee.  I attended the service in Omaha for his family, and my comments there were more impromptu.  In fact, my eulogy ended at the point where I started bawling and could no longer continue.  Four months later, I still feel his absence.

****

Of all the people I have ever known in my life, David was perhaps the most singular.  I have never met another person like him before or since, and always admired the courage of his individuality.  He was an unforgettable person.  My parents barely knew him, but even they immediately recalled stories of him after they learned of his passing.  However, unlike a lot of highly intelligent people who pride themselves on their individuality, David never seemed to harbor a single malicious feeling in his soul.

We met as 18-year-old college freshmen, and although we shared some cultural interests, we spent a lot of our time back then turning each other on to new things.  Without David I never would have known about films like The Warriors or learned to appreciate heavy metal, especially the Ronnie James Dio iteration of Black Sabbath.  Although I had read some of Nietzsche’s philosophy before we met, it is through David that I really began to understand it.  Our intellectual conversations are perhaps the deepest I have ever had, and I feel guilty to this day that I was only able to complete about fifty pages of Heidegger’s Being and Time, a book he purchased for me and never stopped talking about.

We both moved to Chicago at the same time, attending two different universities, me on the South Side and him on the North.  We had a lot of adventures together that year, and roomed together the next in an apartment in Rogers Park.  That year, which I spent working and Dave completing his master’s degree, was one of the most fulfilling of my life, in large part because I was living with David.  We traded thoughts, music, and films together, and often ended our days sitting on the back porch of our apartment, engaged in conversation.  We philosophized in dive bars, read together in coffee shops, and sustained ourselves on the greasy food of the local diners.  Of course, we also clashed over cleaning the apartment, where David in his characteristically witty fashion labeled me an Apollonian Bert and himself a Dionysian Ernie.  I have long looked fondly back on those days, and am still coping with the fact that I no longer have my compatriot to share those memories with. 

The biggest event of that year was surely David and Michelle meeting for the first time, which I can claim to have facilitated.  After conversing over the then-newfangled invention of the internet, the two saw each other in the flesh after I drove Dave out to Iowa in my car.  I had another friend with me, and when we left Dave in Cedar Falls to his own devices, he had such a look of silent anxiety in his eyes.  The man was normally as unflappable as they come, so I must say I was a little worried.  When I showed up the next day to pick him up, his face looked quite different, and I knew that I had just done a little bit to make something really great come to life.  We took other road trips out to Cedar Falls, and composed our own dueling mix-tapes for the occasion.  I loved seeing David so obviously happy, and his and Michelle’s relationship gave me some hope for my own romantic future, which looked pretty bleak at the time.

It was sad to say good-bye to Dave and Chicago, but we were lucky enough to see each other many times again, be it in Omaha, Milwaukee, Champaign, or even Berlin.  I cherished those visits, and when I moved from the Midwest to Texas and then New Jersey, I felt the absence of our time together, however infrequent it had been before. 

Even if I never got close to finishing Being and Time, I knew from our discussions that Martin Heidegger urged that one should live a “resolute” existence, to really mean what one does.  David certainly lived up to that ideal, which I know he set for himself.  To cite just one example, he did not just accept the exploitation of academic laborers like adjunct professors and graduate students; he actively fought against it.  He was more true to himself than I could ever hope to be, and it rips my heart out to think that such a wonderful person and true friend is no longer with us.  I can only say that I was supremely lucky to have known this great man as long as I did.

Thursday, April 18, 2013

Cranky Bear Holds Congress in Contempt

Note: My friend and associate Cranky Bear has been sending me plenty of missives by carrier pigeon over the last few months, but I have been trying to make this site less profane, and more professional.  However, in the aftermath of Congress' vote yesterday, his is the abrasive voice that needs to be heard.

********

Cranky Bear here, sipping bourbon and about ready to blow my stack.  Good old Cranky has been pretty desensitized to the braying idiocy of humanity and its tendency to be its own worst enemy.  However, there are times when even I am surprised at the base stupidity of our elected dipshits.

In case you didn't know, ninety percent of Americans support universal background checks for the purchase of guns.  They find keeping deadly weapons out of the hands of psychopaths, domestic abusers, and convicted felons to be such a common sense proposition that most people think we already have such safeguards in place.  If only that were so.

I honestly can't think of any issue where 90% of the country agrees.  Hell, there are probably 15% who are so irascible that they don't think baby kittens are cute, or hold that the Star Wars prequels were superior to the originals.  Normally when the public lines up so definitively behind an issue our Congresscritters get so overanxious to please their constituents that the queue to the grandstand goes around the block.  For instance, just witness how public hysteria in the aftermath of 9/11 led to the horrifyingly sweeping PATRIOT Act.

Congress hasn't been too slow to act when it wants to, despite our assumption that it's incapable of action.  For instance, Congress managed to pass a massive giveaway to the evil motherfuckers at the Monsanto corporation in the dead of night as if by magic.  Corporations are people, after all, and furthermore, to quote Orwell's Animal Farm, "some animals are more equal than others."

If Monsanto needs legal protection for its nefarious deeds, fine, but if twenty innocent first-graders are torn to shreds by bullets fired from a weapon meant for the battlefield by a psychopath so loopy that his actions came as little surprise to those who knew him, that's just their tough fucking luck.  Next time pack some heat, assholes.

Those who voted against background checks might object to that characterization of their stand, but as far as I can tell, they are motivated by one of three factors: 1) licking the ass of the gun lobby in return for blood money; 2) an insane interpretation of the second amendment that makes it easier to buy an assault weapon than to register to vote; or 3) absolutely craven fucking cowardice when it comes to standing up to the rabid gun nuts.  Their votes are based on corruption, stupidity, or fear, which are never good reasons to do anything.

I am not so naive as to think that our country is really a democracy, but I think that yesterday's events ought to have finally torn the mask off for anyone who had any doubts.  The overwhelming will of the people for a common sense reform of a system that leaves tens of thousands a year dead from gun violence was no match for a powerful demagogic lobby representing a small faction of shit-fer-brains whack jobs with wads of dollar bills it can peel off.  At least we won't have to argue anymore whether our system is a catastrophe or not, just whether it is merely fucked up, or Lindsay Lohan fucked up.

Well Congress, I do have to give you some credit.  Your approval ratings are just south of those for root canals and urinary tract infections.  It seemed it would take a miracle to get them to go any lower, but you actually figured out a way to do it, namely ignoring the vast majority of the people you supposedly represent so you could prevent any solution to the bloody violence that claims new lives every day.  Congratu-fucking-lations, assholes, I thought it couldn't be done.

Tuesday, April 16, 2013

In Defense of Hawk Harrelson

[Note: It's been a heavy past couple of days, with bombings in Boston and Baghdad, and some aggravations at work.  (Nothing big, just aggravating.)  In that spirit, I thought I'd talk about baseball, my mindless indulgence for six months out of the year.]

There is perhaps no sport where the announcer matters more than baseball.  Let's face it, baseball does not translate well to television, since the screen cannot capture all that's going on around the field.  It's a slower paced game, one that's great to see in person on a long summer afternoon, but perhaps not the most exciting in an air-conditioned living room.  The announcer is crucial, because she or he gives the game life for the audience at home that can't hear the low hum of ballpark noise or smell the beer and hot dogs.

The best announcers are engaged in a kind of conversation with their audience.  They can be either erudite narrators of the game (like Vin Scully) or entertaining characters (like Harry Caray), but regardless of style, the great announcers are like an old companion you look forward to spending a few hours with.

Conversely, the worst announcers are like the bad houseguest who won't leave, or your Tea Party uncle.  You just want them to shut up and go away, but for reasons of family or friendship ties, you have to grit your teeth and bear it.  A lot of baseball fans seem to have these feelings about Ken "Hawk" Harrelson, the longtime play-by-play man for the Chicago White Sox, which also happens to be my team.

The fans I talk to can't stand his blatant homerism, exemplified by his terms "good guys" for the Sox, and "bad guys" for whoever they're playing.  They find his "you can put it on the board, yes!" signature home run call to be excessively annoying.  Even worse, they blanch at his many catchphrases and their never-ending repetition: "can of corn," "duck snort," "grab some bench" etc.  Overall, they think he's a boring announcer with multiple irritating tendencies.  Most telling, if you google him, the first website you'll find is dedicated to getting him fired.

Despite this seeming consensus regarding Harrelson's incompetence, I've spent years defending the man, and not because I am a Harrelson-sized homer.  Certainly, you could never mistake him for one of the greats, but I do think that Hawk has a nice, easy delivery, reflecting his Southern roots.  That same background is shared by great play-callers of the past like Russ Hodges, Red Barber, and Mel Allen, who may have softened their twang, but kept the laconic country cadence so suited to baseball.  When I have a game on in the background, that's exactly what I need.  Hawk may be a homer, but hey, I'm a Sox fan, so what's not to like?

Best of all, I think one of his supposed weaknesses, his economy of words, is a strength.  There are stretches where he will leave a lot of dead air, but I'd rather an announcer not say anything than just blather on incessantly (see: Joe Buck and Tim McCarver).  With the great Steve Stone as his color man, that just leaves more room for a real scholar of the game to do his thing.

Now that I have a subscription to mlb.tv, I have heard announcers from all over the major leagues.  Many of them are good, many are just okay.  However, even the good ones have little individual character, as if they all graduated from the same baseball announcer finishing school.  Sure Hawk might be a blatant homer and prone to repeating himself, but at least he has his own individual character, unlike most sports announcers in our corporate, homogenized times.  Hawk is distinctive and different enough to annoy people, and that alone is a reason to defend the guy.

Sunday, April 14, 2013

Track of the Week: Ryan Adams and the Cardinals, "Peaceful Valley"


I've always felt that Ryan Adams was one of the most underrated and under-appreciated musical artists of our time.  I chalk much of this up to his famously erratic personality, as well as his tremendously fertile output, which included three albums (one of them a double) during 2005.  Before potential new listeners had a chance to absorb his latest output, Adams hit them with something new, and often very different.  At times the quantity did not always add up to quality (29 was probably a bridge too far), but altogether he managed to write a stunning array of great songs in the aughts.  (After a long hiatus, his most recent record, Ashes and Fire, is a mellow treat.)  Although Adams started out in the alt-country world with the seminal Whiskeytown, albums like Love is Hell strayed far from the roots music road.

I think he managed to expertly balance his country roots, openly professed love of Morrissey and the Smiths, and punk rock feel on his 2005 double album with the Cardinals, Cold Roses.  It rocks, twangs, and aches in equal measure.  However, I think he came closest to perfection on a track from the following record, the honky-tonk drenched Jacksonville City Nights.  That song is "Peaceful Valley," a longing for heaven as a release from a cruel and frustrating world.  As a song about the afterlife, it's about ten times as profound as all of the vacuous "praise" music put together.  There's a real edge to this song, wondering if heaven means giving up on the good things in life (like wine, cigarettes, and coffee).  Beyond that, it turns hope for the afterlife into a kind of dark death wish to be rid of this wretched earthly existence.  Few songwriters out there can tackle material like this and make the listener believe it.


Friday, April 12, 2013

Do I Have a Moral Obligation to Stop Writing Grad School Recommendations?

Hot on the heels of Rebecca Schuman's blistering Slate article on the perils of graduate school, Sarah Kendzior has penned a piece about the exploitation of adjuncts for Al-Jazeera that Cranky Bear would heartily approve of.  These articles have provoked outpourings of bitterness and approval from my friends with experience in the academic world, and for good reason.  They well articulate the feeling of betrayal that many of us have.

Yet today, in the midst of all of this angst, I was talking with a former student over email about his possible grad school plans.  His one option is a terminal master's program at a very prestigious university (the same program at the same university that I graduated from), and he's wondering whether to go or not.  I have tried hard to expose him to the realities of the academic job market, but he is a very passionate and idealistic young man.  Like me once upon a time, he is the type of person willing to sacrifice his youth and quality of life to pursue the scholarly dream.  Unlike my younger self, however, he went straight from college into the work world, and now has a steady, well-paying job in the business world, and gets to live in Austin, Texas, to boot.

I have told him that he should not jettison his current life for a shot at the academic ring unless he really, really wants to do it.  I have told him the odds, but like Han Solo in an asteroid field, it's still full speed ahead.  I don't know what else I can do, apart from refusing to write letters of recommendation, an option that's off the table because I already wrote glowing letters for him.

As much as I tried to dissuade him from applying, I also know in my heart of hearts that he's by far more suited to being a scholar than any other student I've had.  His knowledge and theoretical understandings are warped by growing up in a repressive, rural Texas environment and attending a third-rate university full of traditionalists, but I know that grad school would easily set him right.  I look at someone like this, with his ability and drive, and think that it is an insane profession that would not welcome someone so suited for it.

I then remind myself that it is in fact an insane profession.  For instance, I know multiple people who've published books with reputable presses on important topics who can't get tenure track jobs.  I know many more who are fine scholars and stellar teachers who toil as low-wage contingent faculty, or who are on the tenure track at schools that pay little and offer a pittance, if anything, for research funding.  At the same time, I know complete and utter mediocrities with tenure.  I have seen someone get promoted to full professor, apparently on the strength of his having been a crappy department chair.  The more I think about it, the more I realize that it is these people who keep the academic dream alive.  The youngsters see such mediocrity and think "if they can make it, surely I can too."

It's becoming obvious to me that just telling my former students to reconsider graduate school isn't stopping them.  They don't even listen to the advice I give them on how to play the academic game.  I know other former students at second-tier doctoral programs who are worried about their career prospects, but who ignored my advice to jump to more prestigious universities after completing their master's degrees.  These students are very intelligent and highly capable, and will most likely toil for close to a decade to get a degree from a school whose appearance on their CV will mean an automatic toss into the discard pile come job application time.

I also told them that grad school was not the way to go in the first place, but they obviously didn't listen.  Of course, neither did I at their age.  I heard the horror stories, and thought to myself "I'll make sure that won't happen to me" with all the arrogance and stupidity of youth.  It is becoming increasingly clear to me that the only way I can do my part in preventing the next generation of youth from being sacrificed to the Moloch of graduate school is to refuse to write anyone a letter.  Deep down I know I'm not capable of such an extreme move, but I am beginning to think that this reflects an incurable sentimentality and optimism on my part, not the best interests of my students, or my moral obligations to them.

Wednesday, April 10, 2013

Margaret Thatcher Embodied the Fundamental Contradiction of Modern Conservatism

Margaret Thatcher's rise to power, and the changes she initiated (some would say inflicted) represented the fearsome power of a revived conservatism.  In all of the obits, condemnations, and hagiographies her death has inspired this week, no one disputes her importance for putting conservative thought into action, no matter what one thinks of those ideas.

However, few seem to be asking questions about the fundamentals of her ideas, which is what I would like to do.  Although Thatcher was a conservative, she and other modern conservatives seem to be at odds with themselves.  To be a conservative means literally to "conserve" past traditions and political practices.  Despite this fact, today's conservatives worship a power that is the greatest destroyer of traditional life ever unleashed: laissez-faire capitalism.

In Thatcher's worldview, there was famously no such thing as society, only individuals and families.  Despite her admonitions that Britons revive "Victorian values," Thatcherism resulted in something quite different.  Her "every person for themselves and let the devil take the hindmost" philosophy hardly jibes with anything the Victorians, for all their sins, would have valued.

Even though Thatcher couched her radical ideas in the language of tradition and Britishness, when push came to shove, her reign shattered tradition and destroyed the organic ligaments of community, the things that conservative thinkers like Edmund Burke had always praised.  In some cases community destruction was quite literal, as in her government's intentional destruction of the coal mining industry, which wrecked the towns dependent on it.  In her worldview, cash-value and the bottom line mattered more than anything else.

Thatcher thus personified the fundamental contradiction at the heart of modern conservatism.  The people who act most worried about social change and the dissolution of society are the same people worshipping the very force, unfettered capitalism, that is most responsible for dissolving traditional bonds.

Monday, April 8, 2013

How Professional Historians and Professional Baseball Players Have Similar Career Prospects



There's a piece at Slate that's making the rounds among my academic friends imploring potential graduate students in the humanities to go running for the hills.  Of course, this is a long-standing genre, but this particular piece, written by a recent grad, speaks to the frustrations of those stuck in the trenches of adjunct labor.

Of all things, it got me thinking about baseball.  Some professions try to limit the number of apprentices they take in, but others, like history and baseball, will give just about anybody a shot.  Those who vie for tenure track positions and a spot in a major league lineup must compete against hundreds of others with the same goal.  Both grad students and minor leaguers are paid low wages but subsist on the dream that they may someday get to make a living out of what gives them the most joy.

The winnowing process in baseball as in academia is brutal.  Only a tiny fraction of amateur baseball players are signed to minor league contracts.  The vast majority of those players end up being career minor leaguers, and many never even make it past rookie ball.  Of those who do make it to the majors, many are gone after a couple of seasons.  In academia potential historians must get accepted to a doctoral program, make it through classes, pass comprehensive exams, research their dissertation and then write the damn thing.  When it's all said and done, most of the students in any particular graduate cohort don't make it to the end.  Those who do graduate must compete in a Thunderdome-esque job market and are most likely to toil in the academic minor leagues as adjunct or "visiting" professors.  Those who do manage to attain tenure-track jobs still must jump through the brutal tenure obstacle course.

Academics will do this at institutions they consider second-rate in towns where they'd rather not live.  Like baseball players, academic historians work where they are assigned by the school that drafts them. Like baseball players, they can jump to another school as a high-priced free agent, but only if they are star material.  If not, they're stuck.

Former minor leaguers and unemployed PhDs finish their failed stab at the dream having spent their early earning years developing skills that hold little to no appeal for employers outside of their chosen fields.  No matter what the cheerleaders for the "alternate-academic" path say, there is very little demand for an expert in nineteenth century German social history outside of a university.

Look, we can't all be baseball players, rock stars, or CEOs of major corporations.  The problem is, we pretend that becoming a tenure-track professor at a desirable university isn't just as unlikely as getting to start in the major leagues.  If young people still want to take the chances I took, I'm fine with that; I just want them to know the real story when they make that fateful decision.

Saturday, April 6, 2013

Our Current Political Divide May Actually Be a Regional Divide



During the last seven years of my life I have lived in four different states in different parts of the country, and that experience has revealed to me the stark regional differences that exist in this country.  The conventional wisdom about important issues is wildly different here in New Jersey as compared to Texas, the state I just left.  We can see such differences in the recent pushes in states like Arkansas and North Dakota to severely restrict access to abortion when no such movement exists outside of the South or West.  A better example might be the map (displayed above) showing where people were most likely to change their Facebook profile pictures in solidarity with same sex marriage activists two weeks ago during arguments in front of the Supreme Court.  The South and Great Plains are noticeably white on this map, and out of step with the rest of the country.  If you look at the map of the 2012 election below, you will see a great deal of correlation.



Shifts in regional power can make a huge difference in national politics.  As historian Bruce Schulman has argued, the rise of conservatism in the 1970s was driven and reflected by the economic and population growth of the Sun Belt.  Unfortunately for conservatives, Sun Belt states like Colorado and California have become more liberal, and Florida has gone for Democrats in the last two presidential elections.  The South and interior West alone cannot give Republicans the votes they need to win.

The regional disparities in voting and on key issues also might help explain the virulence and polarizing nature of political debate in America. The hard-core, true believer House Republicans hail from districts where people who disagree with their ideas are few.  These same ideas are political poison in other parts of the country.   Many states in the conservative regions have taken a combative attitude towards the federal government by refusing to take part in the Affordable Care Act and passing nullification resolutions.  Because of the ideological uniformity of many of these states, politicians there are free to be as kooky and extreme as they want to be.

There is a world of difference between Connecticut, which recently passed sweeping new restrictions on gun ownership, and states like Montana, where the legislature wants to nullify any new federal gun laws.  I really don't think this divide can be bridged; fanatics never compromise.  Because our system was designed to diffuse power, it is easy for one chunk of the country to band together and jam up the legislative gears.  Mechanisms like the filibuster, for instance, give minority factions an effective veto.

This is not the first time in our nation's history that regionalism has fed political conflict.  (There was that whole Civil War thing, you know.)  For decades the Solid South banded together in Washington to prevent any moves towards racial equality.  Supporters of segregation filibustered and nullified to their heart's content, successfully blocking progress for quite a long time.  If we are to make any progress regarding gun control, reproductive rights, marriage equality, and a whole host of other issues, this regional conflict must be resolved.  Hopefully it will be settled with less difficulty than in the 1860s.

Thursday, April 4, 2013

An Appreciation of Roger Ebert

I grew up in an isolated town where the only movie theaters were a three-plex at the mall and a twin cinema downtown.  The proprietors of these establishments typically played it safe with the movies they brought in, only showing the most middle-of-the-road Hollywood fare.  Despite this impediment I started on my cinephile journey in high school.  Much of this has to be credited to a fantastic local video store that stocked just about everything.  However, I would not have known what to look for if not for Roger Ebert.

As a teenager, I watched Siskel & Ebert every week.  At first they really intimidated me, because they could be brutally critical with stuff they didn't like, and they heaped praise on movies I'd never heard of, and which didn't come within 100 miles of being exhibited in my hometown.  I slowly began to realize that I had spent my life watching movies uncritically, without reflection, and had wasted much of my time on garbage.  They mentioned worthier films like Apocalypse Now, Raging Bull, and Taxi Driver, and I sought them out.  After that and a dose of Stanley Kubrick, I never looked back.

Once I took off for college in a city with a thriving art house cinema, I finally got to enjoy independent and foreign movies on the big screen.  Watching quality films became one of my most cherished hobbies, and Ebert was my guide.  With the advent of the internet, I especially benefited from his Great Movies series, which gave me ideas and pointers on where to look in the video store.  After college I landed in Chicago (and later Champaign-Urbana), and I made sure to buy a copy of the Sun-Times every Friday to read his reviews, which guided me to all kinds of films I never would have seen otherwise.  His passionate opinions and nimble prose drew me in week after week and never failed to give me insight.  With his passing we have lost a true giant, one who will not be replaced in our current, fragmentary cultural landscape.

Wednesday, April 3, 2013

Track of the Week: Mott the Hoople, "The Ballad of Mott the Hoople"


Over the years I've developed a special love for British bands with a sizable following in the home isles that didn't make a splash over here in the states: Suede, The Small Faces, The Move, and now Mott the Hoople. I first got into Mott a few years ago after a record store clerk back in Illinois pushed their All the Young Dudes album on me. In the ensuing years I've picked up more albums on vinyl, and enjoyed rockers like "Roll Away the Stone" and "All the Way from Memphis."  Their killer cover of the Velvet Underground's "Sweet Jane" speaks of the magic they could work with a good riff.

It's one of their ballads, however, that has really sunk its hooks into me. Around the time I started teaching high school after my escape from academia, I dusted off their Mott album, and for the first time really listened to the wistful track "The Ballad of Mott the Hoople (26th March 1972, Zurich)."  It's a heart-wrenching account of what it's like when you've realized that the dream you've chased so long and so hard isn't going to come true. The whole premise is rather meta: before taking the stage at yet another show, lead singer Ian Hunter opines in his distinctive rasp about never having quite made it. (The Allmusic site describes the band as "one of the great also-rans in the history of rock & roll.") One of the most cutting lines of the whole song is "rock and roll's a loser's game," and it's really stuck with me. When I hear the song, I tend to substitute "academia" for "rock and roll," and think about my own also-ran status.

Like rock and roll, academia attracts true believers willing to endure poverty and all manner of indignities for the opportunity to be on the stage, whether it be at Madison Square Garden or an Ivy League lecture hall. The sheer numbers mean the odds are slim, and the failure rate is rather steep. For every tenured professor there are ten adjuncts and grad school dropouts; for every Bad Company there is a Mott the Hoople and several other bands too obscure to even get a record deal. And yet Bad Company covered one of Mott's songs ("Ready for Love") without even approaching the quality of the original. There's the rub, of course: many also-rans are a whole helluva lot better than a good number of their more successful peers. I know plenty of academic Mott the Hooples who have a lot more to offer the world than the professorial Bad Companys and Foreigners that I've had the misfortune to meet and even work with. (Like those bands' music, they tend to be overblown and devoid of taste, but inexplicably popular.)

The problem with big dreams is that they come with big disappointments. The most cutting line in "The Ballad of Mott the Hoople" for me is "Oh I wish I never wanted then/ What I want now twice as much." I too have felt the regret that I was ever so stupid and silly to chase such a preposterous, impossible dream. Or to let that dream dominate my life and steal my youth. My undergraduate professors warned me, and told me how long the odds were, but I refused to listen.  How dare they presume to tear down my dream?

Our society tends to romanticize the pursuit of dreams and the following of one's bliss, but it's taken me a great deal of time and meditation to overcome my bitterness with the loser's game that I used to play. I've recovered, and I'm happier than I've ever been, but my years of having dreams are over and done. Perhaps that's for the best.

[Note: I wrote a version of this back in the autumn of 2011, and I pretty much feel the same way still.]

Monday, April 1, 2013

What If...I Were Commissioner of Baseball?

It's baseball's opening day today, and amidst my joy at the return of the boys of summer, I am annoyed at the men who run the game.  Last night's official season opener between the Rangers and Astros was a case in point.  The Astros have been moved from the National to the American League, completely against the wishes of the team's fan base, so that interleague games can be scheduled every day of the season.  I've never heard a solitary baseball fan demand that interleague play be a daily occurrence, so I am unsure where this idea came from.  It is representative of Bud Selig's tenure as commissioner, which has been full of all kinds of silly gimmicks and hasty decisions.  I know that Bud will be retiring soon, and baseball will have a new commissioner.  He (baseball's hierarchy is unfailingly patriarchal) will likely be the owners' toady and enabler, just like Selig and most of his predecessors.  If he is not, he will be strong-armed and fired by the owners, which was Fay Vincent's fate.  That doesn't mean a baseball fan can't dream or have flights of fancy.  If your humble author were made commissioner of baseball, this is what he would do:

Organize the two leagues into two divisions and eliminate the wild card
One advantage that baseball has over most other sports is the pennant race.  The excitement of two or more teams trying to win every day down the stretch, or else face elimination, is hard to match.  The last day of the 2011 season, which saw several momentous games on one night, had the kind of magic you just don't find elsewhere.  Pennant races are good for baseball because they happen just as football season is beginning, and thus keep the public spotlight on the diamond at a time when the more popular gridiron begins to hold sway.  Unfortunately, the current format, with three divisions, has cut down on the number of close pennant races.  Having teams compete for only two spots in each league would create more of those dramatic moments.

What's more, the current wild card system is immoral.  In a sport with 162 games over a grueling season, I find it completely galling that a team incapable of winning its own division can go to the playoffs and ride a hot streak to a title.  One of baseball's advantages over other pro sports is that the regular season really matters, and that mediocre teams can't make the playoffs, as they regularly do in hockey and basketball.

Bring the Astros back to the National League and limit interleague play
The only reason the Astros are in the American League is to expand interleague play, which should actually be limited.  In the early days interleague games had all kinds of excitement attached to them, since they had never happened before.  Now they have gotten beyond mundane.  The Reds and Angels are playing an interleague series tonight, but does that really excite anybody?  One of the advantages that baseball has over other sports is the uniqueness and contrasts between its leagues, which are so much more meaningful than those between the NFC and AFC in the NFL.  I think limited interleague play is fine, especially when it involves intense, intracity or intrastate matchups like Cubs-White Sox.  However, once it waters down the distinctiveness of the leagues without adding any real value to the baseball experience, it ought to be scaled back.

Recommend Buck O'Neil, Bill James, Marvin Miller, and Curt Flood for the Hall of Fame
I know the writers and the Veterans Committee determine these things, but as commish I would implore them to consider these men for the Hall.  O'Neil was one of the greatest managers in the Negro Leagues, as well as a legendary scout and coach in the majors.  After he retired, he became one of the game's greatest ambassadors.  If not for segregation of the majors and the racism that remained after desegregation, O'Neil would have managed in the bigs and managed well.  The Hall should do right by him.  Bill James' use of statistical analysis has revolutionized the way the game is interpreted, and he ought to be inducted for it.  Marvin Miller and Curt Flood were the most important figures in the end of the reserve clause and the battle for players to get rights and just compensation.  The owners may have hated both of them, but they don't get to determine who's in the Hall.

Have an official ceremony at Elysian Fields in Hoboken
The legend of Abner Doubleday inventing baseball in 1839 in Cooperstown, New York, is complete and utter poppycock.  Modern baseball was invented by New Yorkers who played their games at Elysian Fields, in Hoboken.  Today it is a busy street corner, with a humble marker that shows the spot where baseball was first played.  As commissioner I would officially give Hoboken its due and repudiate the Cooperstown legend, and not just because I'm a New Jersey guy.  There is a reactionary and pernicious notion that baseball is a rural game and a throwback to a more rustic America, when in fact it was the product of the city and early urbanization.  The ruralification of baseball fits with a general anti-urban bias in this nation's mainstream culture that has damaging effects on our cities and their dwellers.  It's also just bad to perpetuate lies.

Reform the All-Star game
I wrote about this in-depth last year, so I will just give the highlights: end the player-from-every-team requirement, limit rosters to 27 players, and end the practice of giving the winning league home-field advantage in the World Series.

Revenue sharing
This would really be a long shot.  Of all the major team sports, baseball is the only one without a robust revenue sharing structure when it comes to television money.  This means that wealthy teams can outspend poor ones, which hurts the overall popularity of the sport if fans in several markets decide that their team will never have a chance, and so give up.  Revenue sharing would also help limit runaway salary inflation.

Use YouTube more effectively
Right now MLB tries to shut down any YouTube video with footage from baseball games in it, ostensibly to guide users to mlb.com.  There's just one problem: unless you're looking for highlights of recent games, it sucks.  Baseball has such a wonderful history, and plenty of video footage of it is out there, but fans have almost no access to it at a time when other sports have made their peace with YouTube.  If the sport wants to bring in young fans, keeping them from seeing anything baseball-related on YouTube is a pretty stupid way to go about it.