Sunday, December 28, 2014

Five World War II Myths That Are Complete B.S.

A quick round-up of a few untruths routinely bandied about regarding WWII




Now, I know what you're thinking. "Hey, a listicle outlining erroneous assumptions about World War II ... how original!" 

Perhaps this is indeed territory oft trudged by lesser websites, but I figured there were still falsehoods aplenty circulating about World War II that really haven't been addressed by that many web articles, and for one reason or another, haven't been given the glorious public debunkings they really deserve.

Think you have a firm grasp of why Hitler never invaded Switzerland, or why the Japanese never tried to carpet bomb Nebraska? Well, think again, Holmes ... as it turns out, our mutual misunderstanding of what went on during the Greatest War of 'Em All is enough to fill several history books.

Myth Number One:
Germany never invaded Switzerland because they feared their armed citizenry.

While the Wehrmacht made its best effort to completely flatten continental Europe into a pancake of Aryan subjugation, a few countries remained unoccupied by Hitler's forces. Spain and Portugal were spared Nazi wrath, and try as he might, old Adolf never could mount a full-out assault on Great Britain. A few other nations put up a better fight than you'd expect, with Finland having the proud distinction of tangling with both Stalin AND Hitler at various points throughout World War II.

Considering its geographic proximity to Germany, some historians and astute observers have wondered why Hitler never attempted to annex, invade or attack Switzerland -- especially since A.H. had been spewing vitriol about the Swiss ever since the French collapsed.

One of the popular narratives is that Hitler never made an advance against the Swiss because they had a heavily armed citizenry. According to some guy on the PBS website, the Swiss public was packing heat like rappers, ready to pop a cap in sundry Nazi asses as soon as they goose-stepped their way into the country.

Indeed, Hitler at one point had plans to mount an invasion of Switzerland -- he even had as many as 500,000 Italian and German troops ready to pounce as soon as he gave the signal. Alas, Hitler never gave the go-ahead, a decision that, to this very day, inspires great debate among armchair generals.

While the NRA crowds love to extol Switzerland as an example of how an armed public prevents fascism, the reality is that Switzerland’s rifle-toting homeowners probably had little-to-nothing to do with Hitler’s about-face on invasion.

To begin, let's mull a little logic here. The Swiss had a fairly sizable armed militia, to be sure, but you know who had even more guns than they did? THE SOVIET UNION, WHICH HITLER INVADED LESS THAN A YEAR AFTER SHELVING OPERATION TANNENBAUM. To say that Hitler was too much of a pussy to go after Switzerland when he then took on the most massive army in modern history just months later kind of lays waste to the whole hoplophobia argument. If Hitler were afraid of his troops getting shot, methinks he probably wouldn't have sent his own boys into the meat grinder against Stalin's considerably more powerful artillery units.

Furthermore, the invasion of Switzerland served no real strategic purpose for the Nazis. In case you didn't know, Switzerland is basically a giant landlocked salad, buffered all the way around by rocky terrain. With no waterway access, its geographic value in the middle of a freaking war was practically nil. Why, precisely, would Hitler want to squander the time and manpower -- and put soldiers at risk -- to take over something that wouldn't even serve as a launching pad to the eastern or western theaters?

Lastly, an interesting hypothesis is that the Nazis needed a neutral Switzerland as their wartime piggy bank. Say, ever wonder what happened to all of the plundered gold the S.S. stole from occupied territories? Well, odds are, it wound up in a Swiss bank somewhere -- demonstrating the financial importance of Switzerland as a safe deposit box for the Krauts. Hell, even the motherfucking royalties from "Mein Kampf" were stashed in a Swiss account. With that in mind, perhaps there's an entirely different reason why the Germans never put boots on the ground in Switzerland: why the hell would they want to rob themselves?

Myth Number Two:
The Japanese never invaded the U.S. homeland because the public was well-armed.

It’s the same argument as with the Switzerland situation, only intensified a million times because we’re talking about ‘MURICA, dad-burn-it.

So, in 1941, the Japanese attack Pearl Harbor. We don't think that shit's cool, so we vote to go to war and show Hirohito and his troops what-for. While a huge component of raising war support in the States centered around fears of a second Japanese attack, the Imperialists never made a full-scale attack on the U.S. home front. According to oh-so-ardent gun advocates, that's because the U.S., much like the Swiss, had their shotguns at the ready in case Tojo came a-knocking on any doors in the Heartland.

Unfortunately, there are so many things wrong with that little notion that I’m not quite sure where to begin. Perhaps we can start with the fact that Japan actually did attack the U.S. on its home turf quite a few times after Pearl Harbor, albeit in really poorly-planned subterfuge missions that sound like something out of a Chuck Norris movie.

Secondly, I don't know how many of you were aware of this, but between the U.S. West Coast and Japan, there's this thing called "The Pacific Ocean." It's really big, and it would take a long time for Japanese attackers to make it to Los Angeles -- if they could even make it to San Francisco before running out of gas altogether. For any kind of major (and non-submarine-launched) aerial or naval attack on U.S. soil to have transpired, the Japanese would have needed some sort of mid-Pacific launching pad, which would have been easily picked up by U.S. radar. If a Japanese armada were coming, the U.S. military would have had not just hours, but days of advance notice. As a general rule, that doesn't bode well for a sneak attack's success rate.

Additionally, Japan's primary goal during World War II was to conquer and hold China, not to dick around in California. With the Japanese Navy playing defense to the east -- and most of the Japanese ground forces committing unspeakable war crimes in southeast Asia -- Japan simply didn't have the manpower to mount any kind of invasion against the U.S., even if it wanted to. It's a nice idea to think rifle-toting potato farmers were the only thing keeping us from getting gobbled up by the Japs, but unfortunately, that pesky reality speaks to the contrary.

Oh, and that "quote" up top from Yamamoto? Not that this is a shock or anything, but it's completely made-up bullshit.

Myth Number Three:
The United States had no choice but to drop the atom bombs on Japan.

We've all heard the narrative a million times: the U.S., having crippled the Japanese navy, now found itself facing a long, grueling ground battle against the Imperialists. With resources diminishing, the Japanese were ready to fight a suicidal battle against occupying troops, with some early estimates pegging as many as 1.7 million casualties -- including as many as 800,000 U.S. fatalities -- in a full-fledged invasion of the Japanese mainland.

Without question, Operation Downfall would have proven costly. In anticipation of heavy U.S. casualties, the military manufactured a surplus of Purple Hearts, which are still being handed out to injured troops today.

The unfathomably high death toll, we have all been told, is the primary reason why Truman went ahead with the decision to bomb Japan. The thing is, by the time the U.S. was mulling dropping the bomb, the Japanese military was pretty much already defeated. The Battles of Midway and Leyte Gulf had more or less eradicated the Japanese Navy, and the nation's infrastructure had already been reduced to rubble and ashes thanks to daily air raids. The firebombing campaign of 1944-45, it is perhaps worth noting, resulted in a higher body count than the bombings of Nagasaki and Hiroshima combined.

Even Gen. Douglas MacArthur thought the Japs were ready to hang 'em up -- through blockades and continual bombings, he figured Hirohito would've surrendered within half a year's time, anyway. In July 1945, Gen. Robert Eichelberger expressed a similar sentiment, revealing that most of the U.S. military brass were anticipating a Japanese surrender sooner rather than later.

And even if a mainland invasion were to occur, it's not like the U.S. would have gone into battle all by their lonesome -- but more on that in just a bit.

A popular hypothesis that has arisen over the last 50 years has been the possibility that Truman authorized the bombing of Japan not to goad the Imperialists into surrendering ASAP, but rather, to scare the ever-loving dog shit out of the Soviets, who were clearly the other big victor coming out of World War II.

Ultimately, the real reason why the U.S. went ahead with the bombings may have had little to do with ending the Pacific war as soon as possible, and a whole hell of a lot more to do with justifying the costs of the Manhattan Project. After spending more than $2 billion on developing the bombs -- well over $25 billion in 2014 dollars -- who is to say there wasn't an eagerness to reap the fruits of all of that secretive (and pants-pissingly-horrific) R&D?


Myth Number Four:
The atom bombing of Japan was the reason why Japan surrendered.

Looking at the chronology of World War II, it seems about the most obvious thing in the world. Nagasaki and Hiroshima both get a nice big dose of radioactive death in August 1945, and what do you know, Hirohito surrenders just a few days later. It’s about the simplest example of cause and effect there is, no?

Well, as it turns out, the bombings really weren't the likeliest reason why the Japanese finally surrendered. After the bombing of Hiroshima, Navy Admiral Soemu Toyoda said the remaining military was ready to fight through subsequent atom bombings, and even after the bombing of Nagasaki, the Emperor wasn't quite yet ready to throw in the towel. U.S. military leadership were actively mulling the possibility of dropping as many as seven more atom bombs on Japanese targets before November.

Clearly, the atom bombings alone weren't the final domino that led to Japan's unconditional surrender to the Allies. The same day Nagasaki was bombed, something else happened that probably had a greater influence on the Japanese leadership's decision to finally call it quits.

In 1941, the Russians and Japanese signed a neutrality pact. It did pretty much what it sounded like, ensuring that no matter what happened in WWII, the two countries wouldn't attack each other. This is pretty important, because the two nations had been unofficially warring over Asian territory since the early 1930s.

Back on April 5, 1945, the Russkies had already said fuck the pact, announcing they wouldn't renew it and insinuating the Japanese had violated the agreement. This followed the Yalta Conference, at which the Russians agreed to enter the Pacific War on behalf of the Allies within three months of Germany's surrender.

And on Aug. 9 -- almost three months to the day after Germany threw in the towel, right on schedule -- the Soviets formally declared war on Japan and poured into Manchuria, royally fucking things up. Over the course of just a couple of weeks, the Soviets slew upwards of 80,000 Japanese troops and captured more than half a million prisoners -- the absolute worst land defeat Japan experienced in its entire, blood-soaked military history.

With a decimated navy and air force -- and a large throng of army personnel stationed in occupied territory outside of Japan -- Hirohito, already presiding over a country on the verge of complete bankruptcy, was now staring down dual invasion from the world's most powerful and technologically advanced nations. Nuclear obliteration wasn't enough to get the Japanese Supreme Council to finally mull capitulation, but the thought of Stalin and his boys plowing their way through China en route to a possible land invasion of Kyushu was.

While Hirohito listed the atom bombings as a reason why Japan surrendered in his famed public address on Aug. 14, he also made reference to the Soviet invasion when addressing the remnants of the Japanese military on Aug. 17:

"Now that the Soviet Union has entered the war against us, to continue … under the present conditions at home and abroad would only recklessly incur even more damage to ourselves and result in endangering the very foundation of the empire’s existence. Therefore, even though enormous fighting spirit still exists in the imperial navy and army, I am going to make peace with the United States, Britain, and the Soviet Union, as well as with Chungking, in order to maintain our glorious kokutai."

Myth Number Five:
The Holocaust was the worst genocide of World War II. 

A whole lot of Jewish people died in World War II -- as many as 6 million, actually. While many historically naïve folks consider that to be the absolute worst ethnic cleansing episode of the 1900s, it's far from it. In fact, it's not even the greatest Nazi-perpetrated genocide, in total victims, to happen during WWII.

Depending on who you ask, a grand total of 50 to 80 million people died during World War II. It's pretty hard getting an estimate for how many were killed by each side, Axis and Allies alike, but most historians put Nazi-instigated killings at around the 20 million corpse mark as a conservative estimate. At least 8.8 million of those killed by Hitler's forces were Soviet troops -- a number that jumps up to almost 12 million when you factor in the three million-plus Ukrainians who were also mass murdered by German occupiers. No matter how you look at it, Hitler easily killed twice as many Slavs as he did Jews, but for some reason, no one ever really reflects on that when discussing Nazi atrocities.

Along the same lines, you really have to wonder why nobody knows what the Holodomor was, either, since by some estimates the Ukrainian death toll from that famine rivals the Jewish body count from all of World War II. And for god's sake, whatever you do, don't bring up the fact that some 2 million Germans died in "forced expulsions" from 1944 to 1950. Seriously, don't even think about it.

And that actually pales in comparison to the mayhem wrought by Imperial Japan. Really, the ultimate crimes-against-humanity victims of WWII were the Chinese, as Japanese occupiers slew approximately TWENTY MOTHERFUCKING MILLION of them -- more than three times the number of Jews killed by the Nazis. Oh, and that's not counting the additional 4 million dead in the Dutch East Indies, 2 million in Vietnam and 1 million in the Philippines, plus another million or so scattered throughout Korea, Malaysia and Oceania. I still have trouble grasping why, precisely, Nazi Germany is considered the most evil empire in history, when the Japanese killed far more people, across a wider range of territory, in ways that are almost inconceivably barbaric, for a much longer period of time.

Of course, none of this is to say that the Jewish Holocaust wasn't a horrible moment in history. It was indelibly tragic, but the frank reality? It was far from aberrational, and unquestionably, nowhere close to being the largest genocide of the war.

And compared to post-World War II genocide numbers, the Holocaust is positively dwarfed in terms of total victims. Joey Stalin likely killed more of his own people after World War II than were actually killed during the conflict -- and for a real mind-blower, the policies of Chairman Mao were likely responsible for the deaths of anywhere from 49 to 78 MILLION.

Tuesday, December 23, 2014

Holiday Sprinkles Cookie Crisp!

From the people who brought you Boo Berry and Count Chocula, it's an all-new, limited-edition seasonal cereal that might just set a new standard for marketing laziness!


This year, more than any year in recent memory, I haven't really felt myself experiencing the "holiday spirit," so to speak. With so much stuff going on in my life outside of this here blog, I rarely have time to reflect on the ephemera around me -- seriously, as fast as 2014 went, I'm still wondering when and if we even had an autumn this year.

In an effort to rouse just the teeniest modicum of Yuletide splendor before the holiday's over and done with, I decided to take a trek down the aisles of the nearest big box store recently, in hot pursuit of any outlandish seasonal items. Much to my dismay, there was precious little new out there, outside of a few brand variations of the same-old same-old ... sorry, Mars, Inc., but I'm not shelling out another five bucks on a bag of Snickers just 'cause they're shaped like Santa heads.

Just when I was about to settle on a can of cherry cordial chocolate frosting and a discounted package of Pumpkin Spice-flavored Oreos, this little doozy caught my eye...


...HOLIDAY SPRINKLES COOKIE CRISP! Right off the bat, there are at least three things about the packaging that make me giddy. Let's make ourselves an itemized list, why don't we?

NUMBER ONE: My goodness, is the box art reserved. A lot of times, limited-time-only holiday cereals are decked out in really loud and garish colors -- por ejemplo, Christmas Crunch. However, the packaging for Holiday Sprinkles Cookie Crisp is almost completely white, albeit with a smidge of faint emerald at the top and bottom corners of the box. Even the blunt holiday ornamentation, like the Christmas lights and the big red decoration, screams "generic foodstuff" more than "yuletide eatin'." And how about that gigantic General Mills logo in the top left-hand corner? The entire thing just bellows "1970s marketing," and also "I don't really give that much of a shit."

NUMBER TWO: The Cookie Crisp wolf (canonically, his name is Chip) looks so jaded. Sure, sure, he's got that corporate-formulated child-corrupting gaze in his eyes, but at the same time, he just appears so uninterested in the giant bowl of cookie-flavored cereal in front of him. It's almost like he's trying to avoid scooping up one of the chunks, which, contextually, would be about the size of his head. The entire thing just rings hollow, like he's posing for a half-hearted photo and can't wait to take off that stupid Santa cap.

NUMBER THREE: The cereal itself. Goddamn, is that one of the lamest attempts to cash in on the Christmas season I've ever seen, and I assure you, I have seen many. It's just normal Cookie Crisp, only with these little piddly specks of green and red sugar flakes on them. The fact that the chunks look like they are infected with scabies and pockmarked with boogers on the box art really doesn't help as a selling point, either.


As a note of interest (as opposed to a note of disinterest, I suppose), General Mills also shills for its complementary seasonal product, Sugar Cookie Toast Crunch, on one of the box's side flaps. Truthfully, if I had seen these on store shelves, I probably would've picked them up instead -- not only do I find the regular Toast Crunch product to be generally yummier than Cookie Crisp, but this one also has a WAY better holiday aesthetic. I mean, look at that little anthropomorphic cereal chip -- he (or she) is just goddamned adorable in that little elf hat. Alas, the local grocer didn't carry it, and I had to make do with what is certainly an inferior limited-time-only cereal product.


That's not to say everything about the cereal was lackluster, though. The back packaging of the box had a whole bunch of activities for the grade school set on it, including a couple of "spot the item" laundry lists. Of course, my favorite aspect of the art is how it appears that Chip the Wolf is sneaking his way into an orphanage to commit God knows what kind of unspeakable deeds. Also, I am really curious about the ethnic composition of this family -- especially that Indian/Arabian/Nigerian/Puerto Rican kid up top, who clearly has the breaking and entering canine within his field of vision. Way to dick around with that baseball mitt while the whole goddamn house gets looted, amigo.


As for the cereal itself, I wasn't too impressed. I've never been a big fan of Cookie Crisp, as a whole -- the stuff is practically vanilla wafers you pour milk over, which is all sorts of gross. As the name implies, the recipe hasn't been tweaked for the holiday offering, which is more or less the exact same product you'd find on store shelves all year-round, albeit with red and green shit sprinkled on the chunks. Not since Chips Ahoy's "Halloween" cookies have I encountered a seasonally-branded product with this much ennui for a holiday. Shit, if I didn't know any better, I'd think Chip was of the Hebrew persuasion -- how else do you explain a Christmas tie-in this effortless?


And there's our cereal, up-close and impersonal and also just a little blurry. Sadly, these things just don't photograph that well -- not every breakfast product can be as aesthetically pleasing as Pop-Tarts, you know.

In case you were wondering, the sprinkles didn't really add much variety to the cereal, which as stated previously, has a very vanilla (literally) taste and texture. Basically, the foodstuff is sugar speckled with more sugar, which to be fair, does have some appeal -- especially if you're an eight-year-old processed food junkie or a nearly 30-year-old man child who likes to nom on kid-branded breakfast products by the handful after a tiring day of work/redditing.

The cereal isn't necessarily bad-tasting; it's just bland and uninteresting, with a holiday hook that's nothing at all to get excited about. As far as seasonal breakfast offerings go, Holiday Sprinkles Cookie Crisp is like that one really shittily decorated house in the neighborhood, with the wire reindeer and the blue and yellow lights just swung around the trees in no real discernible pattern. Yeah, the theme is there, but it's really, really half-assed: this may not be a terrible product overall, but as a special holiday offering, it's about as lackluster as that one "Frosty the Snowman" cartoon featuring the voice of John Goodman. Man, that one sucked, and hard.

SPECIAL BONUS GOOD-TIME EPHEMERAL EXTRA FEATURE CONTENT!


GREEN COCA-COLA!

You know, I would be remiss if I didn't at least say a thing or two about this newfangled Coca-Cola beverage. Yeah, the official nomenclature is "Coke Life," but the working man just calls it "Green Coke," for obvious reasons.

For the uninitiated, "Coke Life" is a new, health-conscious (sorta) product from the world's numero uno soda manufacturer/promoter of diabetes. The big hook here is that instead of being sweetened with corn syrup -- a product that is at least partially responsible for America's global economic domination over the last 75 years -- it's brewed with natural cane sugar (like the cult favorite "Mexican Coke") and stevia, some kind of plant-derived sweetener that we're probably going to find out kills people in two or three years.

Unfortunately, the soda itself isn't green -- trust me, if it were, this sumbitch would've gotten its own full-fledged article. Rather, the soda is the traditional black-tar-engine-sludge hue we all know and love, and gustatorily, it is fairly similar to Coca-Cola ... except it has this weird smoothness to its aftertaste, almost like you're gulping down a shot of sugary water after every sip. Ever the cola traditionalist, I like my soda sporting a viscous, burp-inducing caustic kick; sadly, this stevia-infected offering just doesn't give it to the masses.

The drink coincides with the release of Pepsi True, yet another green-canned beverage filled with pseudo-sugary agents. The same way the reduced-but-not-that-reduced sodas of years past have faltered (Pepsi Next, I'm looking at you, G), I just don't think either Green Coke or Green Pepsi will prove sustainable. Soda drinkers and health-savvy folks really aren't one and the same, and you're not going to get the six-pack-a-day man to switch from normal Coke to pussy Coke for any kind of half-baked health reasons. Furthermore, the health-conscious folks won't go anywhere near sodas, no matter how much stevia and cane sugar is thrown into the mix. These new "green beverages," I am afraid, are just sodas sans a country.

Enjoy the novelty while it lasts, kids ... next Christmas, the only way you'll probably be able to taste this stuff is if you order it online or buy it from one of those weird-o mom and pop gas stations that still have posters from 2006 taped to their freezers.

Sunday, December 21, 2014

‘90s Coin-Ops That Never Got Home Ports

Five coin-devouring classics that, for some reason, never got the SNES or Genesis treatment.


"Street Fighter II," "Mortal Kombat," "NBA Jam" and "Samurai Shodown" were among the finest arcade games of the Flannel Era, and thankfully, we got an opportunity to play them on a litany of platforms, from our tried-and-true Sega Genesises and Super Nintendos to our battery-munching Game Boys and Game Gears. While a ton of coin-op offerings wound up getting home translations -- including subpar offerings such as "Primal Rage" and "Pit Fighter" -- quite a few beloved arcade titles never got ported to our living rooms. And while there is certainly no shortage of arcade originals we all wish had made it to our 16-bit units -- Capcom's "Alien vs. Predator," Namco's "The Outfoxies" and Data East's "Night Slashers," among them -- there are five coin-ops in particular that, despite being extremely popular, were inconceivably never sent to the SNES or Mega Drive.

Today, let us reflect on the cabinets that filled our childhoods with such wonder and splendor -- and a whole hell of a lot of confusion and frustration as to why we couldn’t play them on our home units.

Number Five:
“Title Fight” (1992)


What was it? A good decade and a half before Nintendo even thought about "Wii Sports," Sega gave us this aerobic pugilism sim, which allowed arcade jockeys to slug it out with a who's who of completely made-up boxers. Yeah, its roster of fictitious fighters may not have had the quasi-racist appeal of "Punch-Out!!," but it did allow two players to engage in virtual slugfests thanks to a side-by-side, dual-screen set-up. And playing as a virtual wireframe cyber-boxer? Damn, that was a stroke of ingenuity!

Why should it have been ported to the home consoles? LONG before Nintendo trotted out its 2006 motion-activated home unit, Sega was betting a buttload on its revolutionary peripheral, the Activator. Basically, it was a giant DDR pad you could plug into your Genesis, and every time you waved your arms like a windmill, it did something or another while you played "Eternal Champions" and "Mortal Kombat." Granted, the tech really didn't work the way it was supposed to, and after five minutes of punching air in "Greatest Heavyweights" you'd probably revert back to your six-button pad, but still -- it was hardware designed to get you active in the den. Sega could have easily released a deluxe home port of "Title Fight" a la "Lethal Enforcers," complete with a pair of gloves you could sock over your hands, and it would've been a perfect fit for the ill-fated Activator. Of course, it probably wouldn't have shifted that many units, nor led to a great home console game, but for a company hellbent on releasing as many consumer-unfriendly devices as possible at the time, you just have to wonder why the house Sonic built never made an effort on this one.

So, uh, why wasn't it? Sega was at its business apex in 1992. With the global success of the Genesis/Mega Drive, the relatively recent release of the Game Gear and the upcoming Sega CD add-on, Sega was certainly building quite the formidable holiday season armada. With all of that stuff going on, I suppose it's reasonable to assume the company never really considered its weird-beard boxing game to be any kind of system-mover, although it most certainly could have been adapted to the Genesis or Sega CD quite easily. I mean, shit … if "Sonic Blast Man" could somehow find its way to the Super Nintendo, there really isn't a reason in the world why this game never got a home console translation.


Number Four:
“Lucky & Wild” (1993)


What was it? In the early-to-mid-1990s, there were a lot of gimmick-heavy arcade games on the market. You had light gun games like "Terminator 2," and you had vehicle-themed cabinets like "After Burner." Well, "Lucky & Wild" decided to give you two novelties for the price of one, as it was both a driving simulation and a killing-people-with-guns simulation. Specifically designed for two players, the game had one gamer driving while the other pumped hot lead into various no-goodniks. And it was also a blatant rip-off of a Sylvester Stallone movie whose sole cultural significance is being a throwaway line in a Tenacious D song, if you can believe it.

Why should it have been ported to the home consoles? While both the SNES and Genesis had their own proprietary light gun peripherals, there really weren't that many standout games released on either system that took full advantage of the add-ons. As one of the better rail-shooters of the early 1990s, "Lucky & Wild" certainly would have lent itself to a decent-enough Menacer or Super Scope offering, and who the fuck out there wouldn't have loved plugging a gun and a steering wheel into their home system to play the same game?

So, uh, why wasn't it? Honestly, there wasn't a whole lot of depth to "Lucky & Wild." Like most arcade games from the era, it was over in about half an hour, and replay incentives were virtually nil. The appeal of the game, then, was the novelty of clicking yourself into a mock-up patrol car and blasting like a retard for a few minutes -- not exactly the kind of gaming experience that would necessarily be worth a $65 home cartridge purchase. That, and I'm not really sure how many quality steering wheel add-ons were out there in the early '90s -- in short, the market appeal of the game just wasn't strong enough, and the adoption base for the game's needed peripherals wasn't there to really justify a home port. Still, a retooled SNES or Genesis game, technologically, would have been possible -- although gunning and driving with a control pad wouldn't have been anywhere near as fun as the coin-op set-up, obviously.

Number Three:
“WWF WrestleFest” (1991)


What was it? A good old-fashioned 2D slobber knocker featuring all of your favorite WWF rasslers, circa 1990, including the Legion of Doom, Hulk Hogan and that nefarious Iraqi turncoat Sgt. Slaughter. Developed by the same folks that gave us “Double Dragon,” the brawler featured huge sprites, incredible animations and a seriously addictive Royal Rumble mode, not to mention marking the only time fan-favorite tag team Demolition was ever featured in a video game. Up until the release of “No Mercy” on the N64, it was easily the best WWF-branded game ever.

Why should it have been ported to the home consoles? Well, if you ever played a 2D WWF game on the NES, SNES or Genesis, you'd know that they fluctuated in quality from dog shit supreme to just sorta above average. Clearly, there was a need for a great, licensed WWF game, and since that game already existed in the form of "WrestleFest," you kinda figured somebody out there would've come up with the bright idea of concocting a console iteration, no?

So, uh, why wasn't it? I'm not 100 percent sure here, but I'd venture to say it had something to do with licensing agreements. Much to the chagrin of pro wrestling fans the world over, LJN held a tight grip on the WWF home game contract for the better part of the 16-bit era, so I'm not sure if Technos or any other publisher would have had the legal clearance to put the game on home units. Furthermore, pro wrestling casts change pretty much every three months, so by the time a "WrestleFest" port would've made it to the SNES and Genesis, it would have been pretty outdated -- although, that just would've meant we would've gotten Adobo-sized Papa Shangos and Skinners instead, which would've been fucking awesome, too.

Number Two:
“The Simpsons” (1991)


What was it? A Konami cabinet that was more or less “Final Fight,” only instead of playing as a dude that looked like Dan Severn, you commandeered Homer, Bart, Lisa and Marge and ran around slapping zombies and sumo wrestlers with vacuum cleaners and saxophones.

Why should it have been ported to home consoles? Primarily, for the same reason "Turtles in Time" was -- because it was a fun licensed game with a great multiplayer component. Considering how good "Final Fight" looked on the Sega CD and SNES, there really wasn't any graphical reason why the game couldn't have made the leap to the home units, either … and did I mention it was a motherfucking beat 'em up starring the Simpsons?

So, uh, why wasn't it? As was the case with "WrestleFest," I'm pretty sure this had something to do with licensing. If I remember correctly, LJN/Acclaim held the console rights to the Simpsons, and most of those games were downright shitty with a capital "S" -- go ahead, try and fucking play "Virtual Bart" sometime. Since Konami held the rights to both arcade and home TMNT games, I reckon that's why we saw that one on the 16-bit consoles, while "The Simpsons" remained an arcade exclusive. Strangely enough, though, the game did get a home translation of sorts, as arcade ports made it to both the Commodore 64 and MS-DOS computers.

Number One:
“X-Men” (1992)


What was it? Only the goddamned hugest cabinet ever made, that's what. The utterly massive coin-op allowed up to six players to band together on a two-video-screen beat 'em up odyssey, as friends and families came to death blows over who was going to be forced to play as Dazzler. It was also released as a four-player unit with a couple of characters cut out, but anybody alive in the Clinton years probably chooses to remember this one as the sextet gaming get-together to end all sextet gaming get-togethers.

Why should it have been ported to home consoles? Because it made sense all the way around, that's why. Console owners would've loved having the game on their units, and Konami would've made a shitload of money by creating versions for the SNES and Genesis. Also, I don't think the "X-Men" license was locked down the way some of the previously mentioned properties were -- how else were we able to get both a Capcom-produced SNES game and a Sega-produced one on the Genesis, after all?

So, uh, why wasn't it? You know, I have no idea, actually. Yeah, it probably would have been impossible to create a true six-player home console game, but it definitely could have been released as a solid two-player game. Since graphics and licensing likely weren't the reasons why the game never made it to the SNES or Genesis, I'm really stumped as to why we never got a chance to play this one in our living rooms. Maybe it had something to do with the Fox cartoon that debuted around the same timeframe? Alas, as outdated as the character sprites may have been (remember, this thing was based on the one-shot "Pryde of the X-Men" pilot, after all), I don't think any gamer circa 1993 would've complained about staring down the Blob, Pyro and the White Queen with their Super Nintendo pads in hand. I reckon this one will just have to remain one of those old-school mysteries that plague gamers for eons, I'm afraid…

Wednesday, December 17, 2014

Why Console Gaming is Dead

The writing is on the wall ... television-based video game units are already an anachronism.


Friends. Family. The love of Christ. For many people, these sorts of things represent the holiday season. For me, however, Christmas has always meant but one thing: new console time, bitches.

As a kid, I looked forward to Dec. 25 all year long, because that meant, in some form or fashion, I was going to get my grubby little paws on some kind of newfangled gaming machine. Whether it was a plug-and-play device from Nintendo, Sega, Atari or Sony or a poorly-received handheld from SNK, I knew that my day was going to be filled with all kinds of new and improved interactive experiences. Indeed, it's the type of wonder and awe and sheer consumer bliss that makes us yearn for yesteryear, gloriously blind to the fact that everything around our Sega Saturns and Game Boy Colors at the time pretty much sucked.

To me, console gaming really hit its zenith in the 128-bit era. With the Dreamcast and Xbox, we had pretty much gone as far as we could with our gaming devices in terms of graphical horsepower. The units were more advanced than even the most high-tech arcade cabinets, which pretty much spelled the death of that quarter-fueled industry. With online play and DVD functionality, our units had become true multimedia devices, which in a way, marked the end of console gaming altogether.

The last console cycle -- the one with the 360, PS3 and Wii -- was ultimately the start of a slow, terminal decline for TV-based gaming. With a greater emphasis on Internet-based multifunctionality, the hardware itself became less about gaming than about peripheral interactions; hell, from 2010 onward, the Wii might as well have been called "the Nintendo Netflix."

With the costs of game development skyrocketing -- in the face of a global recession, no less -- a lot of the great third-party developers either went extinct or got gobbled up, and even the biggest heavy hitters of the gaming market decided to eschew innovation for proven capital generators. In short, that led to fewer game developers, fewer game publishers, and really, a lot fewer games worth a good goddamn. And then the smartphone and tablet gaming market exploded, with casual gamers turning away from mass-marketed console and handheld units to get their "Angry Birds" and "Minecraft" fix via Android. Sure, indie developers could still put their little microbrew games on Xbox Live and the PlayStation Store and whatnot, but when you're selling software at $5 a pop instead of $60 … well, it's pretty easy to see how that's a less-than-enviable prospect for would-be investors.

It's a perfect storm of shit for the console manufacturers. From a marketing standpoint, how do you convince people to spend $400 on a new gaming rig when you can play "Farmville" and "Candy Crush" on the iPhone you have superglued to your hand at all times?

Back in the 1990s, the hardware selling point was always the proprietary software. If you wanted to play Mario and Zelda, you got a SNES, and if you wanted to play Sonic and "Mortal Kombat" with blood in it, you bought a Genesis. That's why so many hardware manufacturers faltered in the heyday of "NBA Jam" and "Street Fighter II" -- with Sega and Nintendo, you knew you were getting a certain, exclusive brand quality. Meanwhile, who in the fuck knew what you were going to get with something called an "Atari Jaguar" or a "CD-i" or a "3DO"?

All of the would-be Nintendo and Sega usurpers faltered because they had nothing to market to gamers other than the technology. The same fate would have befallen the PlayStation, had it not been for Nintendo and Sega's own inability to woo the masses with 3D technologies; Sony won the next two console generations simply because it was able to equate its hardware brand with versatile software quality, and all it had to do to shake consumer confidence in its competitors was point out how they'd tried to sell you tech sans the applications just one model earlier.

Which brings us to today, and the PS4/XBONE/Wii U era. Looking at the Metacritic scores for each hardware unit, I noticed something fairly surprising -- namely, the near-total lack of proprietary brand differentiators. Yeah, Nintendo still has its "Mario Kart" and "Smash Bros.," but its top ten ranked games for the calendar year primarily consist of not just multi-platform games, but multi-device titles: offerings like "Guacamelee!" and "Child of Light" and "SteamWorld." There's even less proprietary software quality diversity on the Sony and Microsoft units: the Xbox One is glutted with PC ports like "Dragon Age" and "Minecraft," while the top three highest-ranking PS4 games for 2014 are actually re-releases of games that came out in 2013.

So, uh, what’s the appeal to consumers to shell out all that dough to play games they can probably already play on their home computers or iPads again? Almost brazenly ignorant of the hardware failings of the past, the big three today are once again trotting out “technology for technology’s sake” as their central argument for user adoption.

Thankfully, the market seems to have gotten over the "peripheral madness" that was kickstarted by the Wiimote and "Guitar Hero." The fact that Nintendo released a version of the 3DS sans any kind of 3D tech pretty much tells you that gamers' infatuation with novelty controllers has gone the way of Mitt Romney's presidential ambitions. Alas, while Microsoft and Sony have lost tons of money on their motion-sensing add-ons, they're not sure how and what to market to consumer audiences anymore; and outside of bilking parents into paying for its own Skylanders-style figurines, Nintendo ain't doing much of shit with its little tablet connectivity set-up, neither.

Call me crazy, but I don't think anyone in their right mind is going to buy "being able to play 'Super Metroid' or 'Shovel Knight' in HD" as a reason to plunk down an entire week's paycheck on today's uber-super-duper home units. Sure, there are "new" games being released, but most of them are perennial updates (Madden, FIFA, etc.), rehashes (Mario this, Halo this, Call of Duty this, etc.) or brown-and-grey war-shooting games that, to me at least, are utterly indistinguishable from one another. Seriously, if you showed me a photo of "Destiny" and "Titanfall" side by side, I'd have no idea which was which.

So outside of your gratuitous ADHD exploitation of middle schoolers (hey, a new Pokemon game!) and high school mass shooter fantasy fuel fodder, gamers today are left with precious little new to experience with their hardware. And don't even think about trying to sell me on that "Walking Dead" and "The Last of Us" serious-conscientious-gaming-as-philosophy nonsense; if I wanted to mull the peculiarities of man and the inherent sadness of existence, I'd read some Popper or Sartre, not play "Bioshock" and try to wring some sort of political statement gobbledygook out of fighting Scooby Doo monsters.

Kids today have migrated to phone and tablet-based games for a reason: the simplicity and instant gratification. No load times, no bullshit self-congratulatory, highfalutin plotlines trying to mimic summertime box office fare, and best of all, no television units required. All you need is steady Wi-Fi, a full battery and blood circulating to your thumbs -- who has the patience nowadays to find a television remote, anyway?

Call it perception bias if you must, but there's simply no way I can imagine any kid in 2014 having as much fun with his or her Wii U or PS4 as I did when I got my Sega Genesis in 1993. All of the applications and online ranked match modes in the world don't compare with the fundamental joy of being sucked into the experiential zone of new-and-improved software. There wasn't just a graphical leap in the jump from "Super Mario Bros. 3" to "Sonic 2," but really, an entire sensorial step forward; alas, I doubt many youths feel anything similar when upgrading from "Forza Horizon" to "Forza Horizon 2."

That, my friends, is the real appeal of console gaming -- the ability to pull the gamer into a complete state of concentration and focus for hours on end. Shit, I vividly recall playing game after game of "NHLPA Hockey '93," with the only worldly influence being the Dr. Dreadful gummy treats I was chewing on for sustenance. Today's hardware is literally built for multifunctional use, so you're never really absorbed into the game; you've got online messages popping up while you're playing, all sorts of shit going on with your controller, plus all the additional stuff going on with your phone and your headset and all that jazz. Instead of plunging headfirst into the software world, you're just plugging yourself into various hardware features, with the software serving as a central hub for your technological interaction.

Simply put, that's why gamers are abandoning "Grand Theft Auto" for "Fruit Ninja." The same way all of the old-school classics forced you to channel your inner Aspergers to reach a new high score, these endless runners are physically and mentally challenging consumers, which is something console games really haven't done since the 3D transition of the mid-1990s.

It's not that today's games are too complex; it's that they are needlessly, ostentatiously complex. If you put "Diablo III" in front of me, I'd get bored with its huge-assed game world in five minutes, but if I encountered an old "Raiden" cabinet, I'd probably play that sumbitch for an hour solid.

Truthfully, console games haven't really evolved since "Super Mario 64," when the technology allowed for a sense of depth, immersion and genuine exploration that simply couldn't be replicated in 2D space. Yeah, we've made prettier and deeper game worlds, but we're still doing the same shit in "Skyrim" that we were doing in "Shenmue" and "Mega Man Legends." After fiddling with analog nubs and running around trees for 20 years, perhaps it's not all that surprising that hardcore gamers want to return to the twitchy, linear madness of "Gunstar Heroes" and "Soldier Blade."

The problem is, you don't need brand new, super-expensive hardware to create that kind of gaming experience. Shit, my laptop is utter garbage, but it can still flawlessly play just about every neo-old-school game that's received praise over the last few years. Outside of mildly more refined graphics and smoother online deathmatches, there's really nothing new that the PS4 offers over the PS3, other than less variety and higher consumer costs. With the Wii U, you're just getting slight retweakings of the games you've played a million times before, except now with a battery-devouring tablet that makes games less engaging than they were on the Gamecube. And the following are reasons why the Xbox One is a worthy investment over an Xbox 360: none.

We're a generation of hyper-mobile consumers who prefer our multimedia in the cloud, so you really, really have to give us a good reason to plunk our asses down in front of a TV screen for more than twenty minutes.

With nothing but bland warm-overs and experiences ported over from more accessible platforms, the appeal of console ownership is nothing like it used to be. Multimedia purchasers today prefer to downsize their entertainment -- with a growing demand for completely atomized content, no less -- which makes these hulking, wire-tethered monstrosities in our living rooms, with their oh-so-passé disc trays and power sources the size of stereo speakers, instantly outmoded.

Console gaming, as a pastime, isn't on the verge of becoming obsolete -- it already is. Sure, sure, there are some purists out there who cling to their "Street Fighters" and their "Gears of War," but their ranks are ever-diminishing compared to all of the new "Dota 2" and "Flappy Bird" recruits.

With an ever-expanding mobile base, there's nowhere for the console market to go except downward; and considering the general consumer ennui toward these latest home units, I think we may have just hit terminal velocity on the freefall to antiquity, folks.

Sunday, December 14, 2014

A Fond Look Back at “America’s Funniest People”

In the early 1990s, Uncle Joey, Harley Quinn and that chick from the Whitesnake videos hosted an "America's Funniest Home Videos" rip-off … and somehow, its greatest cultural contribution became the catchphrase of a mutant jackrabbit/antelope creature.


I had a pretty shocking epiphany a few days ago, when I realized “America’s Funniest Home Videos” was STILL on the air. Granted, it's hosted by that dude from “Dancing with the Stars” instead of Bob Saget, but you have to admire the show’s staying power, especially considering the contemporary competition from sites like YouTube. Come to think of it, “AFHV” more or less heralded the arrival of the Tube a good 20 years in advance, celebrating all forms of visual jackassery, nincompoopery and cruel embarrassment before the World Wide Web even existed.

For those of you who never experienced the joy and whimsy of the 1990s as it happened, you’re probably in the dark as to what “America’s Funniest People” was. As the title implies, the show was indeed a blatant imitation of “AFHV,” right down to the casting of a fellow “Full House” star, Dave Coulier. However, the show, which ran for a few years on ABC, diverged from its inspiration in quite a few ways. Please allow me to spend a thousand or so words reflecting on a reality TV program that no one has even thought about in at least 18 years, if you don’t mind.

The key difference between the two programs was the format. While "AFHV" was all about user-generated content of people being hit in the testicles by accident, "AFP" was a program that highlighted people intentionally trying to be hilarious -- usually by telling really stupid jokes or doing really goofy things with their faces that are more stomach-churning than funny-bone tickling. Probably my all-time favorite "gag" -- and I mean that in more ways than one -- was this one dude who held up a tomato that he said was "not feeling well." The jokester then proceeded to squeeze the fruit's chunky red guts out to imitate vomiting; needless to say, there was a sudden surge in squished tomatoes at my house for the ensuing week.

When I say the jokes were lame, I mean they were bottom of the barrel, almost to the point of being Dadaist anti-jokes. One guy simply held up a couple of compact discs, flashed them like a drug dealer and lisped “see-deez?” while another explained to us what the term “hut” meant in Ebonics -- as in, if you dropped a hammer on your toe, “it hut.”

Of course, the show also allowed people to tell jokes with meatier narratives, but that wasn't the appeal of the show. The appeal of the show was watching people flick their lips open and show you their entire gum line -- in hindsight, the show probably had more in common with "Guinness Prime Time" than it did with "AFHV," actually.

Which brings us to the show's other big innovation over "AFHV." You see, "AFP" wasn't content with just one host -- you got TWO of 'em, including none other than Arleen Sorkin, aka Harley motherfuckin' Quinn from "Batman: The Animated Series."

So, Harley and Uncle Joey served as our curators, as various ugly people did their damnedest to make the masses chuckle. Periodically, the show would switch things up and have a few musical performances. To me, these always seemed out of place -- I mean, going from a dude sticking his face through a cutout of a small body and doing puns about hot dogs to a barber shop quartet interlude was just too spastic, even for the Sega Genesis generation. I’m pretty sure the Olsen twins made an appearance or two as well, but I don’t trust Google with my YouTube search history so I can’t confirm it for sure.

The scant clips of the program I’ve found online are pretty much the definition of cringe-worthy, a veritable treasure trove of armpit fart noise serenades and piss poor ventriloquism acts. Some of the segments were just so unbelievably abstract that I have a hard time believing the skits could be considered comedy and not some sort of subversive Duchamp social horror. Believe it or not, a television network once asked Wrigley’s Gum and Pepsi Cola to shell out money to sponsor a program that consisted of people doing weird things with hair dryers for two minutes straight -- shit, did the dude who directed “Gummo” serve as an executive producer on this thing or something?

Reading the Wiki entry on the show really blew my goddamn mind. Not only did the thing last longer than I thought it did (can you believe this thing had a four-year run, from 1990 to 1994?), but apparently, Arleen Sorkin is some kind of crypto-racist, claiming ABC shit-canned her from the show because they wanted someone browner -- although, I guess one could argue Tawny Kitaen was indeed noticeably less white, but DIGRESSION.

So, yeah, the show kind of changed formats around 1992. With the former David Coverdale humper splitting hosting duties with Coulier, there was a greater emphasis on lengthier skits -- like full-fledged parodies of "Dirty Harry" and "Indiana Jones" and whatnot -- and more recurring features, like a segment where people ran around playing really stupid pranks (like dressing up as gorillas and chasing people) and a little dunking-booth game show thing where kids got a chance to drown their parents for missing trivia questions.

Which leads us to what is probably the only thing anybody really remembers about the show -- the Jackalope.

Never heard of the Jackalope? Well, maybe this little quip will refresh your memory: “Fast as fast can be, you’ll never catch me!”

Fucking everybody over the age of 20 knows that line, but I'm guessing a good three-quarters of those in the know have no clue where it came from. Well, wonder about your senility no more, because that was the catchphrase of a Dave Coulier-voiced rabbit-antelope puppet who starred in live-action, "Looney Tunes"-style segments on "AFP" every week.

As with seemingly everything else on the show, the Jackalope skits in hindsight seem to be self-reflexive horror-comedies. Perhaps it is for the best if I let the segments below speak for themselves.



Perhaps an oblique homage to "The Toxic Avenger," the Jackalope skits all seemed to have some sort of aggressive pro-environmentalism bent to them. Shit, there was even one skit I vividly remember where the mascot actually was transformed into some muscle-bound freak of nature by toxic waste -- alas, some things are still a bit too obscure to make the Dailymotion rounds, I take it.

I think the show breathed its final breath around the same time O.J.-A-Mania struck, and it was promptly all but forgotten for about 10 years afterwards. The show had a light revival when old episodes began airing at odd hours on TBS, but unless you were just hanging out in front of the tube at noon on Wednesdays during George W.'s first term of office, you probably missed out on the re-runs.

Today, "AFP" lives on only in dust-coated VHS cassette spools and the foggy memories of twenty-somethings who kinda remember the program existing, but not really. The bits and pieces of the program out there in Internet land really aren't exactly what I would call nostalgia fuel, as the show comes off as more unsettling than uproarious -- odds are, if you watch it now, you'll have the same thoughts I did, which were "man, this shit was fucked up, and why didn't I turn to drugs after being exposed to this and the AIDS episode of Captain Planet?"

Still, it’s a relic from a bygone-era, and if you’ve got an hour or so to spare, I guess you could find less productive ways to waste your life -- like spending an entire weekend scouring the ‘net for information on a game show that used to have monsters chasing contestants through a supermarket.

Tuesday, December 9, 2014

Five Southern Traditions Nobody Talks About

The common experiences everyone in Dixie shares … that they don’t want the rest of the country to find out about. 



People tend to have one of two perspectives on the Southeastern United States. One perspective sees a particularly brutish, ass-backwards anti-culture, where racism and institutional classism run rampant. The other depicts the Southland as a pastoral, picturesque wonderland, a place where all the old charms and values of yesteryear linger on as an affront to modernity itself.

As always, the truth is really "none of the above." Indeed, Dixie in the 21st century is both a goulash of widespread poverty and ostentatious suburban wealth -- a land filled with methamphetamine and wilding-out young'uns, and manufactured paradises where respectful youths sip sweet tea on antebellum porches and everybody shows up on time for the annual downtown Christmas parade.

But, there are other time-honored traditions those south of the Mason-Dixon line aren't too fond of discussing with outsiders. You know, the southland ain't all gravy biscuits and crazy ass outsider art; here are five long-held Dixie traditions you probably won't hear Tennesseans or Louisianians boasting about on your next visit to Music City or 'Nawlins...

Watching Pro Wrestling with Your Racist Granny

It’s an inarguable fact: all people above the age of 62 in the American South are racist. I’m not just talking about white senior citizens, I mean all senior citizens: whether you’re the color of mayonnaise, Nesquick or Heinz 57, if you’re eligible for Social Security benefits in today’s modern South, you are invariably a hate-filled, rancorous ethno-supremacist.

If you’ve ever wondered why Southern people, specifically the senior crowd, seem to have such a penchant for pro wrestling programming, that’s pretty much the reason why. Professional wrestling, by and large, is a gigantic universe of crude, cruel and borderline offensive racial stereotypes, all battling for metaphorical ethnic supremacy using fake violence. In fact, I probably first heard a majority of the five-star slurs thanks to my granny’s utter disdain for the Orient Express, Tito Santana and especially Ron Simmons, who had the honor/misery of becoming the first black WCW World Heavyweight Champion.

Over the past 30 years, it’s amazing how little the professional wrestling industry has done to curb all of the race-baiting. With a contemporary cast of characters that includes a Muslim terrorist, a gang of lawnmower-riding Mexicans and an African American tag team known as “Cryme Tyme,” it’s arguably more ethnically charged today than it was in the heyday of Sgt. Slaughter, the Iron Sheik and Saba goddamn Simba.


Having Relatives Show You How Big Their Dumps Are

The southern man takes great pride in even his most meager of accomplishments. That’s why, in the era of the Xbox and the iPad, horseshoes and cornhole remain astoundingly popular pastoral activities south of the Mason-Dixon line.

Combining that nearly biological need to compete with a dearth of recreational resources, it’s probably not too surprising that southern folk invent some wildly unorthodox ways to outdo one another. As in, engaging in “let’s see who can pee the furthest” contests, which were indeed quite the popular neighborhood activity in my carefree days of youth.

But don’t think this is something that only the kids partake in. Oh, no siree, Bob. For reasons that completely defy explanation, I’ve noticed that true Sons of the South take enormous pride in the size, length and texture of their own excrement; I’ve been yanked from slumber by more than one adult relative so I could marvel at their gargantuan turds coiling around the commode bowl. I had one uncle who even kept a Polaroid scrapbook of his own shit -- he was utterly convinced that one of them had to break the Guinness World Record for lengthiest poo, and eagerly awaited the day they mailed him a check for a million dollars.

Being Drunk at Wal-Mart 

Getting sloshed is definitely a Southern way of life. Likewise, frequenting America’s number one retailer is another time-honored tradition for the sons and daughters of Dixie. Therefore, visiting Wally World while inebriated just makes all sorts of sense, in a way that makes no sense at all, primarily because you’re too shit-faced to know you’re trying to carry on a conversation with an unattended checkout lane.

In every shitty small town in the South, the Wal-Mart is the proverbial center of the universe. In terms of footprint and daily volume, it’s almost always the biggest communal gathering place in the village; what the watering hole is to antelopes on the African savanna, Sam Walton’s monolithic discount department store is to poor rural and exurban people of all shapes, sizes and hues.

Growing up in a little burgh that was just then developing a taste for the crystal meth, me and my school chums spent many late evenings and early mornings just ambling down the aisles of Wal-Mart while intoxicated. The idea, you see, was to get rip-roaring drunk on cheap-o vodka in the parking lot, and all of a sudden, the local depot of consumer misery turned into some sort of post-utopian wonderland, albeit one with edited gangsta rap CDs. Looking back on it, I'm not really sure what the appeal of drunkenly stumbling down the canned tomato sauce section at two in the morning was supposed to be, but it remained a popular pastime, nonetheless. Exemplifying the importance of this abstruse regional rite: I recently ran into a kid I hadn't seen in literally 10 years, and the first thing he said to me? "Hey, Jimbo, remember when we used to get drunk at Wal-Mart back in the day?"

Anticipating a Full Blown Race Riot at School

The southland is pretty much a racial powder keg, waiting to explode at any minute. The strange thing is, despite having the most profound historical track record of racial unrest in the country, the modern southland remains its most racially diverse region. In fact, the 12 states with the highest concentration of African-American residents are all in the American South, with the racial composition of local governments in Atlanta, Memphis and Birmingham looking suspiciously similar to that of your average pro basketball team.

So, let’s do the mathematics on this one. It’s a dyadic political power struggle, rooted in 300 years of racial fury. People are just jonesing to let that undercurrent of ethno-rage froth up like magma, and really, all it takes is one teeny, tiny incident to set off an eruption.

At my middle school and high school, our team mascot was a palette-swap of the Ole Miss Rebel -- a cartoon character clearly designed to resemble a slave owner of yore. Well, one year, our long-tenured (and white, of course) principal stepped down, and our new head honcho was an African-American. With the white folks silently uneasy, the shit really hit the fan when a new design for the team mascot came out … and, chuckles aplenty, the new logo was a mulleted brigadier general with a facial complexion a few shades south of “acceptably olive.” It may sound stupid to the rest of society, but that little decision almost led to our small hillbilly hamlet turning into Ferguson, Mo. A week later, a white kid slung an eraser tip at a black kid in geometry class, and holy shit, everybody in town thought the National Guard was going to have to come in. Of course, such tempers always simmer down to a light boil, but that friction is an omnipresent element in the Southland -- one of the quaint joys that kids in Caucasian utopias like New Hampshire will never, ever comprehend.


Fearing That You May Have Unintentionally Engaged in Incestuous Activity

Yeah, yeah, we all know the stereotype, which was more or less culturally codified by countless episodes of “The Jerry Springer Show” back in the late 1990s. Southern folk, for whatever reason, have a peculiar taste for their own kin, with cultural depictions running the gamut from innocuous first-cousin French kissers all the way up to full-blown sibling-humpers.

While that little stereotype is erroneous for several reasons (historically, incest has always been an activity of the upper crust and not the lower mantle -- lest we forget, Eleanor was a Roosevelt way before she married FDR), there is an uncomfortable nugget of truth to the longstanding belief. You see, it’s not that Southern people actively seek out their own blood to bone, it’s just that so many people in small towns are somehow genetically linked that really, you’re probably only four or five leaps away from encountering some kind of distant relative.

That’s why, no matter who you’re dating in the little burghs, there’s still a slightly above-average chance you’re re-stirring your own genetic batter. I had one friend who was seduced by his sister’s hot girlfriend from out of town, only to run into her at an extended family reunion a month later. To be fair, it was a sizable leap in genetic material -- we’re talking third or fourth cousin, once removed -- but they still shared a common forebear.

Alas, it’s a stigma the South must continue to live with, if only because of simple geographic limitations. But as a positive? That means that for the next few decades at least, you can actually use an oblique reference to “Mitochondrial Eve” as a pickup line in Ol’ Dixie.

Sunday, December 7, 2014

“The Hunger Games: Mockingjay Part One” Review (2014)

Who’d have thunk junior high schoolers would have such a penchant for paramilitary politics?


Watching “Mockingjay,” I just had to keep asking myself what the popularity of “The Hunger Games” really means for us as a culture. To kids today, what does the dystopian hellhole of Panem actually symbolize -- is it a metaphor for the police state, an allegory for our collective loss of privacy in the Facebook era, or the still palpable augur of the Great Recession and a life shittier than the ones their parents lived?

The quite literally mock heroics of Jennifer Lawrence, I believe, indicate but one thing: goodness, do American young ‘uns in the year of our lord 2014 really, really want to taste war and crushing poverty.

Whether you choose to view Suzanne Collins’ much-beloved series as “Battle Royale” lite or paramilitary pandering to the “Shelby Woo”-weaned masses, it’s undeniable that “The Hunger Games” franchise has struck a chord with Gen Y. Politicos on the left and the right both say the books and corresponding films are indicative of tweens leaning toward their ideology, but mayhap kids today ACTUALLY do want to find themselves thrust into guerrilla juntas and daily life-and-death struggles the same way Miss Everdeen does?

As I’ve written before, good old-fashioned, wide-scale warfare seems to be the best cure for the adolescent blues, and in my eyes, the success of “The Hunger Games” is all the evidence I need to support the theory. Our gilded youth are sick and tired of being coddled at home and going through the rigors of post-recession ennui; they want to form a human wall and overpower security guards and blow up dams, by golly, and all Mrs. Collins is doing is giving the peoples the celluloid fantasy they thirst for.

I haven’t seen the first “Hunger Games,” but I did see “Catching Fire.” I’m probably missing a whole lot not having seen the 2012 film (considered by some retards to portend the Sandy Hook school massacre, if you can believe it), but from the gist of the last film, I reckon I have a pretty good grasp on what’s going on heading into this third flick.

So, Katniss and Peeta were forced to compete in “The Running Man, Jr.” and after they won, they became national celebrities and marketing tools for the Capitol -- the big, super evil federal overseer that puts the boot to any poverty-stricken dissidents who even think about asking for potable water. But, at the end of the last movie, Katniss is rescued by some sort of anti-government revolutionary group, led by Julianne Moore in a bad wig and Philip Seymour Hoffman, back when he was still not dead and stuff. By the way, with those eyebrows, holy shit, he should have played John Madden if they’d ever made a biopic about him.

“Mockingjay,” then, picks up precisely where the last film left off. Katniss has been recruited by the resistance to become the literal face of their propaganda efforts, which basically entail her poorly reading lines in front of a green screen while a dude in a wheelchair edits it in Photoshop. Meanwhile, Peeta -- the albino love interest who’s so pale he makes Ed Cullen look like a Puerto Rican -- is up there in the Capitol, where Stanley Tucci feeds him lines in bullshit government-fabricated television interviews.

So, the resistance decide to take a page out of the oppressor’s book and cook up their own wartime propaganda, even hiring this blonde chick with a Trent Reznor-circa-1990 haircut and a dude without a tongue to walk around filming her in the middle of war zones while she points at rubble and makes aircraft explode with one arrow. Oh, and at one point, Woody Harrelson shows up, and he’s all drunk and stuff. Or talking about wanting to be drunk and stuff, which I suppose wouldn’t be all that distinguishable from the actual words that come out of his mouth on a daily basis.

So, the Capitol decides to order an air strike on the resistance compound, but they can’t really find it, and there’s this part where Katniss’s sister tries to find a cat when the airlocks are about to slam shut, and when the bombing is over, there’s a whole bunch of white roses everywhere, which I think is supposed to be symbolic and stuff.

Then, Katniss records a song that sounds a lot like a tribute to Jim Crow-era lynchings, which apparently galvanizes the downtrodden into taking up arms and sticking it to the man. Eventually, the resistance decide to storm the Capitol and rescue Peeta by knocking out its power grid, and there’s this one part where Katniss and President Snow -- played by Donald Sutherland, who, by some voodoo spell, is still alive -- have a little chat, and everybody thinks he’s going to capture the raiders, but they manage to get out unscathed. Except for one catch: when Peeta awakens, he’s all psychotic and stuff and he tries to strangle Katniss to death. Apparently, the feds gave him some crazy juice and subjected him to the old “Clockwork Orange” treatment, so now he’s been all reverse psychology-fied to hate his girlfriend and stuff.

And on that note, the movie ends, leaving the door wide-ass open for “Mockingjay Part 2,” which will undoubtedly make a ton of money at the Cineplex next autumn.

All in all, I thought “Mockingjay” was a pretty decent movie, but a bit of a step down from “Catching Fire.” I thought the mock Gaullism going on was a bit much, and I strongly preferred the sillier violence of the last flick -- complete with blood storms and stinging insect attacks and whatnot -- over the blunter bullet-to-the-skull fascist mayhem on full display here.

Alas, I have a hard time chastising any movie that glorifies anti-statist furor to middle schoolers, and if given the choice between the skirts from “Frozen” and Katniss’ federalist-slaying ass? Yeah, I’m going to encourage my daughter to model herself after the character who has no qualms about exploding public infrastructure, thank you very much.

My Score:



Two and a half tofu dogs out of four.