Thursday, November 26, 2015

Sequels, remakes, and reboots (oh my!)

There are a lot of sequels, remakes, and reboots on the table in Hollywood land right now. I have to be honest and admit I don't really grok the reboot concept. To me, reboots are just remakes. I guess I kind of see it in the Star Trek series, insofar as the latest two movies "reimagined" the Star Trek universe by killing off Kirk's father, thus delaying his rise to the captaincy of the Enterprise (but he does get there, all the same). Still, I don't know that "reboot" is a needed characterization. Of course, some say that reboots refer to a series of films, alone. Thus, one remakes a single film, but reboots a series. Sequels we all understand, I think. Though I guess maybe I should have included "prequels," as well (but that would have ruined my Wizard of Oz reference).

Regardless, the point is that these are the kinds of movies that seem to be dominating Hollywood right now and have been for some time. I have heard or read about a number of the upcoming ones, many slated to go into production soon, others still trying to get all their ducks in a row, but I have to admit that I had no idea until yesterday that a Point Break remake was coming out this year (it's scheduled to be released on Christmas Day). Only just recently, I watched the original with my fifteen-year-old son (I've been taking him on a tour of the "classics" for some time now). And not all that long ago, I heard that the Rock was interested in doing a remake of Big Trouble in Little China (one of my favorite goofy action movies of all time).

I know a lot of people kinda get annoyed at all of these productions, especially people with artistic/creative bents (like many of my writer friends at AbsoluteWrite). The basic argument against them, especially reboots and remakes: the original movie (or movies) was great, so what's the point in doing it over again? The point, of course, is to make money. Still, the response to that is simple as well: there are plenty of original movies that can still be made.

Personally, new versions of older films (or new sequels to the same) don't bother me much in theory. If the story is good and lends itself to a new telling, why not? After all, some of my favorite movies of all time are actually remakes, like Ben-Hur (1959), The Magnificent Seven (1960), and The Thing (1982). And apparently, there is another remake of The Magnificent Seven in the works starring Chris Pratt. The point is, remakes can be good things, if done well.

The problem is, in practice they are not always done well, are they? Still, so what if a remake sucks? It doesn't diminish the original (or the previous remake that was good) at all, in my opinion. And of course plenty of non-remakes are released that suck mightily, too. Ultimately, the cultish kind of response to remakes—"how dare they remake that movie"—is kind of empty-headed, I think. To be sure, I've engaged in it. When I first heard about the Point Break remake, this was the first thought that flashed through my mind.

But then I slowed down and thought it through. I had the same initial reaction when I heard about Rise of the Planet of the Apes. Then I took my kids to see it. It was great. So there is always that possibility in a remake (or a reboot): it could be as good or better than the original. Still, some might say—given the number of awful remakes and reboots (I include the new Superman and Spider-Man movies in this group, along with the new Total Recall, of course)—the probability is too small to justify this barrage of remakes, reboots, and sequels coming out of Hollywood (which currently includes remakes of The Birds, The Wild Bunch, and WarGames, by the way). A fair point, but I think there's something else at play here.

Once upon a time, movies could only be seen in movie theaters. With the advent of television, that changed somewhat, as some movies would eventually make their way to the small screen, albeit after some extended period of time. Gone With the Wind, for instance, premiered in theaters in 1939. It wasn't seen on television until 1976. Now, it's on every year. Think about that for a moment. The movie was re-released a number of times after its original run, to be sure, but for people born after its release (or who were very young at the time), it was something of an unknown quantity.

Then came home video. Then, finally, the video-streaming world of today. One can now watch pretty much any movie at any time. And this has bred a familiarity with themes, I think, to the extent that filmmakers really know what people like to watch. Not only that, such watching habits prep people for similar films. Remakes, reboots, and sequels make sense, not only financially, but also—oddly enough—artistically. Even the players, the actors, are falling into this line of thinking. The big stars have favorite movies just as we all do and the non-stop availability of the past probably influences their tastes just as much as it does ours.

That said, there is an underlying banality to all of this, a kind of red-lining for creativity in general. And there is that creeping notion of "everything's been done before." It all suggests a rather blah future for movie-making, insofar as the hype becomes more important than the actual product. Still, if the product is good, it is good, no? I guess we'll have to wait and see...

Wednesday, November 25, 2015

Shackles of terminology: Clinton hoist with her own petard

Recently, I detailed how Hillary Clinton attempted to give the impression that Marco Rubio's use of the phrase "radical Islam" was somehow a reference to Islam as a whole or to all Muslims in general. She was ably assisted in her dishonesty by both Jonathan Chait and Peter Beinart, who happily wrote articles that misrepresented Rubio's position in service to their own partisanship.

The whole incident reflects an obsession with terminology by many people on the Left, and—to be fair—some on the Right. This obsession has led to a steady expansion of the theory of "microaggressions," to the extent that any sort of reference that can be linked to the dominant social group is labeled a microaggression almost as a matter of course by people obsessed with these things.

Supposedly, microaggressions are seemingly banal statements or actions taken by the socially dominant group (i.e., white males) that actually insult or otherwise degrade a member of a marginalized group. Many of the examples of microaggressions one finds on the 'net really aren't microaggressions at all, with respect to the actual theory, though. They're obvious instances of disparaging or ignorant statements (or actions). It's the ones that aren't so obvious that actually fit the definition. Here's a handy-dandy chart of microaggressions. Again, I think some of the examples really don't fit the bill, as they're not "micro" at all, insofar as they are obviously intended to belittle. But others, well that's where the argument really exists.

But the problem with this theory is that it lacks any means of verification. One can posit that a microaggression has occurred, but this is just not empirically verifiable, mostly because the evidence for a microaggression is all in the victim's head, nowhere else. Consider the example from the above chart, "There is only one race, the human race." Supposedly, that's a microaggression because of how it makes someone feel when it is said to them. But suppose it doesn't bother a particular someone at all, even though they are a member of a marginalized race, because they feel the same way? Thus, the theory falls apart, since there is no "aggression" to speak of. Or suppose a member of a marginalized race says the same thing to a member of the dominant race. Again, the theory falls apart, since it lacks any sort of consistency.

But even without the hokum of microaggression theory, this obsession with terminology still reveals itself, as various special interest groups make a point of complaining about terms that they find offensive, and members of the media latch on to terminology to "expose" one thing or another. On the Right, there was the nonsense of "homicide bombers," and of "freedom fries," lame attempts to undo clearly understood terminology for no good reason.

On the Left, there is the constant preoccupation with terminology that references racial/ethnic groups; some terms are acceptable one minute, then not acceptable in the next, simply because someone divined a "problem." Then there are instances like the above, where a clearly understood term—radical Islam—is intentionally misrepresented for political purposes. And politicians like Clinton are happy to make use of these things, as a matter of course.

Then came yesterday, wherein Hillary Clinton caved in to pressure from some far Left groups and apologized for her use of the term "illegal immigrants" at a recent campaign event.

Wait, what?

What's wrong with "illegal immigrant"? For point of reference, here is Clinton's use of the term in context:
Well, look, I voted numerous times when I was a senator to spend money to build a barrier to try to prevent illegal immigrants from coming in. And I do think you have to control your borders. But I think that it’s also true that we need to do more to try to number one, deal with the people who are already here, many of whom have been here for decades. Because it is just never going to happen that we’re going to round-up and deport 11 or 12 million. I don’t care how tall the wall is or how big the door is, that is never going to happen. And I think it is an unnecessarily provocative thing to say. We need to secure our borders, I’m for it, I voted for it, I believe in it, and we also need to deal with the families, the workers who are here, who have made contributions, and their children.
Note that her use of the term was strictly in reference to people attempting to enter the United States illegally. She doesn't use the term in reference to so-called "dreamers" (undocumented immigrants brought to the country as children), nor even to illegal immigrants who have been living and working here for some period of time. She only uses it in reference to people actively seeking to skirt the immigration process.

Millions upon millions of people have entered the United States through the proper immigration channels. Such people are or were immigrants, as a matter of definition. But with respect to how they entered, what term better captures this than "legal immigration"? None. Because there's a contrasting means of getting into the country, where one does not use the proper channels. And what better term captures this than "illegal immigration"? Legal immigrants versus illegal immigrants, this is a rather simple dichotomy, no different than legal driver versus illegal driver, legal drug versus illegal drug, legal bookmaker versus illegal bookmaker, etc.

Yet somehow, the use of the word illegal in reference to immigrants is unacceptable. From the above piece:
"Not just activists but immigrants in general see the term 'illegal' as a pejorative ... it's meant to dehumanize people," she [Astrid Silva] said. "That is why news organizations like AP and CNN have added to their style manual guidance against using the terminology."
It's true, the AP has limited the use of the term "illegal immigrant." Here's the relevant guideline (as of 2013):
illegal immigration Entering or residing in a country in violation of civil or criminal law. Except in direct quotes essential to the story, use illegal only to refer to an action, not a person: illegal immigration, but not illegal immigrant. Acceptable variations include living in or entering a country illegally or without legal permission.
If I read that guideline correctly, people who entered the country illegally engaged in illegal immigration, but are not illegal immigrants, I guess in the same way that someone who robbed a bank engaged in criminal behavior but is not actually a criminal? Ridiculous. Are we no longer adults? Do words no longer mean what they actually mean?

And Hillary Clinton has kowtowed to the pressure, has apologized for speaking in plain English, for using a clearly understood term that disparages no one, that is simply definitional, nothing more. That's leadership...

Tuesday, November 24, 2015

Nader man-splains, Yellen idiot-responds

So...many people may have missed Ralph Nader's "Open Letter To Chairwoman Yellen From the Savers of America," that he published on Huffpo at the end of October. In general, I don't have much use for Nader, but he does hit on some good points in the letter. A snippet:
We follow the reporting on your tediously over-dramatic indecision as to when interest rates will be raised - and no one thinks that when you do, it will be any more than one quarter of one percent. We hear the Federal Reserve's Board of Governors and the various regional board presidents regularly present their views of the proper inflation and unemployment rate, and on stock market expectations that influence their calculations for keeping interest rates near-zero. But we never hear any mention of us - the savers of trillions of dollars who have been forced to make do with having the banks and mutual funds essentially provide a lock-box for our money while they use it to make a profit for their firms and, in the case of the giant banks and large mutual funds, pay their executives exorbitant salaries...
That's quite fair, in my opinion. The Fed has been using interest rates as a tool to manipulate the economy, mostly to cover up problems therein, for far too long. In my opinion, it shouldn't be allowed to touch the rates, except in extraordinary circumstances, because Nader is right: it's beyond useless to save money right now and has been for over a decade, truth be told (the previous Administration was doing the same sorts of things, after all).

But beyond him just being right about this, he also notes something of vital importance, the assumption by the Fed that there is a "proper" inflation rate, a "proper" unemployment rate, that these things are there to be controlled, can be controlled, and should be controlled. And of course, the people in charge assume they know what the proper rates are and assume they know exactly how to produce them.

This has been a major problem in the field of economics as relates to government policy for a long time now. Many of the heavyweights (well, they assume they are heavyweights) in this field, from officials like Bernanke, Yellen, Geithner, and Lew, to people in the private sector like Krugman, Summers, and Corzine, think the economy is a system that can be easily controlled. They wrongly believe that government policy with regard to interest rates and the money supply (and tax rates and other policy tools) represents simple inputs into the system, to the extent that changing these things will have a corresponding impact on the economy, will lead to entirely predictable results.

Interestingly enough though, the above portion of Nader's letter is not what received the most attention. Rather, it was this portion, near the end (my boldface):
Chairwoman Yellen, I think you should sit down with your Nobel Prize winning husband, economist George Akerlof, who is known to be consumer-sensitive. Together, figure out what to do for tens of millions of Americans who, with more interest income, could stimulate the economy by spending toward the necessities of life.
Annie Lowrey at New York Magazine called the above bit "mansplaining":
Worst, Janet Yellen, with her small lady brain, has failed to grok that low interest rates harm savers. She'd better sit down with her husband so he can explain that to her!
Of course, this isn't the first time Nader has been called on the carpet for being misogynistic. His rant about Hillary Clinton to Larry King oozed sexism, there's no way around it. From it:
King asked Nader about recent accusations that Nader has lobbed at Clinton, namely that she evinces a “shocking militarism that is a result of trying to overcompensate for her gender by being more aggressive and macho,” and that she’s “reversing the tradition of women of peace.”

“But isn’t she moving more toward the left?” King asked.

Nader, in a performance that most of us who lived through the 2000 election remember, scoffed at this idea, painting Clinton as a warmonger who would put any Republican to shame. “She’s never seen a war she doesn’t like. When she was on the Senate Armed Services Committee, she never saw a weapon she doesn’t like,” Nader ranted.

Then he went full-blown sexist, decrying “the tradition of these macho women who, when they finally get responsible positions—like Madeleine Albright, Condoleezza Rice—it’s like they have to out-macho the men instead of saying, ‘We come from a peace advocacy tradition.'” He then went on to spell out how women, in his eyes, have a unique responsibility to be peaceable, because of the historical origins of Mother’s Day.
Yuck. That's pretty ugly stuff. One could argue that when Nader tells Yellen to "sit down with...[her] husband," he's merely suggesting that she get some help on the issue from a highly respected economist. But given Nader's past misogyny, I think such a reading would be a little naive; he was being a sexist pig in his letter to Yellen, in my opinion, there's no question about it. Calling it "mansplaining" is completely fair.

That said, Nader's complaints are also still fair, are not undone by his obvious sexism. And surprisingly, Yellen decided to respond to Nader, as detailed by Ylan Q. Mui at WaPo. With regard to the sexism, Yellen takes the high road and ignores it. With regard to the meat of Nader's letter, Yellen says the following:
Would savers have been better off if the Federal Reserve had not acted as forcefully as it did and had maintained a higher level of short-term interest rates, including rates paid to savers? I don't believe so. Unemployment would have risen to even higher levels, home prices would have collapsed further, even more businesses and individuals would have faced bankruptcy and foreclosure, and the stock market would not have recovered. True, savers could have seen higher returns on their federally-insured deposits, but these returns would hardly have offset the more dramatic declines they would have experienced in the value of their homes and retirement accounts. Many of these savers would have lost their jobs or pensions (or faced increased burdens from supporting unemployed children and grandchildren).
Note the absolutes in Yellen's response: "unemployment would have risen," "home prices would have collapsed," "savers would have lost their jobs or pensions," etc. All of these things would have happened, if the Fed hadn't acted as it did, end of story. There's no room for "might have" or "could have" here, except in one case: "savers could have seen higher returns." The arrogance is striking. Yellen supposes she has absolute command over the economy, that it is controlled wholly by Fed policy, that such policy had perfectly predictable results and that any change to past policy would have had drastically different (though still predictable) results.

Beyond the arrogance, this is just stupid, the supposition of absolutes in all of this. But again, Yellen is hardly alone in this regard, as I noted above, so her sex is entirely inconsequential (before anyone accuses me of mansplaining). The economy is an open, complex system. Changing things like interest rates will impact it, no doubt. And in the short term, maybe the impact can be anticipated to some extent. But Yellen is looking at the long term here as well. Assume, for instance, that the Fed hadn't systematically slashed the interest rate starting in 2008 (better yet, assume it hadn't done so starting in 2001). Perhaps unemployment would have been even higher in the short term, perhaps home prices would have collapsed even further and the market would have dropped even more. Corrections happen. The business cycle is a reality, Bill Clinton's arrogant claims to the contrary notwithstanding.

The assumption that the economy could not have adjusted and recovered without action by the Fed is untenable. Again, such an assumption reflects an arrogance on the part of some economists that reality simply does not justify. They didn't see any of the problems coming, after all. Their crystal balls are defective and always have been, because they proceed from a flawed set of assumptions grounded in an unreality: that the economy can be defined and controlled by mathematical functions, when it absolutely cannot (because the economy responds to billions of individual choices and to outside events, none of which allow for mathematical certainties).

So yeah, Nader is mansplaining. And it's fair to chastise him for such overt sexism. But Yellen's response doesn't counter Nader's other points in the least. Because it's a response built on arrogance and stupidity.

Monday, November 23, 2015

A Charlie Brown Christmas: 50 years old, still relevant

In just a little more than two weeks, the TV special A Charlie Brown Christmas will be fifty years old. It premiered on December 9th, 1965 on the CBS television network. The USPS is honoring the special with a series of stamps it unveiled earlier this year. And ABC—which now owns the broadcasting rights to the special—will be airing an hour-long retrospective about the special entitled It's Your 50th Christmas, Charlie Brown on November 30th, followed by a digitally remastered showing of the special, itself.

Of course, the special is commercially available these days pretty much everywhere, from grocery stores and drugstores, to stores like Best Buy and Amazon. I myself have a copy, as it's a part of the Peanuts 1960's Collection, which I've owned for a number of years now (also, the 1970's Collection).

Now I bought that collection so my kids could enjoy the shows any time they wanted to (okay, I like them too), but I've noticed something, with regard to the Peanuts holiday specials: my kids still want to watch them when they're on TV. In fact, they prefer to watch them on TV, even with the attendant commercials. I think it has something to do with the watching turning into a family event (they want me to watch with them), since we have to watch on a certain day at a certain time (no DVR-ing!). And that's a good thing, I think, especially in today's world where video on demand has basically taken over, whether one is using a service like Netflix or Amazon Prime, or one is using a DVR. There's something to be said for waiting and anticipating, after all. There always has been and maybe we are—as a society—losing sight of that a little bit.

What better way to remind ourselves and our kids of this than scheduling a watching of A Charlie Brown Christmas?

But the special is full of many other worthwhile reminders, as well. And when thinking about them, remember that the special came out fifty years ago. One of its chief themes is the commercialization of Christmas, a theme that is regenerated year after year after year. Yet, we have managed to never actually take a step back. People complain incessantly about this commercialization, but someone is going shopping on Black Friday. Someone is buying Christmas decorations and the like in November (if not earlier).

Peanuts Christmas stamp from the USPS
There is also the related theme of the loss of the True Meaning of Christmas. In the end, Linus takes it upon himself to remind everyone what this is by reciting the Annunciation to the Shepherds from the King James version of the Bible. Charlie Brown thinks he understands, but really doesn't, not until his friends help him save his small Christmas tree, for it is their goodwill (that they take a long time to find, no doubt), their love and fellowship that "saves" Christmas. Yet, as everyone knows, Christmas often seems to be a time of short fuses, so to speak, when people exhibit even less patience than usual. There is perilously little fellowship on display during the holiday season, in my experience.

So as I'm writing these words, the idea of being able to say "let's really watch A Charlie Brown Christmas this year, let's really try to heed the lessons therein" is where I thought I was going. Yet, I could have said the exact same thing for the past forty-nine years. Because the lessons were as valid in every one of those years as they are today. Again, we haven't managed to do anything differently, by and large. Some might argue that we've actually gone in the opposite direction, have made things worse.

While all that's true, I'm still thinking it's a good idea to say it, so I will:
Let's really watch A Charlie Brown Christmas this year, let's really try to heed the lessons therein.
Because maybe it's the reminders, the trying—even if seemingly futile—that keeps us from going over the edge, that helps us pull ourselves back from the brink.

One for the road in Homestead: Jeff Gordon leads a few laps, walks away

The race in Homestead—the Ford EcoBoost 400—started a little late yesterday, thanks to some typical South Florida rain. And that was maybe a good thing for those of us watching at home (I contemplated going, but frankly I wanted to watch the coverage as much as I wanted to watch the race). Since NBC had a lot of time to fill, almost an hour and a half, they spent some time tracking Jeff Gordon's pre-race activities and showed the driver intros in full (a first for any broadcast of a NASCAR race, I do believe).

The intros were, from my perspective, quite informative. As each driver was introduced and walked out on stage, there was generally some applause. The more popular drivers received quite a bit more, of course, drivers like Johnson and Earnhardt. And the last four introductions of the Chase finalists—Gordon, Harvick, Truex, and Kyle Busch—received a lot of applause, most especially Gordon (for anyone living on Mars, this was Gordon's final race). But apart from those guys, the driver who received the most applause by far was Matt Kenseth. And in contrast to that, both Joey Logano and Brad Keselowski were roundly booed by the crowd, especially Logano (and to his credit, Logano smiled and accepted it).

This all stemmed from a series of incidents between Kenseth and the Penske teammates Logano and Keselowski, which culminated in Kenseth intentionally wrecking then-race leader Logano at Martinsville (the Goody's Headache Relief Shot 500), effectively eliminating Logano from the Chase. I could go all fanboy here and detail the series of obnoxious acts by the pair of Penske drivers that created this situation, but it's enough to simply note the applause Kenseth received from the fans, both at Martinsville when he wrecked Logano and at Homestead when he returned from his two-race suspension. But the more significant consequence here is probably that Kenseth opened a door at Martinsville for other drivers, and Jeff Gordon stepped through it, winning the race and securing a spot in the final four at Homestead.

This is what Homestead was really all about: Jeff Gordon's final race in the Sprint Cup series, his final race for Hendrick Motorsports in the legendary 24 car. Kyle Busch won the race and the Chase, it is true. And his story, coming back from a broken leg to qualify for the Chase and then actually winning it, is a great one. But it's going to be forever overshadowed by Gordon's presence.

And that's as it should be. There are a lot of stories out there now on Gordon, a lot of retrospectives on his incredible career. Gordon has been racing the 24 car for Hendrick Motorsports since 1993, since he was only twenty-two years old. And he's stepping down when he could still race at a very high level, at the still not particularly old age (in NASCAR years) of 44. For comparison, Kevin Harvick is 39, Dale Earnhardt, Jr. is 41, Matt Kenseth is 43, and Tony Stewart (who will retire after next year) is also 44. Dale Earnhardt, Sr. was 49 when he died from injuries sustained in a crash at the Daytona 500. Bill Elliott's final Daytona race was in 2012, when he was 56. And Richard Petty formally retired at the age of 55 in 1992.

Gordon seems like an old-timer now because he started at such an incredibly young age. His teammate Jimmie Johnson (who is 40, by the way) started in the Sprint Cup (Winston Cup and Nextel Cup, depending on the year) series in 2002 at the age of 26, when Gordon was just 30 but had been racing in the Sprint Cup full-time for almost a decade. And by then, Gordon had already won everything there was to win. Yet, when people talk about who the greatest NASCAR driver of all time is, Gordon is rarely given top billing.

It's an interesting thing, the "best ever" conversation. Gordon peaked very early. And when he peaked, he did so largely at the expense of Dale Earnhardt, Sr., one of the drivers routinely labeled as "the best ever" (along with Richard Petty). Gordon's numbers are probably never going to get him that "best ever" moniker, though. And that's partly because of his teammate Jimmie Johnson, no doubt (who may end up supplanting both the Intimidator and the King, before he finally steps down).

Yet, in my mind, Gordon is the best ever. For me, it's not even close. And it's not so much because of the races he won as it because of the races he ran, one after another, year after year. He's started 797 consecutive Sprint Cup races. His next closest still active competitors in this regard are Johnson and Ryan Newman, both with 504 consecutive starts. And he's been a factor in the great majority of those races. Gordon was even a factor yesterday, leading nine laps early on (to the unmitigated joy of the crowd) and ultimately finishing sixth.

But the competition he has been facing for the last decade or so is unparalleled in the history of NASCAR in my opinion. Drivers like Johnson, Harvick, Logano, the Busch brothers, Carl Edwards, and Denny Hamlin are all capable of winning every race they enter. And then there are a bunch of up-and-comers, as well. The Sprint Cup series has never been so competitive. The great irony here is obvious: Jeff Gordon made this happen.

Jeff Gordon changed the NASCAR landscape from the moment he jumped in a stock car in 1991. He was an outsider then, a California boy who grew up on open-wheeled racing. Many might have supposed his future would be in F1 or Indycar racing. But Gordon took to stock car racing like a fish to water, much to the consternation of a good chunk of the NASCAR crowd who had been born and raised on stock cars (including Earnhardt, Sr. and Petty, by the way). He was an instant hero for NASCAR novices (like me, really) and his youth and looks attracted new fans and new sponsors to NASCAR almost immediately.

Fast forward to the twenty-first century. NASCAR has become one of the most popular sports in the country. The IndyCar series is now but an afterthought, by and large. The drivers in NASCAR come from all over, are no longer products of just the South, and they are cultivated, scouted, and trained like other professional athletes. Every owner out there is constantly looking for the next Jeff Gordon. No longer do young drivers have to earn their rides by proving themselves for years and years in lower tier stock races. If they can win, they can drive. Next year, the 24 car will be driven by Chase Elliott, who is—wait for it—nineteen years old (and the son of Bill Elliott). Granted, there's the old South/traditional NASCAR connection there, but no nineteen-year-old would have ever gotten a top ride prior to the arrival of Jeff Gordon.

Then there's the money in NASCAR. Top drivers are earning millions. They're in national ad campaigns. They're stars wherever they go. The net worth of Hendrick Motorsports is in excess of $350 million. Joe Gibbs Racing is worth over $200 million. All told, the top teams are pulling down close to $1 billion in revenue every year. And that revenue is a consequence of the mass-market appeal NASCAR has achieved in the past couple of decades, something it would have never been able to do without Jeff Gordon.

Finally, there is the actual quality of the competition in the races themselves. Because of the money, because of the now wide-open range of drivers, races are more entertaining these days than they have ever been, especially given the fact that there are fewer fiery crashes (a good thing). The competition is so tight that the drivers are looking for an edge on every single lap. There are no more Bill Elliotts out there, able to drop to the bottom of the track and just pass the field like they were standing still. There can't be. And again, this is a change that Jeff Gordon wrought, probably to his own detriment in terms of wins.

Jeff Gordon is the best ever because there will never be another Jeff Gordon. There can't be. He changed NASCAR in so many ways (and yeah, I know there are some who don't like all of these changes) that his legacy is the current state of NASCAR, itself. He's why people started watching for the first time in the nineties, why they kept watching through yesterday. He's why NBC spent billions of dollars to secure the broadcasting rights for the Sprint Cup and he's why NASCAR teams are worth millions and millions of dollars. So, thanks Jeff. Good job.

Sunday, November 22, 2015

Rick-rolling ISIS: what could possibly go wrong?

The online semi-activist collective known as Anonymous is going to war against ISIS. It is doing so by taking down Twitter accounts used by ISIS members (supposedly) and by spamming the extremists on social media in order to undercut their use of hashtags and the like to communicate and recruit. And apparently the spam of choice is "Rickrolling," a somewhat older technique that originated on 4chan in 2007 or so. For a detailed history of the phenomenon, read this.

This kind of "action" isn't a new thing for Anonymous in the least, as it launched a similar attack against ISIS following the Charlie Hebdo shootings early this year, though perhaps on a lesser scale (how "big" this action will turn out to be remains an open question). Most of its other actions, however, have generally had specific and more limited targets. Waging an online "war" with a group of terrorists who are also operating as a real-world armed insurgency is something new for Anonymous.

Of course, it's a war without guns, bombs, tanks, or the like. It's a war that is supposedly taking place in cyberspace and will be nothing but trouble for ISIS, while not endangering anyone else, whatsoever.

That's the theory, anyway. And it's a war that has lots of people openly cheering for Anonymous, or at least grinning wickedly based on the assumption that ISIS is going to regret pissing off this particular internet collective. Maybe that's how things will go down, maybe this action by Anonymous will throw a huge monkey wrench into the online operations of ISIS and help limit recruitment and communication to the extent that ISIS suffers in a meaningful, tangible way, that it begins to fall apart. And if that is how things go down, I guess the free world would be in debt to the hacktivists of Anonymous.

Color me doubtful.

It's easy enough to take potshots at Anon, who they are, their extra-legal methodology, and their lack of meaningful successes on any sort of grand scale (along with their fair number of outright mistakes), but that's not really my issue with the group and this particular kind of action.

First and foremost, let me be clear about something: I get Anonymous, insofar as I understand that I don't really get Anonymous at all. I do understand that it's not an organized group, that it doesn't have an established hierarchy (it does have a practical hierarchy, however), that anyone might claim membership, but that no one can absolutely be said to be a member. Membership is wholly about self-identification. And I understand that actions taken under the Anonymous umbrella can vary greatly in nature and range, from takedowns of private entities just to show it can be done, to the specific targeting of individuals or groups because of their perceived unjust activities.

That's one of the problems when dealing with Anonymous: its movement from a subversive pain-in-the-ass to a vehicle for righting wrongs and seeking justice. It's easy enough to cheer the rebellious bad boy who stands up to "the man" when "the man" is acting badly, is taking advantage of others, but a bit harder to rationalize that same bad boy's actions when those actions are self-serving, destructive, and/or hurtful to others who have done nothing wrong. In fact, even actions that seem to be about justice can have collateral damage.

I guess the question becomes the same one that can be asked of government actions, of all sorts (military, economic, etc.): is the action justified, regardless of the potential for collateral damage?

The difference is, governments can at least pay lip service to the idea that they are acting with the authority of their citizenship and can—on occasion—even be held to account for the choices made in this regard by the people in charge, whether through a forced change of leadership (at the ballot box or otherwise) or through courts of justice, whether national or international.

Anonymous as a whole, by definition, is not subject to these sorts of controls. And since its membership is fluid, individual members who might run afoul of the authorities and end up in jail cannot even be said to represent the collective, itself.

This troubles me. The cheering for Anonymous that I witness on the 'net, on social media and messageboards, strikes me as cheering simply for potential deus ex machina moments. The theoretical action by Anon against the theoretical big bad meanie is an action that no real person actually takes; it is sloughed off as a moment of cosmic justice.

The problem is, this isn't the reality. Deus ex machina devices are roundly criticized—and rightfully so, in my opinion—in movies and novels for being cheap ways out of sticky and difficult problems, because they remove the actual responsibility for solving things from the actual characters in the story. The overused line from Spider-Man remains apropos: "with great power comes great responsibility." Anonymous has great power, it would seem, yet no responsibility. And even if we might be happy with some of the results that come from exercising that power, we shouldn't be happy with the inherent lack of responsibility, in my opinion. That's not something to cheer for, it's something to worry about.

Friday, November 20, 2015

The Euro-Weenies thirty years later: They is Us

Back in my college—or university—days, I was pretty hip. No seriously, stop laughing goddammit. Living down in South Florida, I partook of the nascent 80's counter-culturalism, which in Miami mostly consisted of going to dance clubs, being able to handle the drug lingo of the day, praising the awesome realism of Scarface, and mocking the banana-hammock wearing European tourists on Miami Beach. Politically, there really wasn't a hip side, at all. Though that little shed on the far corner of a Homestead Air Force Base runway was a bit contra political.

The Original May 1986 Cover of Rolling Stone
Still, I read the appropriate intellectual material for the moment:  The Wave (soon to become The Miami New Times) of course, The Village Voice, The Miami Herald (specifically for Bill Cosford and Dave Barry), Playboy, and of course Rolling Stone Magazine. And in the last, in May of 1986—almost thirty years ago—appeared a column by P.J. O'Rourke: "Among the Euro-Weenies." I laughed my ass off reading it.

In the piece, O'Rourke uses his struggle to get to Tripoli after the 1986 United States bombings of Libya as a canvas on which to paint a fairly heavy-handed takedown of Europe, Europeans, and European culture. Moving from Paris, to Bruges, to Berlin, to London, O'Rourke mocks almost everyone and everything he encounters. And he doesn't do it by proclaiming how much better America is. Far from it. Rather, his criticisms are specific, if somewhat hyperbolic.

For instance, he says the following about Paris and the French people:
The French are a smallish, monkey-looking bunch and not dressed any better, on average, than the citizens of Baltimore. True, you can sit outside in Paris and drink little cups of coffee, but why this is more stylish than sitting inside and drinking large glasses of whiskey I don't know.
About Belgium:
After I was kicked off the plane to Libya, I went to visit my friend in Brugge, the one who was under instructions from the police to be ashamed. We spent the weekend looking for fun in Belgium, which is an isometric exercise. That is, it's a strain and you get nowhere.
About London:
London is a quaint and beautiful city—if you stick to the double-decker tourist buses. But the CND offices were out in the East End, in the aptly named district of Shoreditch. Dr. Johnson said, "When a man is tired of London, he is tired of life." But these days, he might just be tired of shabby, sad crowds, low-income housing that looks worse than the weather, and tattoo-faced, spike-haired pea brains on the dole.
About Berlin:
West Berlin is the city that Iggy Pop once moved to because New York wasn't decadent enough for him. I was expecting maybe Cabaret or maybe Götterdämmerung performed by the cast of La Cage aux Folles. Forget it. We bombed the place flat in WWII, and they rebuilt it as a pretty good imitation of Minneapolis. 
And that's just a small sampling of O'Rourke's caustic observations. But as I re-read the piece today, I started to get an uneasy feeling in the pit of my stomach. Fuck. We've become the Euro-Weenies. Maybe not all of us, but enough of us to be sure. Our cities are imitations now of European ones. You can't throw a rock without hitting a sidewalk cafe in any major U.S. city and there are more people lining up in Starbucks on an hourly basis than there are sitting in the corner bar throughout an entire day.

And the shame game, shit. Try some pro-American talk in public or on a social media platform. The self-appointed shame police will be on you in seconds, like fleas on a dog. And rather than stamping on crime or anything—or anyone—else, we perpetually wring our hands with angst, worried about the "message" being sent, about how it will be "perceived." By who? Who knows? Who cares?

About the only thing differentiating Americans now from the Euro-Weenies of the eighties is smoking. For whatever reason, we've decided smoking is the bane of civilization and have largely succeeded in eliminating it everywhere, outside of golf courses and strip clubs, to be replaced by yoga pants and hair product.

And it's this culture of self-important, judgmental snobbishness that must now deal with acts of terrorism.

The Libya bombings in 1986, the ones that had Europe in a state of outrage over the cowboy tactics of the United States, those were in response to a German nightclub bombing that killed one American serviceman. And now, after a series of bombings and shootings in Paris, there is at least one dead American. Our response? Hand wringing angst over the possibility of anti-Muslim sentiment. The French response? Bombing the snot out of ISIS targets. Where is John Wayne from, again?

Of course, our own Weenies don't have the guts to call out the French for such a response. They are even more afflicted with navel-gazing than were their Euro-Weenie predecessors. How's that for a kick in the teeth? We can't even do the obnoxious, self-righteous world-weary citizen correctly. We're stuck looking inward, never outward.

Maybe that's the cost of being the last stop on the Weenie train, though I guess we might eventually have a turn mocking the Chinese. Some day.

The Princeton protests: a war on Naming

Another protest event at an American university is in the news. This time, it's at Princeton University, an Ivy League school whose history extends back to 1746, making it one of the oldest institutions of higher learning in the United States (there's a dimwitted argument still going on between the University of Pennsylvania and Princeton, with regard to which is older). And because of that, its students, faculty, and administrators have seen it all; the school has been there through good times and bad. Its alumni include Presidents, a First Lady, Supreme Court Justices, famous actors and writers, scientists responsible for ground-breaking research, and titans of industry and commerce.

Like all colleges, Princeton has seen its share of student-led protests. The latest one occurred just two days ago when a group of students filled Nassau Hall and the offices of the university president, refusing to leave until their demands were met. Yesterday, the student group and the university president—Christopher Eisgruber—reached an agreement in this regard and the protest came to a quick end. The list of demands:
WE DEMAND the university administration publicly acknowledge the racist legacy of Woodrow Wilson and how he impacted campus policy and culture. We also demand that steps be made to rename Wilson residential college, the Woodrow Wilson School of Public Policy and International Affairs, and any other building named after him. Furthermore, we would like the mural of Wilson to be removed from Wilcox Dining Hall. 
WE DEMAND cultural competency training for all staff and faculty. It was voted down on the grounds of trespassing freedom of speech last spring semester. We demand a public conversation, which will be student led and administration supported, on the true role of freedom of speech and freedom of intellectual thought in a way that does not reinforce anti-Blackness and xenophobia. We demand classes on the history of marginalized peoples (for example, courses in the Department for African American Studies) be added to the list of distribution requirements. Learning about marginalized groups, their cultures, and structures of privilege is just as important as any science or quantitative reasoning course. We propose that this requirement be incorporated into the Social Analysis requirement. 
WE DEMAND a cultural space on campus dedicated specifically to Black students, and that space can be within the Carl A. Fields Center but should be clearly marked. The naming of this space should be at the students' discretion in order to avoid naming it after a white benefactor or person with bigoted beliefs, as evidenced by the naming of Stanhope Hall.
The student group behind the demands is the Black Justice League (which sounds like a reference to Earth-23 of DC Comics' multiverse), which apparently formed in response to the events in Ferguson last year. From the group's "About" section on Facebook:
A coalition of students from Princeton University, standing in solidarity with Ferguson and dismantling racism on our campus.
But the above list of demands seems a far cry from the events in Ferguson, to put it mildly. After all, we're talking about an Ivy League school here, one with fairly high academic requirements for admission and with exceedingly high tuition costs, as well. One would think a group of intelligent, concerned activists could manage something a little more significant than renaming buildings and stripping away murals as grounds for a university takeover.

Maybe that's a little unfair, though. Forgetting the first demand, which is at its root simply aesthetics, what about the other two? The second contains this humdinger of a line:
Learning about marginalized groups, their cultures, and structures of privilege is just as important as any science or quantitative reasoning course.
That's the kind of claim that I think far too many people will simply nod their heads in agreement with, because they want to seem enlightened and aware, or at least not catch heat for questioning it. But at best, it's a subjective claim, because what's important to some is not necessarily what's important to others. At worst, it's just not true. The point of getting a higher education is, of course, to increase one's knowledge base and to better prepare one for the future. Knowing more about marginalized groups can be a good thing, but it's hardly a primary thing, as compared to knowing basic scientific principles and knowing how to use basic mathematics to solve problems.

Regardless, here is the Social Analysis requirement at Princeton. There is more than enough room under this umbrella to accommodate students interested in studying marginalized groups. Mandating the requirement necessarily limits the options of students not interested in such courses. What's the point? To achieve a moment for self-congratulatory back-slapping because such a change was successfully forced down the throat of the school? Brilliant. And people wonder why students in the United States are lagging behind students in other advanced nations when it comes to skill sets and professional readiness.

The third demand is, in my opinion, the worst of the three. It's essentially a demand for segregation, insofar as the student protesters expect the school to have a separate "cultural space" for Black students. It's a ridiculous demand, but one that will likely be filled by the administration, because it's easy to accomplish. And of course, to make sure that it can't be accused of playing favorites among "marginalized groups," Princeton is going to need "cultural spaces" for Asian students, Hispanic students, Native American students, and Pacific Islander students, not to mention ones for female students, homosexual students, transgender students, and of course dumbass students.

And note the importance attached to the naming of the space: it cannot be named after a white person or a person with bigoted beliefs (nevermind that pretty much everyone has bigoted beliefs in one way or another). Because apparently, naming is critical, naming is somehow definitional. In that regard, I am reminded of a tale about a Chinese emperor and the Confucian doctrine of the Rectification of Names. I've detailed it previously:
There's a great fable about a Chinese Emperor who was having problems with a particular river that kept flooding. The river was named "The Wild One." In order to combat the flooding problem, the Emperor had a most novel idea: he would rename the river "The Quiet One."

In the traditions of Chinese philosophy, this technique is a perversion of the Confucian doctrine, the Rectification of Names. Essentially, Confucius argued that it was of vital importance that names were "correct," that they carried the truth of what something was, when they were descriptive in any way (proper names obviously do not fit this bill). Thus, if a mountain were to be named "Long Mountain," it really should be long. And if an office were to be called "the office of bridge building," the office-holder really should be concerned with building bridges. Simple stuff, right?

The name-changing Chinese Emperor sought to "rectify" the thing, itself, by changing its name (instead of the other way around), a name that was properly descriptive.

Did it work? Of course not. The river's flooding was as bad as ever. Of course, we might speculate that travelers--looking at a map with a river called "the Quiet One" on it--felt more at ease during their journey...right up until the point that they drowned in a flash flood.
What's going on here is little different, from the demands to strip Wilson's name from the school, to the demand for changing requirements, to the demand for race-specific "cultural space." The students are ignorantly focused on changing labels as a means to change culture. I would humbly suggest that they all need to keep at their studies, because so far their intellectual prowess isn't particularly impressive, no matter how noble their goals might be (or might not be).

Thursday, November 19, 2015

Terrorism and international football (soccer)

The November 13th terrorist attacks in Paris included several bombs that went off in the vicinity of the Stade de France, the football (soccer for Americans, but I'll use "football" in the remainder of this piece) stadium that was, at the time, hosting an international friendly between France and Germany. As has been well documented, the explosions could be heard in the stadium and on the pitch, French President Francois Hollande was in attendance there, and several French players had family members directly affected by the attacks (the cousin of one was killed in one of the shootings, the sister of another escaped from the Bataclan theatre).

There is little doubt that the bombings at the Stade de France were planned to coincide with the match. It's possible that there was even a built-in assassination attempt on Hollande, as well. Obviously, sporting events are soft targets for terrorist acts and ones that involve other nations can have an added impact, as the likelihood of victims from other nations increases.

So perhaps it would be wise to look at the schedules for international football, the international breaks mandated by FIFA, and take added precautions against terrorist attacks on the appropriate dates. Part of the reason I say this is that a major terrorist attack occurred on the previous FIFA international break, as well: the Ankara bombing on October 10th, 2015 in Turkey. In that incident, over one hundred people died. While no group has officially claimed responsibility, links to ISIS have been uncovered. And while the bombings were clearly directed at people involved in a peace rally, it is nonetheless the case that there were international football matches on October 10th and the days that followed.

Interestingly, no moment of silence was mandated at these matches following the Ankara bombings, a point that was not lost on the people of Turkey and on some in the media. In contrast, moments of silence for the victims in Paris were observed at every international match on the following days. And the English Premier League intends to have a moment of silence for Paris prior to every game in the upcoming week's matches.

But I digress. Again, there is some congruity here, with regard to these last two major terrorist attacks. And regardless of whether or not the timing of the Ankara bombings was coincidental with regard to the international break, there is little doubt that the timing of the Paris attacks revolved around the match that day at the Stade de France. Moving forward, on the horizon are the 2016 Euros, the UEFA European Championships. And the host country for this event? That's right, it's France.

The Euros are scheduled to be held from June to July throughout France, with matches in Paris (including the championship match, of course), Bordeaux, Lens, Lille, Lyon, Marseille, Nice, Saint-Denis, Saint-Étienne, and Toulouse. While the Euros are not the World Cup, they are still a significant event for Europe and of course for France (and carry a major economic component for the last).

The FFF (French Football Federation)—which is tasked with organizing the actual matches in France—is acutely aware of what these attacks mean for the 2016 Euros, insofar as security will have to be stepped up another notch. And that's good. But the question must be asked: will it be enough?

Of course, the counter to this is that the Euros must go on, that postponing them, moving them, or cancelling them is giving in to terror, is giving the terrorists exactly what they want. And that's true. Still, a successful terrorist attack—because the Stade de France bombings were actually not successful—at a major European football match is going to have serious consequences, ones that we can as of now only guess at. So before the Euros start, France and FIFA better be pretty damn sure that they have all of their bases covered.

Islamophobia and the titans of virtue in the media

It's now almost a week removed from the terrorist attacks in Paris on Friday, November 13th, and the media is still rife with coverage of the same, with details of follow-up investigations, with statements from political leaders across the globe, and with pieces discussing what the response will be, might be, and/or should be from France and other nations. But there are also many opinion pieces—and talking heads—concerned not so much with the attacks or what to do about them as with worrying about how the attacks are creating a "rising tide of Islamophobia."

My language choice is no accident here. Do a Google search for the specific phrase in quotes, "rising tide of Islamophobia," and see what you get. There are thousands of hits, dating back to 2002. Take off the quotes and the number goes up almost tenfold. Various events across the last decade and a half have always been greeted by these kinds of proclamations. And by and large, the proclamations aren't coming from Muslims; they're coming from the self-appointed custodians of Justice who populate the media these days.

If there really had been all of these rising tides of Islamophobia in the United States, Europe, and elsewhere, we'd be underwater by now. The fact of the matter is, such a thing has never actually materialized in the past and it's unlikely to materialize now. This isn't to say there were no incidents directed at Muslims in the past, after various terror-events or the like, that there wasn't a noticeable increase in some anti-Islam expressions or the like. There most certainly was. And there most certainly will be again. And that's because people will often act without thinking, will lash out at convenient targets when they are upset and angry. What else is new? It's going to happen here, and then it's going to subside. Like it always does.

Babu from Seinfeld, Source: www.reddit.com
But there is a cottage industry in the media built around "tsk-tsking" everyone else. Its practitioners are less worried about Muslims per se than they are about presenting themselves as superior, more thoughtful people as compared to the majority of the hoi polloi out there. And unfortunately, there's a large audience for the screeds these sorts of people produce, made up of the same sorts of people, by and large, ones whose self-image is predicated on having someone to be better than, morally and ethically.

Discussions on the Syrian refugee issue have already summoned comparisons to Nazi Germany from this same crowd, along with the typical finger-wagging mockery of anyone who is on the wrong side of the issue (the current wrong side being the idea that it's okay to limit, in any way, the influx of Syrian refugees). Slightly more sensible—at least on this issue—voices in the midst of these titans of justice at least recognize that such mockery and such extreme comparisons probably aren't a real good idea, politically speaking. From the usually imbecilic Kevin Drum at Mother Jones:
Mocking it is the worst thing we could do. It validates all the worst stereotypes about liberals that we put political correctness ahead of national security. It doesn't matter if that's right or wrong. Ordinary people see the refugees as a common sense thing to be concerned about. We shouldn't respond by essentially calling them idiots. That way lies electoral disaster.
Note however that Drum is not in any way allowing that there is any actual validity to the concerns over the refugees. It's just something "ordinary people" are concerned with, people who lack his and his cohorts' intelligence and moral certainty. Because after all, Drum and company are not "ordinary people," they're special. From the same piece:
Mocking Republicans over this—as liberals spent much of yesterday doing on my Twitter stream—seems absurdly out of touch to a lot of people. Not just wingnut tea partiers, either, but plenty of ordinary centrists too.
Note the implication: mocking Republicans over other things is okay, as long as such mocking plays well with the "ordinary people," as long as there's a good way to spin the mockery, to essentially get away with it.

And at its root, this is exactly the modus operandi for the crowd in the media looking to tsk-tsk everyone else. It's a contrived exercise that begins with an issue where a supposed moral high ground can be easily summited, one that even ordinary people can clearly see and—once properly instructed—will mount as well, or at least won't voice any disagreement (even when afflicted with some amount of uncertainty).

The claims about an increase in Islamophobia fit neatly into this same paradigm. After all, what right-thinking person could possibly object to someone pointing out this potential situation? What right-thinking person wouldn't be concerned? The reality is unimportant. Whether or not there is a rising tide of Islamophobia is inconsequential (there wasn't in the past and there is no evidence that there is now). All that matters, again, is being above others, is being able to criticize and mock freely, based on an assumed certainty of one's moral superiority. And that's what we're going to get, by and large, from the know-it-alls in the media going forward. Yippee.

Monday, November 16, 2015

The intellectual dishonesty of the media: Beinart and Chait on Rubio

During the Democratic Debate on Saturday night, John Dickerson had the following exchange with Hillary Clinton:
Dickerson: Marco Rubio, also running for president, said that this attack showed-- in-- the attack in Paris showed that we are at war with radical Islam. Do you agree with that characterization, radical Islam?

Clinton: I don't think we're at war with Islam. I don't think we at war with all Muslims. I think we're at war with jihadists who have--

Dickerson: Just to interrupt, he-- he didn't say all Muslims. He just said radical Islam. Is that a phrase you don't--

Clinton: I-- I think that you can-- you can talk about Islamists who-- clearly are also jihadists. But I think it's-- it-- it's not particularly helpful to make the case that-- Senator Sanders was just making that I agree with that we've gotta reach out to Muslim countries. We've gotta have them be part of our coalition.

If they hear people running for-- president who basically shortcut it to say we are somehow against Islam-- that was one of the real contributions-- despite all the other problems that George W. Bush made after 9/11 when he basically said after going to a mosque in Washington, "We are not at war with Islam or Muslims. We are at war with violent extremism. We are at war with people who use their religion for purposes of power and oppression." And yes, we are at war with those people that I don't want us to be painting with too broad a brush.
This is some ridiculous parsing of word choice by Clinton. Talking about "jihadists" is okay, talking about "Islamists" is okay. So is "violent extremism." But "radical Islam" (which everyone with a clue understands to mean the ideology justifying terrorism), somehow that's not okay, it's "painting with too broad a brush." I'd bet money that if Rubio had said "jihadists" instead, Clinton would still have said pretty much the same thing as above, tried to characterize the term as too broad, and likely have allowed that "radical Islam" would have been better, along with Islamists and violent extremism. Because this is a game for her: whatever terminology is employed by people on the Right is what she will take issue with, what she will criticize. And why? To paint the Right with too broad a brush, to imply that it is—as a whole—populated by xenophobes and bigots who cannot separate the wheat from the chaff, who think all Muslims are terrorists as a matter of course.

Such a tactic plays well with the far Left of course; this argument in one form or another has been a primary feature of its ideology for decades now, as it allows adherents to clap themselves on the back for not being the card-carrying racists/bigots/xenophobes that they portray their opponents to be. And it is hardly limited to politicos, as members of the media like Peter Beinart and Jonathan Chait happily play the same game. Writing about Rubio's recent comments in a Facebook video, both Beinart and Chait take Rubio to task for what he said therein. Here is Beinart:
The linguistic weirdness continues a couple of lines later. “This is not a geopolitical issue where they want to conquer territory and it’s two countries fighting against each other,” Rubio declared. “They literally want to overthrow our society and replace it with their radical, Sunni Islamic view of the future. This is not a grievance-based conflict. This is a clash of civilizations.” Notice that Rubio never explicitly defines who “they” are. According to the French government, the Islamic State perpetrated Friday’s attacks. Rubio, however, said what occurred in Paris is a “clash of civilizations"... 
The most straightforward way to interpret Rubio’s statement, therefore, is that the civilizational “they” that attacked Paris is Islam.
Beinart is arguing that Rubio doesn't define the "they" about whom he is speaking and that therefore Rubio must be speaking about Islam in general. Chait plays the exact same game in his article when he writes the following:
And Rubio has rushed out a new video in which he vaguely demands a “clash of civilizations.” Rubio plays it a bit coy, repeatedly describing the conflict as “them” and “us,” without specifying who is them and who is us.
See? The exact same argument about the exact same statement from Rubio (one has to wonder if Chait and Beinart hashed this all out together over drinks).

But here is the statement from Rubio in full that Beinart and Chait are using as the basis of their argument (my boldface):
The attacks in Paris are a wake-up call. A wake-up call to the fact that what we're involved in now is a civilizational conflict with radical Islam. This is not a geopolitical conflict where they want to conquer territory and it's two countries fighting against each other. They literally want to overthrow our society and replace it with their radical Sunni Islamic view of the future.

This is not a grievance-based conflict. This is a clash of civilizations. For they do not hate us because we have military assets in the Middle East. They hate us because of our values. They hate us because young girls here go to school. They hate us because women drive. They hate us because we have freedom of speech, because we have diversity in our religious beliefs. They hate us because we’re a tolerant society.
So what the fuck are Beinart and Chait smoking? Their overt dishonesty here is as transparent as Clinton's. They both claim Rubio never defines the "they," yet Rubio clearly does exactly that, in the second sentence of his statement. The "they" is "radical Islam." It is defined from the beginning, yet neither Beinart nor Chait notes this. Beinart, for his part, quotes the entirety of Rubio's statement—in snippets—in the course of his article, except for the second sentence. He never mentions it, obviously because any literate person would immediately see that Beinart's entire argument is built on a lie. Chait doesn't even bother to quote any of the statement. He just declares that Rubio never specifies a "them," no doubt trusting that his (Chait's) spoon-fed readership will just accept the claim as accurate.

And these two clowns are supposed to be the cream of the crop in media-land. They're supposed to be savvy and intelligent. Well, I guess maybe they are. But what they're not is honest...

Sunday, November 15, 2015

Cause and effect in the extremist world

Over one hundred and twenty-five people were killed last Friday night in Paris—a count that may still rise, as many people remain critically injured—in a series of coordinated terrorist attacks for which ISIS has now claimed responsibility. Whether or not the last is true is open to debate, of course, but there is little doubt that the attackers were Islamists. Passports found on two of the dead terrorists indicate they came from Syria and Egypt, respectively. And witnesses to the attacks claim that at least one of the terrorists spoke of revenge for Syria and another shouted the now-typical slogan of such people, "Allahu Akbar."

Almost immediately after the event, as many grieved and mourned the loss of innocent life and others simply tried to process the tragedy, the conversation about how the government of France should respond began, as is always the case after these kinds of incidents. Obviously, there are a lot of people calling for a response against ISIS, a military response. And such calls breed a predictable counter: attacks against ISIS, against Middle-Eastern or other Muslim countries only serve to breed more extremists, more terrorists.

It would seem to be an interesting conundrum: how can there be an effective response, one that doesn't lead to more people adopting extremist points of view? Indeed, even economic-style responses—like sanctions—can be viewed through such a prism: all they do is cause common people pain, thus leading to anger, desperation, and extremism. But it's only a conundrum if one accepts as true the premise: that reprisals breed more terrorists (to put it as simply as possible). And really, the premise is itself founded on an assumption, with regard to the way the world—reality, even—works: everything that happens is caused by something else.

Now, a discussion on theories of causality would be pretty heady stuff, when it's approached from a purely scientific standpoint. Philosophically it wouldn't be any easier. But for a taste of the last, consider an example: a kettle full of cold water is placed on a stove and the heat is turned on; after a time, the water begins to boil, steam escapes through the hole in the kettle's lid and produces a whistling sound. What caused the kettle to whistle? There is a series of events here that some might say represent a causal chain. At a molecular level the number of events is huge. But at a more superficial level, we can identify a fair number of events:
  1. Water from a water source is pumped through pipes leading to a sink near the stove.
  2. Someone holds a kettle under the water tap in the sink.
  3. Someone turns the faucet.
  4. A valve in the pipe opens, allowing the water flow to continue.
  5. The water exits the tap and enters the kettle.
  6. Someone places the kettle on the stovetop.
  7. Electricity (assuming an electric stove) is produced at a local power plant.
  8. Some of that electricity travels through power lines to reach the stove.
  9. Someone flips a switch on the stove.
  10. Electricity is released into a metal coil beneath the stove top.
  11. The coil's temperature increases.
  12. The temperature of the cooktop increases.
  13. The temperature of the kettle increases.
  14. The temperature of the water inside the kettle increases.
  15. The water begins to boil, liquid turns to gas.
  16. The gas (steam) escapes from the kettle through a small hole.
  17. The volume of steam in the kettle increases faster than it can escape, increasing the pressure.
  18. The kettle whistles.
That's eighteen distinct events, with plenty of other ones not noted between some of the above and prior to numbers one, two, seven, and eight (hell, the building of the stove is a necessary event, as is the evolution of life into tool-using primates capable of building the stove). And that's just to describe what caused a kettle to whistle.

Of course, one might say that a number of these events are largely inconsequential to the notion of cause, to describe what caused the kettle to whistle. For many of the events can be described as conditions, as opposed to causes. Within such a paradigm, what matters, what qualifies as causal, are those things necessary and sufficient to produce the whistling: boiling water that becomes steam. That is what caused the kettle to whistle. And we can say this primarily because the events occurred near each other, in both time and space, because the cause preceded the effect (a point surprisingly missed more often than one might think), and because we know the events are tied together.

David Hume, in his A Treatise of Human Nature, carefully spells out the "Rules by Which to Judge of Causes and Effects" (part III, section XV):
Since therefore it is possible for all objects to become causes or effects to each other, it may be proper to fix some general rules, by which we may know when they really are so. 
(1) The cause and effect must be contiguous in space and time.
(2) The cause must be prior to the effect.
(3) There must be a constant union betwixt the cause and effect. It is chiefly this quality, that constitutes the relation.
(4) The same cause always produces the same effect, and the same effect never arises but from the same cause. This principle we derive from experience, and is the source of most of our philosophical reasonings. For when by any clear experiment we have discovered the causes or effects of any phaenomenon, we immediately extend our observation to every phenomenon of the same kind, without waiting for that constant repetition, from which the first idea of this relation is derived.
(5) There is another principle, which hangs upon this, viz. that where several different objects produce the same effect, it must be by means of some quality, which we discover to be common amongst them. For as like effects imply like causes, we must always ascribe the causation to the circumstance, wherein we discover the resemblance.
(6) The following principle is founded on the same reason. The difference in the effects of two resembling objects must proceed from that particular, in which they differ. For as like causes always produce like effects, when in any instance we find our expectation to be disappointed, we must conclude that this irregularity proceeds from some difference in the causes.
(7) When any object encreases or diminishes with the encrease or diminution of its cause, it is to be regarded as a compounded effect, derived from the union of the several different effects, which arise from the several different parts of the cause. The absence or presence of one part of the cause is here supposed to be always attended with the absence or presence of a proportionable part of the effect. This constant conjunction sufficiently proves, that the one part is the cause of the other. We must, however, beware not to draw such a conclusion from a few experiments. A certain degree of heat gives pleasure; if you diminish that heat, the pleasure diminishes; but it does not follow, that if you augment it beyond a certain degree, the pleasure will likewise augment; for we find that it degenerates into pain.
(8) The eighth and last rule I shall take notice of is, that an object, which exists for any time in its full perfection without any effect, is not the sole cause of that effect, but requires to be assisted by some other principle, which may forward its influence and operation. For as like effects necessarily follow from like causes, and in a contiguous time and place, their separation for a moment shews, that these causes are not compleat ones.
The first three are the ones we already noted as being necessary and sufficient in order to posit a cause and effect in our kettle example. And the eighth is something of an acknowledgement that there are always other actions or events that necessarily precede the cause being considered, like all of the other events that preceded the production of steam in our example. The fifth, sixth, and seventh rules are less about defining what constitutes a cause than they are about understanding the way things work. But the fourth rule, well that's pure empiricism from the father of modern empiricism. Hume is noting that what makes something a cause of something else is observation, not just of the event itself, but of the same or similar events over time. Without this repeatability, a supposed cause is just that: supposition. It's merely an observation that a given event followed another and an assumption that there must be a link between the two events.

Consider this: I heat a kettle of water on my stove. Just as the water begins to boil, my doorbell rings. Did the boiling water cause the doorbell to ring? Barring some kind of ingenious steam-powered doorbell, probably not. I can say that of course because I know how the kettle works, I know what actions I took to make boiling water. And—more importantly—I've boiled water before and the doorbell didn't ring the other times. Nor did it ring when my mother boiled water when I was younger. But suppose it's the very first time I have ever used a kettle and a stove to heat water. Suppose I really don't know how anything works. I might conclude that maybe the boiling water did cause the doorbell to ring. And frankly, that's fair in the moment.

But—and here's the critical thing—it's a conclusion that can be and should be reevaluated as more data becomes available: I take the water off the stove but the doorbell rings again; much later—when I'm not even in the kitchen—the doorbell rings; the next day I heat water on the stove and the doorbell doesn't ring, etc. And in contrast to all of this, I might also notice that the kettle only whistles when it is full of water and has been heated for a time; it never whistles when it is empty, nor when it is not on the stove, nor when it is unheated. So my continued observations, perhaps coupled with a greater understanding of how things work, lead me to conclude that heating water in the kettle does not cause the doorbell to ring, but it does cause the kettle to whistle.
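This constant-conjunction reasoning can even be sketched as a toy program. (The event names and the observation data below are invented purely for illustration; this is a sketch of Hume's fourth rule as described above, not a serious model of causation.)

```python
# Toy illustration of Hume's fourth rule: judge a candidate cause by
# whether it is constantly conjoined with the effect across repeated
# observations, not by a single co-occurrence.

def constantly_conjoined(observations, cause, effect):
    """True only if the effect occurs whenever the cause does,
    and never occurs without it, across all observations."""
    return all(obs.get(effect, False) == obs.get(cause, False)
               for obs in observations)

# Each observation records which events occurred together.
observations = [
    {"heated_kettle": True,  "whistle": True,  "doorbell": True},
    {"heated_kettle": True,  "whistle": True,  "doorbell": False},
    {"heated_kettle": False, "whistle": False, "doorbell": True},
]

print(constantly_conjoined(observations, "heated_kettle", "whistle"))   # True
print(constantly_conjoined(observations, "heated_kettle", "doorbell"))  # False
```

With only the first observation in hand, heating the kettle and the doorbell look equally "causal"; it is the second and third observations that break the conjunction for the doorbell while preserving it for the whistle—which is exactly the point about reevaluating conclusions as more data becomes available.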

All of this cause and effect talk has so far been limited to inanimate objects, things which by their nature don't make choices. But people are apt to view human events through the basic idea of cause and effect, as well. We talk about the causes of the Great Depression, of World War II, of the Fall of the Iron Curtain, of great historical events, but also of more mundane, day-to-day things, like the causes of a drop in the Dow, of traffic jams, and of rising crime rates. Thus, cause and effect is an ever-present concept (which Hume understood as well) with regard to human nature. It is how we process the world, how we determine a course of action for ourselves or for larger organizations (if we happen to have authority in that regard), because we have certain expectations with regard to outcomes, based on choices made.

And hence the warning from people who fear that reprisals against terrorist organizations like ISIS will breed more extremism, will cause people to become terrorists. So it is only fair to ask the question: is this true? Do the actions taken by Western nations—of both military and economic natures—against other nations or groups cause people in those nations to become terrorists? Are there people who would have otherwise tried to simply lead their lives but who suddenly decide to take up arms (or bomb vests) because of some action taken by a nation like France, the United States, or England?

With respect to Hume's rules (which admittedly do not account for the issue of choice in the least), this postulation can be seen as consistent with the first three. The impact of a theoretical reprisal would be localized for those who would become terrorists, it would happen first, and the relationship between the reprisal and the decision of the individual is easily identifiable and understood. Indeed, as was the case in Paris, terrorists sometimes specifically point to a previous action by a nation as the reason for their action.

But what about the fourth rule? Is it true? Allowing that reprisal-type actions can be followed by more people becoming terrorists, is this always the case?

The problem with making such a determination is the issue of choice. Some decisions/actions by governments or other large organizations (like multinational corporations or the Catholic Church) can, I think, be labeled causes or consequences because the decisions/actions potentially impact all people within the groups and limit or otherwise influence their choices. For instance, the ACA has had obvious consequences; it can fairly be said to be the cause of many of them, regardless of whether they are deemed positive or negative.

But while a reprisal-type action may impact a large group, perhaps even everyone within another nation—economic sanctions might fit this bill—a limiting of choices so severe that people will necessarily become extremists is just not a given. Arguing that this is the case is flawed first and foremost because the choice is rooted in emotion, not reason or practicality. And it was an available choice as a matter of course prior to the given reprisal-type action.

But I think more importantly, the whole "this will cause people to turn to terrorism" argument is fundamentally one that treats people as less than people, in their own right. It assumes that many people have limited choices not because of their situation but because of their very nature. It assumes that such people will become terrorists, that their emotional response and subsequent actions are not just predictable but are foregone conclusions.

And why? Not because of the horror of the reprisal-type action. The 9/11 attacks didn't turn a chunk of the New York City population into terrorists. And the attacks in Paris are unlikely to turn a chunk of the Parisian population into terrorists, either. The dirty little secret about people who make these kinds of arguments: they assume the "others"—the peoples of the non-western world—can only react to events, do not possess the ability to actually make their own individual choices. Their responses are conditioned, whether emotional or not, and therefore predictable. Thus, the "attacking them will only breed more terrorists" argument is ultimately grounded in a belief that these other "less civilized" people are less capable and shouldn't be treated as people in their own right, insofar as they should not be held totally responsible for the choices they make. Really, the argument assumes they should be treated more like children.

People making the argument think they are being compassionate and thoughtful (I would argue that many make the argument specifically because they want others to see them as compassionate and thoughtful) when they are, paradoxically, robbing these other people of their humanity by treating them differently, by assuming they are steam that just can't help but try to escape from the kettle when the heat is turned up. But they're not steam. They're people who get to make their own choices and have to live with those choices, just like everyone else.

Wednesday, November 11, 2015

Everything is bullying, except of course actual bullying

A lot has been going on at the University of Missouri of late. Student protests over unaddressed racism on the campus led to the resignations of the school's System President and its chancellor, which was followed by the appointment of an interim vice chancellor who is tasked with addressing the apparently significant problems with racism on the campus.

Then came the clash with "the media." Students on the campus had formed some kind of encampment and a freelance photographer working for ESPN—one Tim Tai, who is a student at the school, himself—was headed into the area to snap some pictures. The students there decided the area was a "safe space" and refused to allow Tai into the area, forming an impromptu wall to block him. They then berated him to leave, held their hands up to prevent him from snapping pictures, and at various points in time pushed him back. All the while, the involved students and what appears to be several faculty members as well made self-righteous and phony claims about Tai pushing them, touching them. It was an ugly scene.


There's a word for the students involved in this event: bullies. Because the word for their behavior is: bullying. There's no way around it. They used their numbers to take advantage of someone who was by himself, they sought to intimidate him, they belittled him, they mocked him, they eventually resorted to physical force and pushed him back. And through it all, he tried vainly to just do his job.

But looking around on the stories about all of this, pitifully few are using the "bullying" angle. Look at this thoughtful piece by Terrell Jermaine Starr at WaPo. From it:
Certainly, Tai – like any journalist – had a legal right to enter the space, given that it was in a public area. But that shouldn’t be the end of this story. We in the media have something important to learn from this unfortunate exchange. The protesters had a legitimate gripe: The black community distrusts the news media because it has failed to cover black pain fairly.
Allowing that he has a fair point, with regard to the black community and the media, that point doesn't justify bullying behavior in my opinion, especially given that the vast majority of the bullies were obviously not members of the black community (though maybe I missed the Rachel Dolezals in the group). Starr would do well to note this usurpation of the cause by non-members. But he doesn't. He speaks throughout the piece about black students and their establishment of this "safe space," yet by and large, they are not the ones who instigated this incident. And again, he fails to note it for what it is: bullying.

And that is ultimately surprising to me, given how many stories about bullies and bullying behavior there have been across the last several years, how almost any behavior one could imagine was so classified. Yet when given an example of obvious bullying, people in the media seem blithely unaware of what is actually transpiring in an incident they are covering, are writing about. Given an opportunity to identify bullying, they collectively balked.

Tuesday, November 10, 2015

The rise of China: is the broken clock finally right?

Since the 1950's, there has been something of a cottage industry for many so-called experts on the global economy—economists, political scientists, and historians—who periodically warn the United States and "the West" about the "rising threat" of some country to the world economic order.

In the 1950's and early 1960's, this rising threat was the now-defunct Soviet Union. Supposedly, the Soviet economy was rolling along like a tank, with massive, record-setting yearly economic growth, and it was poised to soon surpass the United States in virtually every economic indicator. Many of the experts here even traveled to the U.S.S.R. to witness the Soviet model firsthand, and they returned with stories of Soviet greatness, amazing tales of production and efficiency, and of worker contentment. Listening to these experts, one could hear the bell tolling for the United States of America and one could understand the calls to convert the U.S. to the Soviet model, to institute central economic planning and full-scale socialism, a command economy as it were.

The problem was, the Soviet model was all smoke and mirrors. It wasn't real. The numbers were all artificial, were either produced by inflating the real numbers or just manufactured out of thin air. Sure, there was some real growth, but it wasn't nearly as impressive as many seemed to believe. After the Wall fell and there was some access to real records from behind the Iron Curtain, this became readily apparent. Factories, for instance, had been operating under manufacturing quotas since the 1920's, and these quotas were often met by simply producing defective or even inoperative equipment. Tractors from Soviet factories were shipped to farms throughout the land. And many of these tractors worked for a while—a week, a month, maybe a year—then broke down and were never repaired (no one was building spare parts). Many never worked from the get-go and just sat unused and rusting in barns and fields.

Of course, with this fabulous new equipment, farm workers were required to increase their production as well. And they did: by working themselves half to death, by shipping off all of their foodstuffs to the point where local rural communities faced starvation, or by just lying about their production and hoping that they didn't get caught. This was not a new situation for the Soviets, as it had been happening since before World War II. The difference was that they had simply become better at the propaganda game and were more successful at limiting access. Yet to this day, there are still so-called scholars who trumpet the success of the Soviets in the 1950's, who argue that while there may have been some number fudging going on, the Soviet economy of the period can still rightly be called a marvel. Nonsense.

When the Soviet economy began to crumble in the 1970's because the Soviet leadership could no longer sustain the illusion, because it was steadily crippling itself with the arms race (thank goodness for President Carter, who helped the Soviets maintain the illusion a while longer), the Soviet model ceased to be the up and coming economy that would challenge the U.S. That mantle soon fell on Japan, whose massive manufacturing-based export economy (largely at the expense of the U.S. taxpayer, to be sure) was growing by leaps and bounds.

Anyone who was alive in the 1970's remembers the flood of Japanese imports: Datsuns, Toyotas, and Hondas on the road, all manner of Japanese electronics in the home. These were great times for Japanese companies and, to be fair, for American consumers. Japan had the third largest economy in the world, going by GDP, in the 1970's and into the 1980's, when—despite a momentary slump—Japan's economic fortunes continued to grow because of the computer industry. And throughout this period, many of the experts were once again pushing an "end of U.S. dominance" narrative, even predicting that Japan would overtake the U.S. economy at some point in the near future.

The economic success of Japan was largely attributed to the Japanese economic model, with respect to its cultural component: the work ethic of the people, the accountability of executives, etc. Indeed, movies like Gung Ho (Ron Howard's 1986 comedy starring Michael Keaton) were based on these assumptions and suggested that the U.S. should model itself after Japan if it wanted to save itself. But the end of the 1980's saw the bursting of the Japanese bubble, as artificially high real estate prices in Japan collapsed, robbing the markets there of investment monies, causing the Japanese yen to tumble against the dollar, and wiping away trillions in savings that the Japanese people had accumulated over decades. This led to the so-called "lost decade" of the 1990's for Japan and talk of its economic dominance dwindled.

But this gap—that of the needed economic superpower to challenge the United States—was soon filled, as the experts in the 1990's looked to the Asian Tigers and then the EU. But the Asian Tigers turned out to be paper tigers and the EU, well it is a formidable economy, but it has been and will always be something of a conglomerate, prone to infighting and subject to shocks resulting from problems within its member-states (i.e. Greece, Italy, etc.). Moreover, with respect to the EU's prospects for global dominance, it remains habitually limited by the demographics of its member-states, many of which are aging and simply unable to grow significantly, points which again escaped the experts touting the EU.

Source: http://www.the-european.eu/
Since the 21st century began, there has been just one contender to speak of, one nation that has the hearts of the current experts all aflutter, has them projecting an end to the U.S. dominance once and for all. That nation is, of course, China. And there is no doubt, China's economy is huge. And it is still largely controlled by the state, thus returning us full circle to arguments for some amount of a command economy as a necessary component for the U.S., if it has any hope of staving off the rise of China.

Much is made of China's purchasing of U.S. debt as well, though more often than not the people worried about this don't really understand the issue, mistakenly believing China holds U.S. debt that it could "call in" at any moment and thereby cripple the U.S. economy. And recently, China has been selling off some of these holdings (which by the way shows the error of the above thinking: China must be able to find a buyer to divest itself of U.S. debt), causing some consternation, though the issue is really not all that significant, in my opinion. But it does point to the current interdependence between the U.S. and China, even as the latter preys on the fears of the U.S. public, plays games with its currency, and postures militarily on the international stage.

Lawrence Summers, from high up in his perch in the ivory tower, imagines that China's rise cannot be contained, that the U.S. must find a way to deal with the behemoth that is the Chinese economy, especially when that economy is going through difficult periods. And—like the experts of the 1950's—he knows this because he has visited China in person, has been shown its capabilities and desires by the Chinese leadership, raising the question: do we learn nothing from history?

China's economy is a pirate economy, a mercantilist economy, built on a single commodity: cheap labor. And it's not going to last in its current state. Its leadership is forced to use every available policy angle, from tariffs to currency manipulation, to maintain a steady growth rate and to hold back inflation, even as a small percentage of industrialists in China accumulate vast wealth on paper, wealth that will either flee China or risk being wiped away. Must the United States deal with China in the present? Absolutely. And the way to do that is to stop treating China with kid gloves, stop allowing a double standard when it comes to trade, intellectual property, and currency manipulation. Oddly enough, Donald Trump is the only presidential candidate willing to speak the plain truth here:
China’s economy is controlled by the government. Any notion that their economy is based on a free-market system is simply not true. If an American company wants access to the Chinese consumers, that company must share its intellectual property, a condition that violates international fair-trade standards, World Trade Organization rules and common sense.  
But the worst of China’s sins is not its theft of intellectual property. It is the wanton manipulation of China’s currency, robbing Americans of billions of dollars of capital and millions of jobs.
He's right. There is much to fear from China, but that is because we suffer these kinds of transgressions from China willingly, rather than demanding that China play by the same rules as everyone else. China's rise is fundamentally like that of the Soviets in the 1950's. It's an illusion, sustained only by the gullibility of the rest of the world. True enough, there are resources in China aplenty (as there were in the Soviet Union), there is a huge potential market, and there is real growth. But there is also gamesmanship by the Chinese government that sustains the illusion of China's greatness while simultaneously limiting its real potential and the prosperity of its people.

So no, the broken clock is still not right. We are not on the cusp of a new era under a new economic superpower. Unless, of course, we're dumb enough to go along with the games.