Friday, December 27, 2013

The Rape of Africa continues unabated, Part I

As frequent readers of this blog might know, I make it a point to stay abreast of what is happening in many African nations, mostly because I feel too little attention is paid by much of the media and the public in this regard. What follows here is the first of a three-part series on the state of Africa--mostly sub-Saharan--in general.
Several days ago, I wrote about the continuing conflicts in both the Republic of the Sudan and the Republic of South Sudan. The latter--South Sudan--achieved political independence just a few years ago, in 2011. That independence came as a result of a peace agreement reached in 2005 to end a lengthy civil war in the Sudan between government forces and the SPLA--Sudan People's Liberation Army (an org I explored at length in the above piece)--that had spanned over twenty years.

The agreement granted the southern regions of the Sudan political autonomy for six years, to be followed by a referendum on the issue of independence. The vote proved to be one-sided in the extreme, as over 98% of all votes cast were in favor of independence for the south. And so the Republic of South Sudan was born, even while many large-scale and small-scale conflicts continued within the region and within the Sudan as a whole.

Many people believed--and still believe--that this was a critical event, however, not just for South Sudan but for Africa as a whole, for it represented the often-cited Right of Self-Determination that all peoples supposedly have, at least in the minds of some. Within an African nation like the Sudan, this was particularly noteworthy, given the history of European colonialism on the continent and the resulting arbitrary boundaries of the various countries within.

At the same time, there is the underlying issue of natural resources. The Sudan--the pre-2011 version of the nation--is home to significant oil reserves. Unfortunately for the Republic of the Sudan, over three quarters of these reserves lie in the south, in what is now South Sudan. Needless to say, there are those who believe that the treaty ending the civil war in the Sudan--which led to the creation of the southern republic--was engineered by outside forces in order to get easy access to that oil, along with other resources in the south like uranium and gold.

The fringier types attribute this all to a Rothschilds/Jewish kind of conspiracy, an idea seemingly buttressed by Israeli support for the southern nation. Others simply chalk it up to business as usual for the developed world, where Africa still represents a more or less free-for-all grab-bag of natural resources, where war--especially civil war--is a means to an end in this regard (along with being an excellent way to maintain a market in military equipment and services).

To this latter point, over the past fifty or so years, so-called "developing" nations in Africa have incurred a great deal of debt (to put it mildly), mostly to solidify successive regimes' hold on power. In the case of the Sudan, prior to the 2011 split, it had amassed a foreign debt (real, actual debt from loans floated to the government) of over $35 billion. Since the split, the debt has increased substantially (with at least another $10 billion for the northern republic and even more for the south). Much of it is owed directly to the Paris Club (which may very well write most of it off down the road), the rest to other private concerns.

The total foreign debt for both nations (they have yet to reach an agreement on how to divvy up the pre-2011 debt) stands at around 85% of their combined GDP. Now, when compared to the debt-to-GDP ratio for many "first world" nations, this may not seem terribly problematic. The difference is, these developing nations have limited--severely limited--tools available to them for paying even the debt service on a year-to-year basis. Moreover, much of the GDP in nations like both the Republic of the Sudan and South Sudan is simply not remaining "in house," as it were. This kind of debt is a perpetual burden for these nations, especially given the near-continuous conflicts occurring in both, as their governments are simply unable to make consistent payments to creditors (which means the debt goes up, even without new loans).
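To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The $35 billion and $10 billion figures come from the text above; the southern republic's new debt is a hypothetical placeholder (the text says only "even more for the south"), and the combined GDP is back-calculated from the roughly 85% ratio rather than drawn from any data source.

```python
# Back-of-the-envelope debt-to-GDP sketch for the two Sudans.
# Figures in billions of US dollars; the south's new debt and the
# implied GDP are illustrative assumptions, not reported data.

pre_split_debt = 35.0   # pre-2011 Sudan foreign debt (from the text)
new_north_debt = 10.0   # added by the northern republic (from the text)
new_south_debt = 12.0   # hypothetical placeholder ("even more for the south")

total_debt = pre_split_debt + new_north_debt + new_south_debt

# The text puts total debt at ~85% of combined GDP, so the GDP
# implied by that ratio is:
combined_gdp = total_debt / 0.85

print(f"Total foreign debt: ${total_debt:.0f}B")
print(f"Implied combined GDP: ${combined_gdp:.0f}B")
print(f"Debt-to-GDP ratio: {total_debt / combined_gdp:.0%}")
```

The point of the exercise is not the exact numbers but the scale: even a modest annual interest rate on a debt stock this size consumes a meaningful share of output for economies with few tools to generate foreign exchange.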

And again, the monies loaned were often spent not on vital infrastructure or services (which could lead to real economic growth) but on military supplies and the like. It is, on the whole, an ugly and depressing pattern and not limited to the two Sudans in the least.

Despite all of this, despite the continued violence within both Sudans, there has been actual economic growth in these nations--particularly the north--since around 2005. But is such growth sustainable? Look at the GDP over time for the Republic of the Sudan, the Republic of South Sudan, and of the sub-Saharan nations bordering both:

At first glance, this chart seems to represent real progress, as apart from Eritrea and the CAR there are--at the very least--upward trends in GDP for these nations, significant ones for the Sudan, Ethiopia, and Kenya. However, look at the following chart, which indicates the flow of aid--in constant US dollars--to these same nations:

That's an awful lot of money, year in and year out, going to these nations, dating back to the post-World War II years. Total it up and there is more than enough--in most cases--to achieve significant improvements in things like infrastructure and standards of living.

And then look at this chart, which shows net inflows (new investment inflows less disinvestment) of foreign monies in constant US dollars (there is no data available for South Sudan here):

Since 2000, that's a lot of money going in, especially when we factor in the profit-taking that has been occurring.

What all this indicates is that the supposed growth in GDP for these nations is something of a mirage. Taking out the monies spent on militaries, on humanitarian aid, and those introduced by private concerns in order to access resources like oil leaves these nations with little to show for all of the dollars that have poured into them since the end of the Cold War. And as we know, there remain widespread violence, human rights violations, and rampant corruption in all of them.

This is not evidence of growing economies or of rising standards of living, in my opinion. Rather, it is evidence of an ongoing mercantilist approach to this portion of Africa. An analysis of other sub-Saharan states would look similar, with a few exceptions (most notably South Africa).

Yet, Africa is the birthplace of mankind, of society, of civilization in general. Why is it that--as a continent--Africa appears to consistently trail the remainder of the world in economic development, in standards of living, and in so many other metrics, from education to health? These questions we will take up in Part II of this series.

Cheers, all.

Monday, December 23, 2013

Climate change discussions: no heretics allowed!

I enjoy a good discussion. Ask anybody. And I appreciate people who are passionate about their point of view, can articulate it clearly, and can defend it with strong arguments and actual evidence. When such discussions involve political issues or contentious issues, things can of course get very heated. I don't have a problem with that; in fact, I actually look forward to it. Because nothing is better for separating the wheat from the chaff than a little emotion, a little display of anger.

One such issue that leads to serious heat--no pun intended--is that of climate change, of global warming, or of whatever-the-term-du-jour-is for this phenomenon. I've opined previously on climate change, both with regard to "black swans" and the supposed "anti-intellectualism" of those who question the conclusions of some (most?) experts in the field. From the first:
What needs to be understood in the context of Climate Change and Black Swans is that the realm of unpredictability is not limited to the natural climate alone. It extends to the consequences of the solutions offered by those predicting future cataclysms. Such solutions--intended to "fix" the problem--can have unpredictable consequences for both the climate and mankind as a whole. What are these consequences? We don't know. That's the point.

Experts tell us that a failure to act, to make massive changes to the nature of global society, will have disastrous consequences because of what their models tell them about the future climate, but they tell us nothing about what the results will be from following such a course of action. Setting aside their consistent failure to get the future right when it comes to such forecasting and their inability to explain these failures, what they never explore is the potential impact of trying to manage--from on high--an open, complex, and largely organic system: the world's economy, the total civilization of mankind.
From the second:
Any disagreement with the self-proclaimed experts can earn one the "climate change denier" label. And there's not much to do about that. So what happens? People that recognize the dangers of assuming man can predict the future of the global climate and can control that future really have no toe-hold in the debate. They may know that "hoax" is an improper characterization, but may also recognize that it's the best they can hope for, with regard to some measure of the population.

The question is, does that feed anti-intellectualism? I guess the best way to answer that is with a comparison. Consider the minimum wage. Anyone that understands economics--even on a basic level, even according to a flawed traditional paradigm--knows that increasing the minimum wage will have one demonstrable effect: it will tend to decrease employment (raise the unemployment rate). There's really no argument to be had there. Even Paul Krugman knows this. Yet, politicians on the left (Democrats) have no problem championing the idea of increasing the minimum wage and dismissing the one tangible consequence as not true, unproven, or not absolute. Why? Because it helps with their overall agenda.
I want to make my own position crystal clear, here: I do not "deny" the idea of climate change, at all. I take it as a given that climate change is occurring (and always has been, and always will). And because mankind is an active part of the world's ecological system, mankind's activities impact the climate, impact its future. I'm all for studying this, of understanding--to the best degree possible--what the consequences are for various activities engaged in by man, with respect to the eco-system in general and the climate in particular. So when it comes to the science of climate change, I'm okay with it being pursued, I think it should be pursued.

That said, I do not accept any absolutes in this regard, I do not accept the idea--now hammered home on an almost daily basis--that the science of climate change is "settled." It's not. It can't be, because the future is not settled and the systems in question here--the global climate and the world's eco-system--are open, complex systems. So when Climate Scientist A predicts the Arctic will be ice-free by 2013, I don't accept this as "settled science." I understand it for what it is: a prediction based on assumptions that may or may not be completely accurate.

Sunday, December 22, 2013

As Christmas approaches, Sudan still in flames

There is little peace in the Sudan. The so-called War in Darfur which began in 2003 has never really ended. At the same time, other conflicts continue in the southern regions of the Republic of the Sudan, crossing over into the Republic of South Sudan (which became an independent state in 2011). Only a few days ago, a UN compound in South Sudan was overrun by gunmen--reportedly from the Nuer--resulting in the deaths of local civilians and foreign peacekeepers, alike.

Located in Bor--right in the middle of South Sudan--the UN base was being used to shelter people displaced by violence elsewhere in the nation. At the same time, violence and unrest has continued in Juba, the nation's capital, located to the south of Bor. The ethnic lines of battle in these regions are very clear: the Dinka versus the Nuer. The nation's army, the Sudan People's Liberation Army (SPLA), is dominated by the Dinka, having been founded by members of the same--and other ethnic groups from the north--way back in 1983 (then a rebel army that used mostly guerrilla tactics). When the South achieved independence, the great majority of it was under the direct control of SPLA leaders. As such, the SPLA became fully legitimate as the force responsible for keeping the peace in South Sudan.

One of the many steps it has taken in this regard is the disarming of other groups and civilians. But the Nuer have always resisted such disarmament, fearing--quite rightly, in my opinion--that this would effectively lead to their own elimination and/or subjugation by the Dinka. Back in 2006, elements of the Nuer community formed their own rebel group, the Nuer White Army, to resist the SPLA. This group was put down very harshly, mostly by punishing the Nuer community as a whole, by the SPLA. And now, with this latest outburst of violence from the Nuer, the SPLA has responded in a horribly violent yet entirely predictable fashion:
Since clashes broke out in Juba on Sunday fighting has occurred in half of the country's ten states. Victims and witnesses told the New York-based monitor, Human Rights Watch, that government soldiers and police have been interrogating people on the street in Juba about their ethnicity and deliberately shooting ethnic Nuer.
This is Christmas in the Republic of the South Sudan.

Things in the northern nation, the Republic of the Sudan, are not much better. Aside from the continuing violence in Darfur, there is a new hotbed of violence and war: the Nuba Mountains in South Kordofan, just north of the border from the southern Republic. Here, the Sudanese Army is attempting to eliminate--or at least control--the Sudan People's Liberation Movement--North. This is the very same org that created the SPLA and achieved independence for the South. When that happened, the remnants of the group in the north continued on.

Saturday, December 21, 2013

Top Five All-Time Greatest Christmas Songs

Yes, I know exactly what opinions are like. And yes, these kinds of lists are a dime a dozen, with one being no more authoritative than the next. Nonetheless, here are the five best songs for the Christmas season, in my not so humble opinion:
5. Do They Know it's Christmas--Band Aid
In 1984, musicians Midge Ure (of momentary Thin Lizzy fame) and Irishman Bob Geldof (from The Boomtown Rats) were determined to help with the famine then taking place in Ethiopia. To that end, they founded Band Aid, a kind of supergroup that attracted the support and participation of a number of high-profile musicians from the era, from Phil Collins, to Boy George, to Bono. Ure and Geldof composed a song, which was then recorded by this supergroup and released on November 28th, 1984. Here it is:

4. O Come, O Come, Emmanuel--Traditional Choir 
A traditional Christian hymn whose origins extend back to at least the 12th century (maybe much earlier), it was originally written and performed in Latin (Veni, veni, Emmanuel). The modern version was authored by John Mason Neale and Henry Sloane Coffin in the middle of the 19th century. Deeply spiritual, the beautiful music that accompanies the hymn is unmistakable from its very first bars. No Christmas is complete without this classic. Enjoy:

3. White Christmas--Bing Crosby 
Written by Irving Berlin in 1940--accounts vary with regard to the exact date and place--this song became an instant classic. It is easily the most covered holiday song of all time, as every artist who puts out a holiday album tends to include their own version, and it is the best-selling single ever released, having sold some 50 million copies (100 million if all cover versions released as singles are included). And it also happens to be my wife's favorite Christmas song. Sit back and remember fondly Christmases gone by:

2. Silent Night--Vienna Boys Choir
Many people will probably be surprised to learn just how long this classic song has been around. The original was composed by Franz Xaver Gruber (melody) and Joseph Mohr (lyrics) in 1818 not far from Salzburg, Austria. Originally in German (as Stille Nacht, heilige Nacht), the song was translated into English by John Freeman Young in 1859. But I've opted for the original version, performed here by the Vienna Boys Choir in 1967:

1. I Believe in Father Christmas--Greg Lake 
For as long as I can remember, this has been my favorite Christmas song. Written by Greg Lake and Peter Sinfield in 1974, the song was released by Lake that same year as a single, though he was still an active member of Emerson, Lake, and Palmer. Eventually, ELP included the song on their Works Volume 2 album. The song is at once joyful and remorseful; it reminds us of both the true meaning of Christmas and of innocence lost. The haunting acoustic introduction stays with us long after the song is over, as do the words from the song's second-to-last verse:

I wish you a hopeful Christmas
I wish you a brave New Year
All anguish pain and sadness
Leave your heart and let your road be clear
Merry Christmas and happy holidays to all.

Ducks, chicks, and the speed of Time on the internet

The Duck Dynasty "controversy" remains in full speed ahead mode. People and pundits on the Right are continuing to stir up outrage over the treatment of Duck Dynasty patriarch Phil Robertson by the A&E network (which airs the show). They are trying very hard to frame this all as some sort of First Amendment issue, suggesting--nay, insisting--that Robertson's right to express his views is being trampled in the name of "political correctness" or the like. And in response, people and pundits on the Left are arguing--quite rightly--that A&E is fully within its rights to take the actions it has taken, that Robertson is not being denied his right to express his views and opinions, at all.

I've already addressed this at length in a previous piece. I noted that politicians and pundits on the Right who are trying to make hay with this issue probably know they're being inconsistent, if not dishonest, with their arguments. For their hypocrisy is easy to demonstrate. My overall conclusions:
Most likely, Robertson and Duck Dynasty will not suffer much from any of this. Hell, ratings will probably increase, given how much people like controversy, manufactured or otherwise. And those on the Right who leaped to Robertson's defense will probably get away with their hypocrisy, by and large. But see, that's the problem: this was an opportunity to demonstrate that conservatism proper can be separated from such ignorance, an opportunity to actually better the image, as opposed to just maintaining the status quo. And it was missed. Like far too many other ones have been.
Really, I didn't think there'd be any reason for me to revisit this issue. Jindal, Cruz, Palin, and others were steadily making bigger and bigger asses of themselves. All the Left needed to do was to keep feeding them rope; there'd have been a hanging soon enough. But I guess I gave the Left too much credit, or at least failed to appreciate just how stupid social media makes people. Because the current popular response from the Left--on Facebook, Google+, and elsewhere--is the creation, sharing, and "liking" of posters comparing the Duck Dynasty situation to that of the Dixie Chicks. A sample from Google+:

Generally, the posters come with some sort of dig at the Right for their flip-flopping here. And frankly, in and of itself, that's completely fair. The problem is, most of the people distributing and sharing these posters were probably part of the "outrage train" back in 2003/2004, were railing against the treatment of the Dixie Chicks after Natalie Maines opened her mouth and inserted her foot. To put it another way, these posters serve to highlight the hypocrisy of both groups involved in these dust-ups.

When the Dixie Chicks were taking it on the chin back in those days, when stores were choosing to not sell their albums, when radio stations were refusing to play their music, people and pundits on the Left were beside themselves with anger, were insisting that the Dixie Chicks' right to express their views was being trampled. And as far as I know, the people making these arguments never relented. So in effect, the message from these posters should be "the Right is as wrong now as the Left was then."

Friday, December 20, 2013

Obamacare officially declared a "hardship" by HHS

Last night, yet another "rule change" came down, with regard to the implementation of the Patient Protection and Affordable Care Act, i.e. Obamacare. This latest change was undertaken at the behest of a group of Democratic Senators in order--in theory--to help those people who have seen their plans cancelled and are faced with only more expensive options through the exchanges. According to the law, the great majority of Americans are required to have health insurance coverage by the New Year, with a deadline of December 23rd for enrolling through the exchanges if they are not yet covered. This is the much-vaunted (and much-criticized) individual mandate that is the very basis of Obamacare, the thing that allows it to work.

But that mandate is becoming more and more meaningless. The Administration had already made numerous changes to the law, relaxed standards and extended deadlines, in the face of harsh criticism from the public since the big rollout of the website. This latest move, however, may effectively lead to the collapse of Obamacare, for the Administration--through HHS--is essentially allowing that the law itself, Obamacare, constitutes an actual hardship that can be used as an excuse to avoid the individual mandate. Let's delve into the specifics. This is the memo sent out by HHS defining the rule change. I quote:
The Affordable Care Act provides many new consumer protections. In some instances, health insurance issuers in the individual and small group markets are cancelling policies that do not include the new protections for policy or plan years beginning in 2014. Because some consumers were finding other coverage options to be more expensive than their cancelled plans or policies, President Obama announced a transition period allowing for the renewal of cancelled plans and policies between January 1 and October 1, 2014, under certain circumstances. Some states have adopted the transitional policy, enabling health insurance issuers to renew their existing plans and policies. Some health insurance issuers are not renewing cancelled plans or policies.

To ensure that consumers whose policies are being cancelled are able to keep affordable health insurance coverage, we are reminding consumers in the individual market of the many options already available to them, and we are clarifying another option for consumers in the individual market...

If you have been notified that your individual market policy will not be renewed, you will be eligible for a hardship exemption and will be able to enroll in catastrophic coverage. If you believe that the plan options available in the Marketplace in your area are more expensive than your cancelled health insurance policy, you will be eligible for catastrophic coverage if it is available in your area. In order to purchase this catastrophic coverage, you need to complete a hardship exemption form, and indicate that your current health insurance policy is being cancelled and you consider other available policies unaffordable. You will then need to submit the following items to an issuer offering catastrophic coverage in your area: (1) the hardship exemption form; and (2) supporting documentation indicating that your previous policy was cancelled.
Follow that? If you had your policy cancelled by an insurance company--for whatever reason--you can get a hardship exemption. Period. All you need to do is fill out the paperwork and send it in with a copy of the cancellation letter. You don't need to demonstrate that the available options are more expensive than your old policy or anything like that. And you're not actually required to buy the "catastrophic coverage" at all, but are simply allowed to do so if you wish, once the hardship exemption has been rubber stamped. There's no mechanism beyond this, so in effect people who receive the hardship exemption under this rule change can simply not carry any health insurance. Of course, if they get sick, they can immediately go out and buy coverage.

Thursday, December 19, 2013

Duck crap: when pandering overrides common sense

For those unaware of what Duck Dynasty is, it's one of those pseudo-reality TV shows that happens to air on the A&E network. It has become very, very popular. The show centers on a family in Louisiana--the Robertsons--who made it big manufacturing duck calls and other hunting equipment. In essence, it's a kind of "look at these hicks who have more money than they should" show. In the way of a full disclaimer, I have to admit that I have never watched this show, and I never will.

As a matter of course, "reality television" holds no interest for me whatsoever. I've never watched Jersey Shore, Real Housewives, the Ozzy Osbourne show, the Kardashian one, or any of the others. They are--all of them--fundamentally dishonest and stupid, in my opinion. The popularity of this stuff is without a doubt one of the biggest indicators of our steadily decaying culture. That said, if someone wants to watch these shows--and clearly many do--who am I to judge? But viewers should be honest about this. What they are watching is no different than professional wrestling. All of these shows are staged, all are planned to provoke various emotions from viewers. What makes them different from pro wrestling is that the primary emotion sought is, without a doubt, schadenfreude. The networks broadcasting these shows want viewers to feel superior to the cast members, a point probably lost on a number of cast members in various shows, though certainly not on all.

So what's going on in Duck Dynasty that has so many people worked up? Well, the family patriarch, one Phil Robertson, gave an interview to GQ for its January issue, an interview now online. In it, Robertson says some stupid--to put it mildly--things. Like this, for instance:
“It seems like, to me, a vagina—as a man—would be more desirable than a man’s anus. That’s just me. I’m just thinking: There’s more there! She’s got more to offer. I mean, come on, dudes! You know what I’m saying? But hey, sin: It’s not logical, my man. It’s just not logical.”
And this (in response to the question "What is sinful?"):
“Start with homosexual behavior and just morph out from there. Bestiality, sleeping around with this woman and that woman and that woman and those men,” he says. Then he paraphrases Corinthians: “Don’t be deceived. Neither the adulterers, the idolaters, the male prostitutes, the homosexual offenders, the greedy, the drunkards, the slanderers, the swindlers—they won’t inherit the kingdom of God. Don’t deceive yourself. It’s not right.”
And also this:
“I never, with my eyes, saw the mistreatment of any black person. Not once. Where we lived was all farmers. The blacks worked for the farmers. I hoed cotton with them. I’m with the blacks, because we’re white trash. We’re going across the field.... They’re singing and happy. I never heard one of them, one black person, say, ‘I tell you what: These doggone white people’—not a word!... Pre-entitlement, pre-welfare, you say: Were they happy? They were godly; they were happy; no one was singing the blues.”
Needless to say, all of these comments have created something of a backlash, enough of one that A&E has decided to suspend Phil Robertson from the show indefinitely:
In a statement, A+E Networks said, "We are extremely disappointed to have read Phil Robertson's comments in GQ, which are based on his own personal beliefs and are not reflected in the series 'Duck Dynasty.' His personal views in no way reflect those of A+E Networks, who have always been strong supporters and champions of the LGBT community. The networks has placed Phil under hiatus from filming indefinitely."

Deep Purple: okay, now it's personal

This year's inductees--as performers--into the self-proclaimed Rock and Roll Hall of Fame are as follows:
Peter Gabriel
Daryl Hall and John Oates (Hall and Oates)
Kiss
Nirvana
Linda Ronstadt
Cat Stevens
As it happens, this list includes my all-time favorite female singer, Ms. Linda Ronstadt. And she is most assuredly a deserving choice, in my humble opinion. As to the other choices, let me just say this: blah.

I get why each was nominated, I really do. Hall and Oates were a hit-machine in the late 70's and early 80's (interestingly enough, their first single--"She's Gone"--peaked at number sixty in 1974; I think it's their best, by far). Peter Gabriel is a serious talent, who had substantial success both as a member of Genesis and as a solo artist (but let's get real: he was better within Genesis than without). Nirvana, despite releasing only three studio albums, were huge in the moment, launched grunge rock into the stratosphere, and opened the minds of record company execs to the possibility of commercial success with alternative rock. Cat Stevens is a brilliant song-writer, whose socially conscious--yet very successful--work has earned him a place among the all time greats in this arena (how much of a "rock and roller" he is, though, is a very valid question).

Then there's Kiss. Kiss, Kiss, Kiss. I remember Kiss from back in the day. I had friends who were members of the "Kiss Army." I remember their big hits, from "Beth" to "Rock And Roll All Nite" ("Detroit Rock City" is my personal fave). But what I remember most about Kiss is seeing those four solo albums--using each member's face in full make-up as a cover--sitting in the 99 cents bin at Woolworth's for what seemed like practically forever. Because what Kiss really was--more than anything else--was an exercise in marketing. Every possible angle--at the time--was taken in this regard, all in an effort to win the support mostly of twelve to fifteen year-old boys. Tee-shirts, special clubs, other paraphernalia, limitless albums, goofy movies (Kiss Meets the Phantom? Really?), whatever it took to make money and make Kiss appear to be bigger and more significant than they ever really were.

Don't get me wrong, Kiss is not some terrible band at all. But neither are they on the same level as what I would call the truly great bands and performers in Rock and Roll history. To be fair, though, the whole "alternate persona" deal and elaborate stage set-ups were novel things, even groundbreaking to some extent, so there is definitely an argument to be had in favor of Kiss being inducted into the Hall of Fame. Just as there is for the other inductees in this year's class. But there are also very strong arguments for not including any of them in the Rock and Roll Hall of Fame (with the obvious exception of Linda Ronstadt). And I'm going to give the best argument possible in this regard: none of these acts deserves to be in the Hall of Fame because Deep Purple is not in the Hall of Fame.

For a while now--years actually--Kiss fans have been lamenting the lack of the band's inclusion in the Hall, supposing that this was somehow a purposeful slight on the part of the selection committee. And you know what? It probably was. But so what? Again, we're not talking about a band whose music still dominates on classic rock radio stations (really, it never did). Take a look at this list of acts who are not in the Rock and Roll Hall of Fame (sorted by years eligible). Kiss went through 14 years of eligibility before being selected. Yet, bands like Steppenwolf, the Moody Blues, and Jethro Tull have all been eligible for more than twenty years. How can anyone--who has a clue--complain that Kiss has been singled out and treated badly here? Are they bigger, more significant to rock history than the three above? How about the Steve Miller Band? Or Chicago? Really, how can Hall and Oates get in when Chicago can't?

But the biggest one of all, the actual, real, honest-to-goodness snub in the Rock and Roll Hall of Fame is Deep Purple. Who cares if Kiss, Peter Gabriel, or Cat Stevens is in or out as long as Deep Purple remains on the outside looking in? Led Zeppelin is in (1995). So is Black Sabbath (2006). Deep Purple is one of the most influential hard rock bands of all time. Heavy metal was more or less defined as a genre in full when Deep Purple released Machine Head in March of 1972.

It's tough now to understand just how significant this album was. Initially, only one single--"Never Before"--was released from the album, yet the album remained on the charts for over two years. Why? Because of the other tracks on the album, particularly "Smoke on the Water" (eventually released as a single). Most were just too long, in the minds of the record company execs, to work as singles. But the songs were burning up FM radio, heavily requested and played for years and years. "Smoke on the Water" went on to become one of the most-played songs on FM rock stations in the decade. It became so well known that it--along with "Stairway to Heaven"--gained a reputation as the first song one had to learn on the guitar. People who cannot otherwise play the instrument often know the first chords of "Smoke on the Water."

Wednesday, December 18, 2013

The Coburn Wastebook--2013 Edition--and why no one cares

Every year, for the past four years anyway, Senator Tom Coburn (a Republican from Oklahoma) compiles and releases a report he calls "Wastebook." Essentially, this report highlights one hundred examples of terribly wasteful spending by the Federal Government. Coburn pulls no punches, either. Party tags don't matter a whit in his report. He even points to specific acts of Congress and the Administration that contribute to or even directly create this waste.

Coburn's Wastebook tends to get very little fanfare upon its release. There are but a few stories on it. Right now, a Google news search of "Coburn" and "Wastebook" yields fewer than one hundred recent stories. A full Google search for the same--with no time limit--yields a measly 17,000 hits, many of which are just shares of the above news stories. Really, the Wastebook is practically ignored, much like the various efforts of Congresspersons like Senator Coburn and Senator Jeff Flake (R-Arizona) to curb waste and earmarks on an almost daily basis. But before going further here, let's first take a hard look at this year's Wastebook.

The stories out there on the Wastebook cite some specifics from the report to provide a feel for the stuff Coburn is talking about--and I am going to do the same--but I encourage everyone to actually read the whole report, cover to cover. It's about 130 pages, minus the footnotes (Coburn has cites for everything). Why? Because the variety is significant; it's important to understand that this problem--government waste--is truly endemic and spans practically all government agencies as a matter of course.

Now that you're done reading, here are some of my favs:
16. Money-Losing Sugar Loans Leave Taxpayers With Bitter Taste–(USDA) $171.5 million

When Americans borrow money from banks, they are usually also required to pay them back with money. When U.S. sugar producers borrow money from the taxpayer, however, they can pay it back with sugar.

It’s all part of a convoluted, money-losing scheme to sweeten sugar producers’ bottom lines – known as the U.S. Sugar Program.

In 2013 alone, the government lost $171.5 million because sugar companies could not pay back the government for money it borrowed.

The 2008 Farm Bill created the Feedstock Flexibility Program (FFP) to increase the use of ethanol and biofuels. Under this program, the government is required, in times of surplus, to buy sugar from processors and to re-sell the sugar to ethanol plants. Since the 2012-2013 sugar harvest season is the first to yield a surplus, taxpayers are witnessing the program’s wastefulness for the first time. In August, the first use of the FFP, the USDA bought only 7,118 tons out of 100,000 tons of sugar offered for resale.

USDA then sold this sugar to an ethanol maker at a $2.7 million loss.

In its second purchase, USDA paid $65.9 million for 136,026 tons of sugar, and then sold it to ethanol makers for $12.6 million--a $53.3 million loss. Facing a global surplus of sugar for the foreseeable future, the Congressional Budget Office forecasted the FFP to cost taxpayers at least $239 million over the next ten years.
So how does the USDA end up with this surplus sugar to sell at a loss to ethanol makers? Because the USDA has lent over $1.2 billion to sugar processors in exchange for sugar in collateral. In a move to protect U.S. sugar processors from foreign competition, USDA disperses loans for price support to ensure that U.S. domestic sugar prices are higher than the global markets and that Big Sugar keeps bringing in big profits. In the 2012-2013 season, 20,000 sugar farmers received $1.7 billion in net gains.

Instead of repaying the USDA with cash from their profits, U.S. sugar processors and producers are actually defaulting on their loans and forfeiting the sugar they put up as collateral. American Crystal Sugar, the leading American sugar processor with 15.1% of the market, defaulted on its loan of $71.2 million, which is one-fifth of the government loans held by sugar processors who may also default. While defaulting on a loan has serious financial consequences for the American taxpayer, American Crystal’s President and CEO, David Berg, called it “beneficial to [American Crystal’s] financial health,” and “the way the sugar program is intended to work.”
American Crystal is not the only sugar processor not paying its bill to American taxpayers. Earlier in September 2013, USDA accepted 85,000 tons of sugar as payment for a loan due in August. Although USDA swapped the sugar for import credits, the government had to swallow the $34.6 million cost of the loan. As of September 30, 2013, 20 percent of the USDA loans, over $233 million, to U.S. sugar processors were outstanding.
Got that? Instead of paying back the loans--which these companies could easily do--they intentionally default on them, which allows them to "forfeit" their collateral: actual sugar. And the government accepts this collateral at a high value, then sells it for a huge loss, which of course means the collateral was intentionally hugely overvalued. It's crazy. And we--the taxpayers--are on the hook for all of it.
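The dollar figures in the report excerpt above hang together, and it's worth seeing just how badly the math works out per ton. Here's a quick sanity check of the numbers as cited (the figures are taken straight from the excerpt; the per-ton loss is my own derived number, not one stated in the report):

```python
# Sanity-check the Wastebook's sugar-program figures
# (dollar amounts in millions, as cited in the report excerpt).
purchase_price = 65.9   # what USDA paid for 136,026 tons of surplus sugar
resale_price = 12.6     # what ethanol makers paid USDA for that same sugar
loss = purchase_price - resale_price
print(f"Loss on second purchase: ${loss:.1f} million")  # $53.3 million, matching the report

# Derived: how big the haircut is on every single ton moved
tons = 136_026
loss_per_ton = loss * 1_000_000 / tons
print(f"Loss per ton: ${loss_per_ton:,.0f}")  # roughly $392 per ton
```

In other words, the taxpayer ate about 80 cents of every dollar the USDA spent on that second purchase.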

Tuesday, December 17, 2013

The EPA: a wretched hive of scum and villainy?

Maybe that's a little harsh. Maybe. Or maybe it's not harsh enough. Let's look at some of the facts.

Just recently, EPA front man for Global Warming and imaginary secret agent John C. Beale returned to the news cycle, as he is due to be sentenced today for stealing some $900,000 of taxpayer funds over the course of several years. Beale had woven a grandiose tale wherein he was supposedly performing clandestine work for the CIA and using his EPA position as cover (methinks he watched Confessions of a Dangerous Mind a few times too many). He flew all over the world--first class, of course--and stayed at five-star hotels when he was supposed to be doing work for the EPA. And if he wasn't out of town, he pretended to be and spent days at a time relaxing at home while e-mailing greetings to his colleagues from far away places like Pakistan. His fraud was so great that there was a period of at least a year and a half where Beale admits that he did "absolutely no work."

The EPA's Office of the Inspector General opened an investigation into Beale in February and the lead investigator, Assistant Inspector General Patrick Sullivan, began to quickly and easily uncover the fraud perpetrated by Beale. The current head of the EPA, Gina McCarthy, is desperately trying to get credit for the takedown of Beale, with her press secretary noting that the fraud was "uncovered" by McCarthy first. That's a hoot. For Beale was reporting directly to McCarthy--when she was the EPA's Assistant Administrator--throughout the period when he perpetrated his fraud.

Understand that what we're talking about here is a Federal agency with its own budget. Beale worked for the EPA; that's who paid him. Yet he took trips supposedly in service to the CIA, trips that were funded in full (even when they didn't happen) by the EPA. Someone had to sign off on this and that someone was McCarthy. Frankly, this just isn't done without some inter-agency meetings and massive amounts of paperwork. Indeed, we now know that as early as 2010, red flags were going up in the EPA's HR department, red flags that McCarthy obviously ignored. Maybe she was taken in by Beale's lies, maybe she's just naive, but either way she's incompetent at the very least. Yet she now sits at the head of the EPA.

But then again, look who she replaced: Lisa Jackson, or should I say "Richard Windsor"? Remember that story? Ms. Jackson had established a secondary e-mail account under the above name in order to communicate with certain other EPA colleagues. Such a move was strictly a no-no, according to government regulations. Amazingly, her alias actually won an ethics award at the EPA, an event eerily reminiscent of the Captain Tuttle storyline from M*A*S*H. And as an Obama appointee, Ms. Jackson kinda blew the whole "most transparent Administration in history" tagline out of the water. She ultimately resigned as the above investigation started to gain traction, but claimed it was for unrelated reasons. Yeah, sure.

NSA metadata program likely violates Fourth Amendment

That is the key finding in Judge Richard Leon's 68-page ruling on Klayman v. Obama. Judge Leon did not, however, take any action in this regard, as he expects the government to appeal the ruling to the D.C. Court of Appeals (and from there, the case is likely to go to the Supreme Court, whether or not Leon's ruling is upheld). Nonetheless, critics of the NSA program--on the Left and Right--are thrilled with the ruling; they see it as a big win by and large, despite the fact that the program will continue unabated, at least for a while.

Really, Judge Leon's opinion here was a study in judicial restraint, something that should rightly earn him a great deal of praise. He ruled only on what was directly before him, limited himself to the Fourth Amendment issue, and ultimately found that the plaintiff's case was so strong in this regard as to make it unnecessary to go beyond it. For make no mistake, there are still a number of other arguments as to why the NSA has overstepped its authority. But with regard to the Fourth, Judge Leon concludes with the following:
For reasons I have already discussed at length, I find that plaintiffs have a very significant expectation of privacy in an aggregated collection of their telephone metadata covering the last five years, and the NSA’s Bulk Telephony Metadata Program significantly intrudes on that expectation. Whether the program violates the Fourth Amendment will therefore turn on "the nature and immediacy of the government's concerns and the efficacy of the [search] in meeting them."
He goes on to note that the government failed to demonstrate how the NSA program is either effective or necessary as a means of meeting such concerns, that there is no evidence to show the collection of metadata has had significant benefits for NSA operations in the least.

In terms of these specifics alone, it's a good ruling, a very strong ruling. Again, Judge Leon shows significant restraint here. He ruled only on what appear to be obvious violations of the Fourth, granted an injunction that would put a halt to the NSA program, but stayed the order pending the obvious appeal. It's excellent judicial work, a point driven home by the extent of the analysis in his filed opinion. For while he rules only on a limited basis, Judge Leon delves deep into the specifics of the case and establishes a number of other bases for future legal actions and decisions that will likely be touched on again in the Court of Appeals or the Supreme Court as this case moves forward.

The Three that will always Be

Three days ago on December 14th, 2013, Peter O'Toole passed away at the age of 81. One of the true giants in a world of giants, O'Toole is still best remembered for one role: that of T.E. Lawrence in the 1962 epic masterpiece Lawrence of Arabia.

It is not enough to call this film a classic; it is far more than that, in my not-so-humble opinion. It is a big, beautiful, and bold work of art that does more than just tell a story. It captures a moment of history, reveals how tragic and lonely the human condition can be, and beyond and above everything else, Lawrence of Arabia reminds us that Fate is neither kind nor gentle, but often cruel.

And while one can point to the greatness of the film's cinematography, of its direction, script, and score, it was Peter O'Toole who made all of this matter, who breathed life into a role and commanded the attention of all who watched him, from the beginning to the end. This is a feat of artistic greatness rarely--if ever--equaled. Put it all together and what you get is a film routinely recognized as one of the greatest ever made. In 2007, the American Film Institute (AFI) ranked it as the seventh greatest film of all time, behind Citizen Kane, The Godfather, Casablanca, Raging Bull, Singin' in the Rain, and Gone with the Wind. In 1998, it had been ranked number five, but somehow fell down two spots in favor of Singin' in the Rain (number ten in 1998) and Raging Bull (number twenty-four in 1998).

Personally, I think both lists are wrong. I think Lawrence of Arabia is one of the three greatest movies ever made and I cannot--in good conscience--rank it above or below the other two in this group. What are the other two? Well, neither of them is in AFI's top ten, or even the top fifty. As of 2007, one of them comes in at number one hundred on that list; the other is not there at all. The three greatest movies of all time, in no particular order:
Lawrence of Arabia (1962), directed by David Lean and starring Peter O'Toole
Ben-Hur (1959), directed by William Wyler and starring Charlton Heston
The Magnificent Seven (1960), directed by John Sturges and starring Yul Brynner
No doubt, my top three will not sit well with many, many people. I have no Casablanca, no Citizen Kane, and no Gone with the Wind, it is true. And all three of these other movies are exceptional, no doubt. But they fail to move me to the extent that my chosen three do. It's really not even close, in this regard. For if I had to go deeper, to pick a fourth or fifth, I would be hard-pressed not to include The Searchers (the 1956 western starring John Wayne and directed by John Ford) and The Bridge on the River Kwai (from 1957, starring William Holden and also directed by David Lean). Such is the way with movies, I guess; everyone has their own preferences and opinions.

Still, I cannot fathom how the greatness of Lawrence can be so obviously seen by some who apparently close their minds when watching Ben-Hur and The Magnificent Seven. For like Lawrence of Arabia, these other two films are big and bold, to say the least. I think they are beautiful as well, though I fear the beauty of the American West is lost on far too many "experts" here.

More importantly, all three films allow for the primacy of Fate, despite the heroic actions of characters within. All three conclude with less than glorious moments, if not quite tragic ones. Watching these films, we have heroes to cheer for, to bring us to our collective feet, yet we are forced to sit as quickly as we had risen when reality rushes back in the very next scene.

Then of course there is the acting. O'Toole's greatness here is again indisputable. But then, so is Heston's in Ben-Hur and Brynner's in The Magnificent Seven (with Steve McQueen keeping pace, no doubt). It is no accident that these three films appeared so close in time, either. For the three starring actors are of that golden age, when acting was both a vocation and an artistic pursuit for its own sake (to be fair, there are still such people today, but too few of them in my opinion).

Saturday, December 14, 2013

The real legacies of the Obama Administration

We, as a nation, remember political history in chunks based on the identity of the Chief Executive. Even though some things that happen are not necessarily directly attributable to a given President, he or she tends to get the credit or the blame in this regard. And one of the reasons--really, the principal reason--we remember political history in this manner is because that's the way it is taught, from elementary school to graduate school. To be sure, there are a handful of exceptions (like Reconstruction) but for the most part, political history is taught as a series of Presidents. Really, history in general is taught this way, and not just with regard to the United States.

Despite the "great man" theory of history having fallen out of favor in academia (the idea that history can be understood as the consequences of actions taken by larger-than-life figures), the course of history is charted under such a rubric, more often than not. It's a somewhat understandable--if sometimes insufficient--shortcut, for it allows history to be divided into "periods," either with regard to the rule of a single leader or a dynasty.

Looking at just the history of the United States, moments--or even extended periods--when this rubric is found wanting are easy enough to spot. For instance, there is the westward expansion of the 19th century, which spanned over fifty years and the administrations of some 13 or so Presidents, of various political parties and various points of view on this expansion. But there are also more limited events that occurred during a single administration, or perhaps two or three, that had very little to do with the actions or lack thereof of the President or the Federal Government. Like the Dust Bowl of the 1930s. True enough, there was a response to these calamities from the Feds, but one cannot fairly attribute the actual events to the Administration of the moment or to a previous one.

All of that said, there are without a doubt "legacies" that can be pinned to political leaders, at local, state, and Federal levels. And necessarily, the ones that have the widest impact are the ones centered around the most powerful political figure in the nation: the President.

Some Presidents have very clear and very favorable legacies, like George Washington and Abraham Lincoln. The former established standards for Presidential behavior that remain to this day (and are rarely met). The latter held the nation together (by force of arms, it is true), brought slavery to an end, and set the stage for the United States to become the most powerful nation in the world. Others have legacies that are contentious, like FDR, Andrew Jackson, and Thomas Jefferson, with some thinking very highly of what these men accomplished and others thinking their accomplishments are overstated, to say the least.

Then, of course, there are those who simply did little of value, whose legacies amount to nothing if they are lucky or to a series of bad decisions sometimes having severe consequences. This group includes Andrew Johnson, Warren Harding, and James Buchanan. Harding's Presidency was basically one scandal after another and Johnson's legacy is that of an obnoxious Southerner who stood in the way--whenever possible--of attempts to recover from the Civil War and to protect the rights of newly emancipated blacks in the South. Buchanan, whom I happen to view as the worst President in the history of the United States, left a legacy so terrible it is difficult to fully appreciate today. Opportunities to stave off secession were missed horribly by Buchanan; he was truly a man of inaction, especially as compared to Lincoln or Jackson in this regard. And he foolishly tried to halt the westward expansion, which in and of itself only contributed to the unrest in the nation.

Most Presidents have varied legacies; the periods when they held office have things that are remembered positively and things that are remembered negatively. Consider Bill Clinton. The Clinton years are remembered for the economic prosperity of the times, for the sexually-charged scandals that impacted the office of the President, and for the general growth of partisanship in politics and the media. Clinton himself is credited and blamed for all of these things, as well as being remembered for a handful of specific statements and actions that are used to define him, most often by those who supported him.

Really, the legacy of Ronald Reagan is not all that different, except the big scandal of that era was the Iran-Contra Affair. The economic bounce-back from the Nixon and Carter years remains the most critical element of the legacy, though it is very closely followed by the end of the Cold War. And Reagan is--to this day--one of the most heavily quoted Presidents in the modern era, again mostly by those who share his politics.

To be sure, there are also a number of specific laws/policies linked to these Presidents, as there are to most others. For Clinton, there is the aborted "Hillarycare" fiasco, the repeal of the Glass-Steagall Act, welfare reform legislation, and "Don't Ask, Don't Tell." For Reagan, there are his so-called "War on Drugs" policies, his response to the Air Traffic Controllers strike, and "Reaganomics."

With all this in mind, it is perhaps time to start looking at Obama's potential legacy, given that the remainder of Obama's second term is looking more and more like it will be something of a lame-duck period (there's a very good chance Obama will be dealing with a fully Republican Congress after the 2014 Elections). In a year-by-year tally, one could spend pages and pages listing things about the last five years, but one could have done that with Clinton, Reagan, or any other President as well. What I want to do is separate the wheat from the chaff, as it were, pinpoint some very specific things that will be associated with President Obama's time in the White House.

First, there is the economy. Regardless of what one thinks about blame here, the economy has been weak during the last five years. Even some improvement in the next three will do little to change the historical perception in this regard. The period will be remembered as one with a sub-par economy, a recession that began before Obama took office but one his policies failed to impact substantially, one way or the other.

Next, there is the issue of foreign affairs. While the legacy of Obama's predecessor has certainly been found--and will likely continue to be found--wanting in this arena, Obama has done little to distinguish himself. The death of Osama bin Laden, while big news in the moment, will not echo through history in the least, partly because the Obama Administration had already successfully downplayed al Qaeda's significance prior to that moment. The real kicker for Obama is the so-called Arab Spring, which became much more of a winter in short order. The turbulence in the coming years in this region of the world will fall on Obama's shoulders, by and large, just as the conditions in Iraq and Afghanistan continue to fall on those of George W. Bush.

From the standpoint of policy and scandal, there is a lot to choose from. But suffice it to say, I do not think any of the scandals that have plagued the Obama administration--from Fast and Furious to the IRS to Benghazi--will be a part of his legacy. Rather, he will be remembered for two things: Obamacare (which could be very bad or very good for Obama) and the growth of an Imperial Presidency. Obama has taken new liberties with regard to Presidential power; whether one supports him or not, this reality cannot be denied. And like all things in American politics, his actions will be used as precedents by future leaders.

Lastly, there are the things Obama has said, the general demeanor and attitude of both him and other politicians (both those supporting him and opposing him), that will impact his legacy significantly. And no doubt, Obama's skin color will play a role here. But I think that--over time--Obama's legacy here will largely be about the reactionary movements that were spawned and rose to prominence under his watch. This is the stuff that sticks to leaders, because it's the stuff that really defines the times. Reagan looked good in this regard, thanks to the Reagan Democrats. But so did Clinton, in a very different way. Obama? The jury is still out here, but depending on 2014 and 2016, he may be held responsible for a significant period of change in U.S. politics. At least that's what I'm hoping.

Cheers, all.

Friday, December 13, 2013

Pope Francis and Corporatism: the real font of Catholic social justice

In the wake of Time Magazine naming Pope Francis its "Person of the Year," I thought it might be apropos to revisit his recently published apostolic exhortation, Evangelii Gaudium (The Joy of the Gospel). This transmission from the hand of the Pope spends a lot of time discussing what he sees as critical issues with regard to the pastoral duties of the Church, but it also delves into political economy, free markets, and the increasing amount of inequality in the world (at least in the opinion of Pope Francis).

This latter portion of the Evangelii Gaudium has garnered a great deal of attention, to say the least. It has been greeted with joy by many progressives and liberals, as they see it as essentially an anti-capitalist screed, a call to normalize society through the promotion of real social justice, with--apparently--the Catholic Church prepared to lead the way. And in that same light, many conservatives and libertarians see it similarly, except they are none too happy with it, supposing--to put it mildly--that the Pope doesn't know what the hell he is talking about because he is a card-carrying Marxist/communist/socialist.

In fact, I think he completely understands what he is saying and I don't think he's preaching Marxism, socialism, or communism at all. Instead, the Pope is very much speaking from the point of view of one immersed in an ideology spawned among the intellectual leaders of the 19th century Catholic Church: the ideology of Corporatism.

This term--Corporatism--is fraught with perils, mostly because it is now commonly used to label aspects of the current world economic order, almost always incorrectly. Understand that Corporatism proper has nothing to do with modern corporations at all, neither how they function, nor their dependence on or independence from the state. The confusion in this regard--all too common throughout the internet--is largely due to the similarity of the two words: Corporatism and corporation. Both have the same root word, the Latin corpus meaning body, but that's about it.

Corporatism actually refers to an economic (and political) system wherein the people in a society are organized into various groups, based on what they do, on how they make a living. The underlying idea here--and the reason for the name--is that society should be viewed as an organic whole, like a living organism or body, with every person having a distinct role to play in order for society to properly function, to metaphorically live and grow. Thus, one segment of the population should never be--figuratively or literally--under the heel of any other segment. None have primacy in this regard, except of course for the state itself, which is tasked with leadership and control (more or less the head of the body).

These ideas were mostly a product of the Catholic intelligentsia in Continental Europe during the nineteenth century and represent a reaction to the growth of free market economies based on the ideas of Adam Smith and other English thinkers. At the same time, Corporatism was also a counter to the radicalism of Marxist-based ideologies that was threatening the traditional social and economic order in Europe, especially with regard to the role of the Catholic Church. For while there is a strong syndicalist aspect to Corporatism, there is nothing in it that suggests abolishing the idea of private property, much less the absolute leveling of society.

In fact, Corporatism was--when it was initially articulated--very much a conservative ideology; those who fashioned it still looked to the economic institutions of the Middle Ages, the guild systems and feudalism, as their inspirations. They opposed wide open free trade and free markets because they assumed greed would dictate activity, first and foremost. Thus, the Corporatist system was designed to both maintain the status quo and protect those segments of society in need of such protection. Here is a good overview of Corporatism that includes a list of modern regimes that might properly be labelled as essentially Corporatist:

Thursday, December 12, 2013

CATO turns Google Evil (with some help from Mercatus)

Not all that long ago, Google was the champion of the Left, when it came to all sorts of issues. I remember how--under the Bush Administration--Google was garnering non-stop praise from the Left for resisting Federal subpoenas for the search data of its users. Remember that? AOL, MSN, and Yahoo! had all gone along with the requests and provided the data, but Google refused to do so. Interestingly enough, this was so-called "metadata" the Feds were after, as specific user details related to searches were not requested. And of course, the current administration, Dems in Congress, and their Left-wing fanboys in the media are apparently fine with the government compiling metadata.

But then--in 2006--Google was the champ, the defender of the righteous, the technology giant that was not Microsoft, was not AOL, was somehow not even a real corporation (because it wasn't evil). In those days it was still hip to talk about Google, to be a shareholder after its 2004 IPO. Despite its immense size and wealth, Google was still one of the Good Guys, though there were some rumblings growing already.

And that's the way of things for much of the Left. The self-congratulatory narcissistic elitism that is so critical to the modern liberal and progressive world views thrives on the notions of intellectual and cultural superiority. Thus, members of this group endeavor to be on the cutting edge of all things, of technology, of art, of music, of lifestyles, you name it. And when something trendy becomes overly popular, the same people who were so proud of their participation in the beginning of a movement are quick to dump the same and look for something new and less popular to latch onto (while mocking those who are still into it). If too many people catch on to something, it's just not as much fun to be a fan or a proponent, apparently.

I observed this attitude long ago, back in my younger days, particularly when it came to things of an artistic nature. The whole "college band/college rock" movement--that was really the genesis of alt rock--is the quintessential example. Being a fan of one or more bands who were so characterized was a mark of distinction, it made one feel special and "in the know." But when such a band became hugely popular--as was the case for U2, REM, and others--many in the initial core group of fans criticized the bands for "going commercial," criticized the newer fans for being Johnny-come-latelies and not really understanding the music, or both. Similar attitudes are also readily apparent in the film and fashion industries.

I always thought these to be strange reactions. And note, they weren't true across the board. Some people were thrilled to death when the band, filmmaker, or style they loved got more attention. They didn't have a problem at all with the increased popularity; they welcomed it. But in my experience, such people were usually a minority, sometimes a severe one (the more "counter-cultural" the thing, the more protection it received from its elitist fan-base).

The technology boom that started in the eighties has charted a similar course. For a long time, Apple played the part of revolutionary to Microsoft's role as the commercially successful "sell-out" (which I always found to be a bit ironic, given Rush Limbaugh's early, vocal, and consistent support of Apple). And as the internet grew and blossomed, the browser wars began, where one's choice of browsers was actually deemed by many to say something significant about oneself. It was--and I guess still is--a matter of conscience for many on the Left to have an alternative to IE. Personally, I use Chrome. But that's just because I like it better, because I find it to be a cleaner experience. My choice is simply my choice; it indicates nothing else about me, in my opinion.

When it comes to other technology choices, the pattern is much the same. When Google launched as a search engine, it was initially a small player in the field. Since it grew out of a college environment, that was where its usage first exploded (much like Facebook). Its superior techniques led to rapid growth in the market, and its competitors--like Yahoo!, Lycos, AltaVista, and InfoSeek--were left in the dust for the most part. Microsoft launched MSN Search the same year as Google; it has been rebranded several times and now exists as Bing. Despite such challenges, Google remains far and away the dominant search engine. The money raked in by Google through advertising dollars and its IPO has allowed it to expand into many other areas of technology in general and internet-related activity in particular.

Monday, December 9, 2013

Movie review: Out of the Furnace

Last night, I took my 13-year-old son to see Out of the Furnace, the new Christian Bale movie about a steel-worker's hard-luck life in western Pennsylvania, Braddock to be precise. Bale's character, one Russell Baze, is a mill worker whose father is bed-ridden and slowly dying (we're not told the exact nature of the sickness), whose mother died sometime in the past, and whose younger brother Rodney (played by Casey Affleck) is in the army and sees several tours of duty in Iraq.

Rodney has some problems and--later in the film--some serious demons. He's lost, unable to find a place to fit in, and turns to bare-knuckle fighting as a means of paying off his debts. In that regard, he does business with local bar owner and underworld-type figure John Petty (played by Willem Dafoe). Eventually, this brings him into contact with Harlan DeGroat (played by Woody Harrelson), a ruthless drug dealer and all-around criminal who runs a backwoods region in the Ramapo Mountains (in New Jersey).

Russell seems a decent sort of guy, hardworking and responsible with a girlfriend (Zoe Saldana) who is looking to settle down and start a family. He helps his brother with the latter's debts whenever he can and helps care for his ailing father (with the help of his Uncle Red, played by Sam Shepard). But he is--from the beginning of the film, which is actually set around 2008--more or less trapped in a life with limits, in a town slowly dying. That's when he does something terrible, though unintentional, that results in his incarceration for--apparently--five years or so.

When he gets out of prison, he's lost his girl to another man (the Braddock chief of police, played by Forest Whitaker), his father has died, and his brother has gone from bad to worse, is full of rage and seems to be suffering from PTSD or the like. But he pushes forward, tries to reclaim his life in a town that is now almost dead. He returns to work at the steel mill, fully aware that its time is almost over, that it will be shutting down forever in the not-too-distant future.

It is all terribly heart-wrenching, terribly tragic, and terribly real. And things continue to go downhill from here. Rodney's problems escalate; he travels with Petty to fight for DeGroat in the Jersey backlands and neither returns. The remainder of the film revolves around Russell's search for his brother and his quest for vengeance in this regard (with the help of Red). Things don't end well, to put it mildly.

It is a dark, violent film. But the violence is neither of the comic-book kind nor of the action-hero kind. It offers little in the way of humor or hope and is--in many ways--the perfect film for ruining one's holiday spirits. Critics are divided on its merits, as this compilation at Rotten Tomatoes indicates. But those panning the movie are, in my opinion, way off base. Out of the Furnace is an immersion kind of event. One needs to get lost in the film, take it all in, and allow it to settle deeply within. It has rightly been compared to The Deer Hunter; some scenes even appear to be cinematic homages to the 1978 Cimino classic. But it also evokes the same kind of reaction--in me, at least--as Paul Schrader's 1997 film Affliction, starring Nick Nolte.

These kinds of movies are draining experiences, if viewed properly, if allowed to fully permeate one's psyche and spirit. And they speak to the human condition, to the reality of life often being hard and full of pain. My son and I watched the film in a half-empty theater (despite this being the film's opening weekend), and everyone was dead silent from beginning to end. There was no cross-talk, no outbursts of any kind. When the film ended, few rose immediately from their seats; almost everyone remained silent and seated as the credits rolled, slowly digesting what they had just taken in, coming to terms with Russell's life and the realities we sometimes forget about beyond our own gilded cages.

If Out of the Furnace had been made ten or more years ago, I have no doubt that critics would have been hailing it as a major achievement. But in today's world, it is fashionable to be haughty, to suppose a movie needs to rise in service to one's own needs. If The Deer Hunter were released today, I think the same critics panning Out of the Furnace would be similarly unmoved by the former. And that's a real shame. Out of the Furnace is not an easy film to watch, but I'm glad I did. I'm particularly happy that my son watched it with me, appreciated it in full, and declared it to be a great movie, even as he looked distraught and had been quite horrified at various moments during the experience.

As to the "critical" aspects of the film, the acting was superb throughout. Casey Affleck was particularly brilliant and Woody Harrelson gave an Oscar-worthy performance in my opinion. The cinematography and direction were both quite good, as was the musical score (which included some Eddie Vedder).

I know this kind of movie is not for everyone. But for those who can stomach the pain, know that it is a brilliant piece of film-making. Pay no heed to the critics. I predict that down the road--five or ten years from now--Out of the Furnace will be more fairly judged, will endure as a classic. So go see it, if you can stand losing your holiday cheer for an evening.

Cheers, all.

Saturday, December 7, 2013

World Cup 2014: draw, brackets, analysis, and predictions

The final draw for the 2014 World Cup--to be held in Brazil next summer--took place yesterday. The groups are now set and are as follows:
Group A: Brazil, Croatia, Mexico, Cameroon
Group B: Spain, Netherlands, Chile, Australia
Group C: Colombia, Greece, Côte d’Ivoire, Japan
Group D: Uruguay, Costa Rica, England, Italy
Group E: Switzerland, Ecuador, France, Honduras
Group F: Argentina, Bosnia-Herzegovina, Iran, Nigeria
Group G: Germany, Portugal, Ghana, USA
Group H: Belgium, Algeria, Russia, Korea Republic
Most of the immediate analysis with regard to the strengths of the various groups uses the FIFA World Rankings, averaging them for the members of each group to get comparable numbers in this regard. The averages for each group, strongest to weakest (with country rankings in parentheses):
1) 11.25, Group G: Germany(2), Portugal(5), Ghana(24), USA(14)
2) 14.25, Group D: Uruguay(6), Costa Rica(31), England(13), Italy(7)
3) 20.25, Group C: Colombia(4), Greece(12), Côte d’Ivoire(17), Japan(48)
4) 21.00, Group B: Spain(1), Netherlands(9), Chile(15), Australia(59)
5) 22.75, Group E: Switzerland(8), Ecuador(23), France(19), Honduras(41)
6) 24.25, Group A: Brazil(10), Croatia(16), Mexico(20), Cameroon(51)
7) 26.25, Group F: Argentina(3), Bosnia-Herzegovina(21), Iran(45), Nigeria(36)
8) 28.25, Group H: Belgium(11), Algeria(26), Russia(22), Korea Republic(54)
Thus, the United States finds itself in this year's "Group of Death." And our friends across the pond--the English--are in nearly as bad a situation. Meanwhile, Argentina--the number three team in the world--has the second easiest group, according to this methodology. But I'm going to tweak the numbers just a bit, by looking at group strengths with the top team in each group taken out of the calculations, then with the bottom team in each group taken out. Why? Because the top teams are all expected to go through (with the possible exception of the Swiss) and some of the bottom teams are viewed as little more than cannon fodder. So, here are my numbers:
Minus the top team in each:
1) 14.33, Group G: Germany(2), Portugal(5), Ghana(24), USA(14)
2) 17.00, Group D: Uruguay(6), Costa Rica(31), England(13), Italy(7)
3) 25.67, Group C: Colombia(4), Greece(12), Côte d’Ivoire(17), Japan(48)
4) 27.67, Group B: Spain(1), Netherlands(9), Chile(15), Australia(59)
5) 27.67, Group E: Switzerland(8), Ecuador(23), France(19), Honduras(41)
6) 29.00, Group A: Brazil(10), Croatia(16), Mexico(20), Cameroon(51)
7) 34.00, Group F: Argentina(3), Bosnia-Herzegovina(21), Iran(45), Nigeria(36)
8) 34.00, Group H: Belgium(11), Algeria(26), Russia(22), Korea Republic(54)
Minus the bottom team in each:
1) 7.00, Group G: Germany(2), Portugal(5), Ghana(24), USA(14)
2) 8.33, Group B: Spain(1), Netherlands(9), Chile(15), Australia(59)
3) 8.67, Group D: Uruguay(6), Costa Rica(31), England(13), Italy(7)
4) 11.00, Group C: Colombia(4), Greece(12), Côte d’Ivoire(17), Japan(48)
5) 15.33, Group A: Brazil(10), Croatia(16), Mexico(20), Cameroon(51)
6) 16.67, Group E: Switzerland(8), Ecuador(23), France(19), Honduras(41)
7) 19.67, Group H: Belgium(11), Algeria(26), Russia(22), Korea Republic(54)
8) 20.00, Group F: Argentina(3), Bosnia-Herzegovina(21), Iran(45), Nigeria(36)
The order is unchanged in the first case. In the second, Group B jumps to second place--Australia was dragging its average down--groups A and E swap places, and Group H edges just ahead of Group F. Still, these manipulations--again, based only on FIFA's rankings--suggest that the initial order of relative group strength is roughly fair. But here's the thing: the point of the group stage is to get out of the group. What really matters is the strength of each team, relative to the rest of its group.
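For anyone who wants to check or extend the arithmetic, here is a short Python sketch of the method: average each group's FIFA rankings (lower average = stronger group), then repeat with the best-ranked and worst-ranked team excluded. The rankings are hard-coded from FIFA's list at the time of the draw, using 11 for Belgium, which is the value consistent with the 28.25 group average above.

```python
# Group-strength arithmetic: average FIFA rankings per group,
# then again with the top team dropped, then with the bottom team dropped.
groups = {
    "A": {"Brazil": 10, "Croatia": 16, "Mexico": 20, "Cameroon": 51},
    "B": {"Spain": 1, "Netherlands": 9, "Chile": 15, "Australia": 59},
    "C": {"Colombia": 4, "Greece": 12, "Côte d'Ivoire": 17, "Japan": 48},
    "D": {"Uruguay": 6, "Costa Rica": 31, "England": 13, "Italy": 7},
    "E": {"Switzerland": 8, "Ecuador": 23, "France": 19, "Honduras": 41},
    "F": {"Argentina": 3, "Bosnia-Herzegovina": 21, "Iran": 45, "Nigeria": 36},
    "G": {"Germany": 2, "Portugal": 5, "Ghana": 24, "USA": 14},
    "H": {"Belgium": 11, "Algeria": 26, "Russia": 22, "Korea Republic": 54},
}

for label, drop in [("full group", None), ("minus top", "best"), ("minus bottom", "worst")]:
    scores = {}
    for g, teams in groups.items():
        ranks = sorted(teams.values())  # ascending: best-ranked team first
        if drop == "best":
            ranks = ranks[1:]           # drop the best-ranked (lowest number)
        elif drop == "worst":
            ranks = ranks[:-1]          # drop the worst-ranked (highest number)
        scores[g] = sum(ranks) / len(ranks)
    order = sorted(scores, key=scores.get)  # strongest (lowest average) first
    print(label, [(g, round(scores[g], 2)) for g in order])
```

Running it reproduces the three orderings discussed above, including the 11.25 average for Group G that earns it the "Group of Death" label.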

Look at Group B. If we allow that Spain--the defending champion and the top team in the world right now--wins the group, which teams have a real chance of finishing second, of getting on to the next stage? Chile, despite being no pushover, faces a daunting task, for the Netherlands sits comfortably above the group average in all three calculations.

Thursday, December 5, 2013

Inequality, upward mobility, and fairy tales

Yesterday, the President spoke at length on the subject of the economy in general, and on income inequality and upward mobility in particular. He told some stories about U.S. history, detailed how the federal government had saved people time and again from the robber barons of the past, from the evils of the world, from themselves even. He bragged again about how successful his administration has been at creating jobs, at creating economic growth, and went on and on about the issue of education and the various initiatives on the table in that regard. Oh, and he also repeated his cock-and-bull defense of Obamacare, how it's a "market based" solution to the problem of healthcare (even though it's not actually about healthcare, at all).

All in all, President Obama provided a great deal of meat in this speech that one--like me--might take issue with, might endeavor to expose as the nonsense that it is. But I'd like to focus on one passage of the remarks, a portion about the idea of social mobility and how--supposedly--we have a problem in this regard:
The problem is that alongside increased inequality, we’ve seen diminished levels of upward mobility in recent years. A child born in the top 20 percent has about a 2-in-3 chance of staying at or near the top. A child born into the bottom 20 percent has a less than 1-in-20 shot at making it to the top. He’s 10 times likelier to stay where he is. In fact, statistics show not only that our levels of income inequality rank near countries like Jamaica and Argentina, but that it is harder today for a child born here in America to improve her station in life than it is for children in most of our wealthy allies -- countries like Canada or Germany or France. They have greater mobility than we do, not less.
I don't know where Obama got these figures, nor do I know if they are accurate, but let's just assume that they are. Because guess what? They're not all that surprising or unusual. And despite Obama's claims, there's no evidence that the above numbers represent a drastic change from the past at all.

Why, one might ask, are these numbers not surprising, why are they not unusual? Well, to understand the realities here, one must first set aside politics and ideology and think about what is really being measured and observed: change over time from the bottom 20% of the "income ladder" as opposed to the top 20%. And one must also understand the assumptions built in here: that the goal in life--for everyone, apparently--is to either stay in the top 20% or work one's way into it.

It's at this point that reality provides a smackdown. Note that the "bottom 20%" and the "top 20%" will always exist in full. No matter what the government does, no matter what we--as individuals or communities--do, there will never be, there can never be absolute equality of income (or of wealth). Lines can always be drawn, establishing the different quintiles (a quintile is 20% of the whole; there are always five in total). Thus, some 20% of the population will be the bottom 20%, while some other 20% will be the top 20%, with respect to income, at any given moment in time. Why am I harping on this? Because for every person or household moving out of the bottom 20%, another person or household must move into it to maintain the ratios. If all of the members (or children of the members) of the current bottom 20% were to somehow increase their incomes and jump up to a higher quintile, there would still be a bottom 20%, an entirely new group in such a case, but a bottom 20%, regardless.

This is no less true of the top 20%. For every person or household that moves into it, another person or household must drop out of it. Thus, the issue of social mobility is very much a two-way street: one cannot go up without someone else going down (and vice-versa). Politicians, pundits, and academics talk about social mobility as if it were exactly what it is not, as if greater social mobility could somehow increase the size of the upper quintiles, while decreasing the size of the lower ones. 'Tis nonsense, utter nonsense. Yet, this view serves as a basis for policy, for a world-view wherein income inequality can somehow be abolished or at least severely curtailed.
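The zero-sum arithmetic here is easy to verify with a toy simulation. In this sketch the income distribution and the "shock" applied to it are made-up numbers, purely for illustration; the point is that quintiles are defined by rank, so their sizes cannot change no matter what happens to individual incomes:

```python
import random

random.seed(0)
n = 1000
before = [random.lognormvariate(10, 1) for _ in range(n)]
# Every household's income changes--some rise, some fall.
after = [x * random.uniform(0.5, 2.0) for x in before]

def quintiles(values):
    # Assign each household 0 (bottom 20%) through 4 (top 20%) by rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    q = [0] * len(values)
    for rank, i in enumerate(order):
        q[i] = rank * 5 // len(values)
    return q

qb, qa = quintiles(before), quintiles(after)

# The bottom quintile contains exactly 20% of households both times.
print(qb.count(0), qa.count(0))  # 200 200

# And mobility is zero-sum: upward quintile steps exactly offset
# downward ones, so the net movement across quintiles is always zero.
net = sum(a - b for b, a in zip(qb, qa))
print(net)  # 0
```

However the incomes are shuffled, the quintile counts cannot change; only which households occupy each quintile can.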

Monday, December 2, 2013

An Orwellian future arrives in Caesarean section

In April of this year, MSNBC talking head Melissa Harris-Perry put forth the notion that children do not "belong" to their parents or greater families, that they in fact "belong" to the greater community as a whole. Such a rubric of (faulty) understanding leads to an obvious conclusion:
Shades of Nineteen Eighty-Four, of a society wherein everyone is obliged to serve the state first and foremost. Because look what Ms. Harris-Perry is saying: children don't belong to their parents--thus they are not really a part of a family--but rather belong to the community, apparently at the moment of their birth, if not before. And the "private" idea of children is wrong, they are "public" entities, a "public good" if you will. Communal property managed by the state, that is her definition of children.
At the time, Harris-Perry caught a lot of flak for her remarks, as I noted in the piece linked to above. Of course, her comments were mostly about the issue of education, the arguments over responsibilities for the same. Nonetheless, the existence of this mindset--the idea that children are somehow communal property--extends into other avenues, as well. And Harris-Perry is far from alone in this regard, as recent events across the Atlantic should now make clear. Crystal clear.

In the summer of 2012, a pregnant woman from Italy was in England for a training course. At some point, the lady apparently had a severe panic attack in her hotel room. She, herself, called the police, who arrived at her room, spoke with her and her mother (who was still in Italy, but on the phone), and decided to take her to the hospital for her own safety. Given that she was apparently in distress and very far along in her pregnancy, this was a completely reasonable decision in my opinion. Or so it would seem. I'll let Christopher Booker of the Telegraph explain what happened next (and if you're not already doing so, you probably want to sit down, breathe easy, and try to remain calm):
On arrival, she was startled to see that it was a psychiatric hospital, and said she wanted to go back to her hotel. She was restrained by orderlies, sectioned under the Mental Health Act and told that she must stay in the hospital.

By now Essex social services were involved, and five weeks later she was told she could not have breakfast that day. When no explanation was forthcoming, she volubly protested. She was strapped down and forcibly sedated, and when she woke up hours later, found she was in a different hospital and that her baby had been removed by caesarean section while she was unconscious and taken into care by social workers. She was not allowed to see her baby daughter, and later learnt that a High Court judge, Mr Justice Mostyn, had given the social workers permission to arrange for the child to be delivered. In October, at a hearing before another judge, she was represented by lawyers assigned to her by the local authority and told she would be escorted back to Italy without her baby.
Remember, this was in the summer of last year. The baby--a girl--is now fifteen months old, has never even been seen by her mother, and is being put up for adoption, by order of the British Court of Protection. Since birth, the child has apparently been under the care of Essex social services, even though there are extended family members available to care for the child, even though the child was born to an Italian citizen. All attempts by the birth mother, her family, and agents of the Italian government to obtain custody of the child have failed. As things stand, the child's future lies in England, as a ward of the state, until an adoption is arranged to the benefit of some English family.