Tuesday, July 19, 2016

Michter's US-1 Barrel Strength Rye Whiskey

I love American Pie (the song, not the movie), there's no getting around that. I know all the lyrics of course and have many fond memories of Promo Night at the University of Miami Rathskeller, wherein last call was immediately followed by American Pie, oftentimes with some number of partially-inebriated students putting their arms around each other in large circles and singing along loudly with the music.

But that was the eighties. And for the most part, everyone was drinking beer, Absolut, Jack Daniels, or Bacardi (provided they weren't in the restroom doing a line, or in the alley taking a toke or two). When we heard about good old boys drinking whiskey and rye, we generally didn't know too much about it, one way or the other.

So for those unfamiliar with the terminology, rye is actually a kind of whiskey (not whisky, which is the spelling for such spirits emanating from Canada, Scotland, and Japan). And whiskey refers to most spirits that are distilled from fermented grain mash. By and large, most big-name distilleries in the United States produce bourbon. Bourbon is whiskey made mostly from corn (there's almost always a mix of grains used, but if over half of the grain consists of corn, it's bourbon). But whiskeys can also be made mostly with malted barley (malt whiskey), with wheat (wheat whiskey), or with rye (rye whiskey).

Clear?

So, the good old boys weren't really drinking whiskey and rye (which I think many people thought was the name of a mixed drink). They were drinking whiskey and whiskey, or more probably bourbon and rye. And that's still a fine time, still a reason to be hanging out at a dry levee singing "this will be the day that I die"...

Anyway, in general I drink single malt scotch, red wine, or beer (none to excess, of course). I had mostly given up on mixed drinks, aside from the occasional Bloody Mary. But a recent dinner out at a Wynwood eatery with some friends opened a door for me in this regard. The restaurant was R House. Sitting at the bar there, I opted for one of their signature cocktails: a Russell's Reserve Old Fashioned. The drink was served with either bourbon or rye, but knowing a bit of mixologist history, I opted for the rye (Manhattans and Old Fashioneds are properly made with rye). And it was quite good. Quite.

So I started to have Old Fashioneds now and again when I was out. But I soon encountered some problems. Not every bar or restaurant stocks rye whiskey. Worse still, not everyone stocks bitters. So I thought I'd add a bottle to my home bar, thus allowing me to make my own from time to time. After spending a good hour looking at the various ryes at Total Wine, I finally opted for this one, Michter's US-1 Barrel Strength Rye:


If it looks a little low, well that's because I've been drinking it.

Now, if I were reviewing a scotch here, I'd go into detail about its specific characteristics, about things like its finish and the various flavors on its palate. The same is true if this were about a wine. But we are talking American whiskey here. And I'm buying it to make Old Fashioneds (and the occasional Manhattan). So here's the deal, short and sweet: this is an excellent rye, by my standards. It puts the usual big-name American whiskeys (like Jack Daniel's and Wild Turkey) to shame when it comes to an Old Fashioned mix. If you like a good Old Fashioned, this is the rye for you, no question about it. It's perfectly spiced, nice and dry, and mixes cleanly.

All that said, it probably behooves me to go over the proper recipe for an Old Fashioned. Here it is:
Old Fashioned 
2 oz. rye whiskey
1/2 oz. simple syrup
3 dashes aromatic bitters
1 orange (preferred) or lemon peel twist
1 large ice cube 
Directions: Add syrup and bitters to an Old Fashioned glass (a short tumbler), followed by the ice cube, then the rye. Mix with small spoon or stick, then toss in the piece of orange or lemon peel. Sip.
Instead of simple syrup, a sugar cube can be used, but then it needs to be crushed in the glass first with a splash of water. If you're at a bar, you order an Old Fashioned, and the bartender goes for some maraschino cherries, please stop them. That's not an Old Fashioned. Look around on the Net and you might find some people insisting that the drink requires a mashed cherry (and maybe even a mashed orange slice). They're wrong. And by and large, they're also British. The Old Fashioned is an American drink, made with a truly American whiskey: rye. That's the way it is.

And again, if you're looking for a good rye for your Old Fashioneds, Michter's is a fine choice, especially given the history of the distillery. Cheers!

A brief history of plagiarism

Note: I found at least 14 pieces with this same (or nearly the same) title—"A brief history of plagiarism"—on the internet. And the title itself is based on and intended to evoke Stephen Hawking's A Brief History of Time. But I'm still going to use it. So there.
To quote from Merriam-Webster, plagiarism is "the act of using another person's words or ideas without giving credit to that person." Most of us understand this, know what is plagiarism and what is not. We learned about it in school at one point or another, usually from a teacher who told us to never copy someone else's words, but rather to restate an idea or the like in our own words. And it's not enough to just change a word or two in this regard; one needs to reorganize things as well, to restructure sentences, in order to properly restate an idea without plagiarizing someone else.

The root of plagiarism, plagiarize, and plagiarist is the Latin word plagiarius, which means "kidnapper." It is, itself, derived from the Latin word plaga (plural: plagae), which referred to a net used by game hunters. The relationship is obvious: from a tool for catching game, to a kidnapper, to someone who steals another's words. And it is a funny testament to how language works, how it evolves. We often wonder—or maybe even complain—about how a word or one of its derivatives changes in meaning. More often than not, it's because someone used it in a figurative or non-traditional way and that usage resonated with others. Such was likely the case here, multiple times.

The first known usage of plagiarize in reference to word-theft was, again according to Merriam-Webster, in 1621, though "plagiary" (meaning "literary thief") entered the English lexicon some decades prior, perhaps in the 1590s. And in the beginning, the issue was very much a literary one, as the term was used by the playwright Ben Jonson, among others, to complain about his words being stolen. The history of plagiarism with respect to the arts is, by and large, the largest component of the history of plagiarism. It continues to this day, in novels, movies, and music.

And this all points to a problem with my "brief history": there is the history of the term and its use (and how it became a legal term, as well) and then there is the history of the action, itself (using someone else's words as one's own).

The latter obviously goes back much farther in time. For instance, a case can be made that the great poet Homer was, in fact, nothing more than a great plagiarist, that his works were nothing more than the stories told by traveling bards put down into written form with no credit given to those who originated the tales (to be fair, a case can also be made that there was no actual historical Homer). Then there's the Bible (yes, the actual Bible) and the issue of plagiarism within, particularly with regard to the story of Noah and its similarities to parts of the Epic of Gilgamesh. Of course one can adopt a more nuanced view and allow that both are derived from the same source: tales handed down across generations. Still, in terms of our modern understanding of what constitutes plagiarism, there may be something here, as whoever tells a tale first tends to have ownership rights of the same, and those who retell it are usually expected to acknowledge the source.

But I guess the current concerns are more about legalities and people currently in the public eye who use the words and/or ideas of others without attribution. And in this regard, the history really begins in 1710, in England, with the passage of the Statute of Anne (which followed the lapsing of the Licensing Act in 1694, something supported heavily by John Locke, interestingly enough). In short, what happened here was that publishers lost their absolute control over copyrights and the government stepped in to protect the interests of authors (yet another in the long list of firsts for the English legal system). The statute was far from perfect, however, and its flaws became apparent across time (leading to repeal and replacement), but it was the first big step in creating a legal framework that could include the issue of plagiarism.

Let's jump forward a bit, however, and look at the issue of plagiarism in the public (politicians and journalists) arena. Here's a recent story from The Guardian that details the plagiarism woes of current political faces, including Joe Biden, Barack Obama, Stephen Harper, and Ben Carson (also Maureen Dowd). And then there is the plagiarism of noted intellectual giant Fareed Zakaria, the self-plagiarism (yes, that's a real thing) of Jonah Lehrer, along with the plagiarism of political leaders like Rand Paul, Senator John Walsh, and Vladimir Putin. I'm not going to detail the rest of these cases, but suffice it to say that there's little room for doubt in them: all are guilty of plagiarism, the use of another's words or ideas without attribution.

Yet, despite this, they all push on. Most apologized for their "terrible mistake" or "momentary lapse," and ultimately suffered very little in their professional lives for their actions. Honestly, I have to admit that Zakaria's case bugs me the most. His plagiarism incidents spanned years (probably there are more that just haven't been caught) and he should know better, though I think in his case it was just pure laziness; he didn't need the angle, he didn't need to plagiarize, because what he took wasn't all that impressive and he is a smart guy.

But I digress.

The point is that these days, plagiarism seems to be this huge thing in the moment but then quickly recedes and becomes little more than a blip on someone's public resume. And by the way, I'm not interested in hearing about speechwriters and how some of these cases fall on them; whoever gives a speech—and functionally claims it is their speech—fucking owns that speech, in every single way, from every single direction. There's no one else to blame.

As a writer, I want to do a lot of blaming in this regard. I really do. I don't care if someone copies my words or ideas, as long as they give me credit, even if they get a lot more out of the words or ideas. The point is the credit and the intellectual honesty of giving it. In this regard, I'd point to Dan Brown (whose novels I like), The Da Vinci Code, and Brown's failure to credit the authors of Holy Blood, Holy Grail as a source for many of the ideas in the novel. Two of the latter's authors sued Brown and lost, and maybe they should have lost, legally speaking. But it's obvious to me—having read both books—that Brown was using Holy Blood, Holy Grail as a source, and he should have acknowledged this. If he had, I would bet there would never have been any sort of dust-up, whatsoever. And it would have been the right thing to do, regardless (I could tell a similar story using James Cameron, Avatar, and a bunch of other people).

But getting back to the politicos, the issue of harm is meaningful here: is there any when a politician lifts a good turn of phrase or a good story from another politician? Damn right there is. The harm is not only to the person who originated the phrase or the story, but to the voter who is getting duped. And it's a consequence of professionalism in politics, by and large, of politicians employing the aforementioned speechwriters. This is reflected in all of the stories on the history of this kind of stuff now rapidly appearing throughout the media: these "histories" don't go back very far. Most start with Biden. A few reach a little farther back, but not much.

Why?

Well, once upon a time, the best politicians were wordsmiths who excelled not only at giving speeches, but also at writing them. As the latter function has been more and more frequently contracted out to speechwriters, the incidence and likelihood of plagiarism has increased dramatically. The puppeteering in the political realm is reaching new heights, I think, and it's not just in this arena. It's also in the creation of legislation proper, wherein our elected leaders employ others to not only draft legislation, but to also read the legislation drafted by others (so they, the elected leaders, can understand it).

I'm not going to offer a solution for these larger issues here, but when it comes to plagiarism by politicians and the like (including journalists), the answer is a simple one: we need to stop accepting it. Again, we're outraged in the moment, but that moment quickly passes. To use Biden as an example, he committed plagiarism on multiple occasions, going back to his law school days. He got caught multiple times, as well, and was forced to admit to his mistakes (really, I think he should have been expelled from law school, but that's on the school, not on him). One can say "good enough, he admitted he erred and we are all human," and that's true; he gets to move on with his life. But I am of the opinion that these sorts of transgressions should represent a death knell for public service, for elected offices. Biden—like others caught plagiarizing—needn't be pilloried for the rest of his life, but neither should we, the voting public, need to suffer such a person in office.

And it's here that people truly get the government they deserve. Because there are enough people willing to gloss over these incidents simply for partisan reasons. We need to wake up. Biden's political career should have been over. There's nothing so special about him; there are more than enough qualified people who could have taken his place. Ditto for Rand Paul and the other politicians caught plagiarizing. And double ditto for the journalists who did the same; these people pursued careers wherein honesty is a prerequisite. They effed up and should, I think, have to move on to something else. And we, as citizens, should have the integrity to force their hand in this regard, by not voting for them, by not listening to them, by not reading them. The fact that we don't means we'll just keep getting more of the same.

Friday, July 15, 2016

The return of Larry Darrell

Arise! Awake! Approach the great and learn. Like the sharp edge of a razor is that path, so the wise say—hard to tread and difficult to cross.—The Upanishads, 1.3.14
Life speeds away. In a moment it's gone, all of it, from the monuments people spend their lives building to the relations they spend their lives ignoring or forgetting. We look now to the implements of modernity for all things, for work, for friendship, for recreation, for security, and for connections to the world around us. As I sit writing these words, on one of those devices—a laptop—I periodically glance at another—a cellphone—to be sure no one is trying to reach me, to be sure that I am—for the moment—unencumbered by the world around me. Yet, I could just as easily detach myself completely, could turn the latter off and discard the former for a stack of paper and a pen. But I choose not to.

Outside, the sun burns brightly on a summer's day; it's late afternoon and, surprisingly at first, there are people out and about, on foot and on bicycles, traversing local paths and filling local parks. A renaissance of outdoor activity, it would seem, nonexistent a mere seven or eight days prior, especially in the humid City of Dis that is South Florida. What has changed? Very little, in the physical world. But in the virtual world, Pokémon Go hit the app market four days ago and now occupies the top spot for downloads, both in the Google Play Store and in the Apple App Store. And it's drawn millions of gamers out into the physical world for the promise of benefits in the virtual world. Because to play, users need to physically walk around searching for wild Pokémon to capture. Tweens, teenagers, and young adults are playing, mostly. But there's no actual age limit, as many adult professionals are fully engaged, as well.

Yet, the last week or so has also seen crowds in other places, unhappy crowds congregating for very different reasons than the promise of a Bulbasaur, a Charmander, or—for the very lucky—a Pikachu. The nation's soul has been rocked once again by a series of killings: first several cases of police officers killing black suspects—I'm being generous with the word "suspect," because as far as anyone can tell, one of the men killed did absolutely nothing wrong, whatsoever—in Baton Rouge and St. Paul, then a retributive attack on police officers in Dallas by a black shooter who was "looking to kill white people," especially those wearing blue. Predictably, there have been widespread protests over the former incidents in a number of major cities, along with non-stop outrage on social media and from the talking heads on cable news. And from the not-too-distant past, the massacre in Orlando still afflicts the conscience of the nation.

Beyond that, there is Brexit, international terrorism (last night in Nice, France, over eighty people were killed in an attack by another "lone" terrorist), the continued war against ISIS, the Zika virus, corporate greed, Russian oligarchs, Chinese sweatshops, Somalia, and the United States Presidential race between a widely disliked Washington insider and a self-important, self-promoting tool who likes strutting his stuff in the WWE.

And then there's the day-to-day struggle to just survive, to put food on the table, to make ends meet, which—depending on the "where" and "who"—runs the gamut from true life-or-death decisions to worrying about one's market positions.

How can someone wrap themselves up in a pointless cellphone game—there really are no winners or losers in Pokémon Go—when #BlackLivesMatter? How can one worry about these relatively limited incidents in the U.S. when thousands upon thousands are dying or becoming refugees from war-torn areas in the Middle East? When millions upon millions are suffering all over the world, when children are going to bed hungry, when so many lack access to things like clean water or healthcare? When Climate Change threatens the world-as-we-know-it and the billions who live on it?

Where are the World's priorities? Where should they be?

But "the World" is not an entity. It never has been and never will be. The collective consciousness of "the World" or of a people, of a country, of a city is myth: there are no Zeitgeists, cultural or otherwise. There are only individuals, each with their own individual point of view, their own individual concerns and needs. And by and large, the vast majority of individuals on the planet instinctively know this; even most of those who are outraged on Facebook, on Twitter, or anywhere else are predominately driven by their own selfish desires.

That's why they play Pokémon Go and Angry Birds, that's why they watch Dancing with the Stars and go to baseball games. They live for themselves (and their loved ones, sometimes) and limit their selflessness to moments when it's convenient to be selfless.

I'm sure many people reading this are thinking I'm being overly critical, that I'm holding people to far too high of a standard. But that's not really it; the problem is that people have unrealistic views of the world, unrealistic expectations of their own agency in this world, and use both of these things unfairly as a cudgel against others, against governments, against political parties, against all kinds of groups, against corporations, and against all of mankind in general. They imagine that their empathy—and there's nothing wrong with empathy—when coupled with their support of this cause or that cause, with their liking or retweeting of this hashtag or that hashtag, has an efficacy it does not and cannot have. And at the same time, they imagine that the above is "enough," that in doing the above, they meet some sort of requirement that confirms they are "good people," leaving them free to do what they will with the remainder of their time and their lives, be that playing games, taking selfies, watching Netflix, or thousands of other pastimes. Oh, and of course going to work and getting paid, taking care of themselves and their family, going shopping, and basically just living life.

And they are fooling themselves in this regard. They're not good people or better people (than someone else) by virtue of their empathy and limited support of a cause. They're not bad people or worse people, either. By and large, they're just people. We all are. An individual life is a temporally limited thing. And it can come to a quick end in a variety of ways. When it does, Karma rarely—if ever—plays a role. People like to think that this is not the case, but that's functionally a means of feeding their own egos, of imagining—again—that there's significant efficacy to their lives, with regard to the universe as a whole.

All that said, life is not a zero-sum game. Because, contrary to the philosophy of Ricky Roma, we don't actually keep score in life; he who has the most stuff (or the most fame) doesn't actually win. Life is a series of moments, of feelings produced by those moments, and the only score available is wholly internalized.

Do you feel good about yourself? That's the real question to ask. Are you happy in the moment? Why? Why not?

But it seems to me that people are becoming progressively worse at introspection, at seeking fulfillment and happiness from within. That is, I think, largely due to consumerism, to the non-stop bombardment of stimuli from the world. For my generation and those who have followed, the day is filled from start to finish with stuff, with things to do, to see, to hear, to experience. Moments of relative peace and calm during the daily bustle of work and school have disappeared, have become opportunities to check Facebook, to play Pokémon Go, to tweet something insignificant, to talk about the latest episode of Game of Thrones or of Orange Is the New Black, or to buy a ridiculously over-priced coffee drink. Downtime means more of the same, or perhaps some binge-watching of one old TV series or another.

And there's an element of selectivity in all of this (when it comes to media), as well: people are becoming used to getting exactly what they want when they want it, thanks to video services like Netflix and to DVRs. Aside from the unbridled enthusiasm of waiting for the next episode of an over-hyped series or for the next big superhero movie, there are no pleasant surprises anymore, there's almost no channel-surfing, even. It wasn't all that long ago that people were talking about the over-abundance of choices on TV because of the advent of Cable. 100+ channels meant 100+ choices. That ship has sailed. The choices now—because of streaming services—number in the tens of thousands, easy.

There's always something to do or to watch, because of technology and a wide-ranging consumer economy. Always.

Kilimanjaro, from Sierra Mountaineering Club
Why bother to think, anymore? Why bother to examine oneself or ponder something as mundane as the meaning of life? Journeys of self-discovery have been replaced by binge-watching, MMOGs, Vines, and pictures of food. The wonders of the world are available in super high definition with the touch of a button; one can be amazed without ever leaving home. And when one does leave home, virtual reality is trumping actual reality now.

We are losing something important, I think. The sheltering sky is almost gone, as is the top of the world, and the cradle of civilization, as sources for deeper thinking and understanding of the world and of who we are. Living is becoming progressively easier, while Life is becoming harder.