In Defense of Star Wars (And just a little bit, commercialism.)

In Arts and Education on December 30, 2015 at 9:31 pm


Way back in 2011 I wrote a piece on the intersection of art and commercialism that’s been a steady workhorse for me over the years, driving traffic to my blog, never in large numbers at a time, but a little here and a little there, adding up to quite a bit. With the arrival of the new Star Wars, I thought it might be fun to check in on the topic again, specifically whether or not it’s fair to dismiss Star Wars as less than art.

The short answer is: no.

Sure, it’s made a gazillion bucks, and for that reason alone it’s excusable to approach the film with a healthy dose of skepticism. After all, buck-making is a big deal on our side of the pond, and capitalism ensures the drive to make big bucks. But Star Wars isn’t capitalism incarnate—it’s commercialism, and commercialism, by definition, needn’t suck.

Think of commercialism as a sometimes (but not exclusively) “dirty” art; an art form fallen from the pure faith. Consider: there’s creativity, and discipline. There’s expression, communication, and emotion. There’s also the sharing of experience between creator and consumer. Arguably the only things separating commercialism from so-called “high art” are:

  1. The number of creators behind the project.
  2. The intent for mass appeal.

But these points are both misleading. For instance, the number of musicians on stage producing a Beethoven symphony doesn’t dilute the product. And what artist doesn’t want to pack the house?

Enter the LA Times’ Michael Hiltzik, and this article in which he asserts that Star Wars: The Force Awakens can’t rightly even be called a movie.

The simple answer is that “Star Wars: The Force Awakens,” is not very good. It’s professionally made in the sense that it displays an industrial level of Quality Control. But it’s depressingly unimaginative and dull in long stretches, and — crucially — reproduces George Lucas’ original 1977 movie slavishly almost to the point of plagiarism.

This isn’t to say that it’s not an enjoyable way to spend a couple of hours. If you’re among the millions who plainly have done so, bless your heart. The issue, however, is whether “The Force Awakens” even deserves to be considered as a movie, because it’s not. It’s the anchoring element of a vast commercial program, painstakingly factory-made for maximal audience appeal, which means maximal inoffensiveness. The result tells us a lot about the state of entertainment today, and about the future of Hollywood.

Reading Hiltzik’s criticism of Star Wars, I was struck by its lack of self-awareness, bless his heart. His issue with Star Wars, ostensibly, is that it’s derivative. But there’s a deeper argument being made that badly undercuts the first, and it goes like this: Commercialism isn’t art. Commercialism, by definition, is crap. Commercialism, and the blockbuster specifically, is ruining Hollywood.

Now if this kind of criticism sounds familiar, it’s because it is. To borrow from Hiltzik’s own Star Wars review, his criticism reproduces hundreds of years of bad criticism, slavishly, almost to the point of plagiarism.

Everyone say it with me, one more time: commercial appeal is not synonymous with low-quality art.

Commercial appeal simply means a thing is popular. The problem comes when one tries to assert that commercialism is by definition “not art.” That’s what’s known as a good old-fashioned fallacy by extension. (And if you’re new to this debate, let me introduce you to a guy named Andy Warhol.) But let’s consider the other point: the slavish reproduction of Lucas’ original.

The Star Wars saga has always been about the Skywalker family. Its intent is to tell a generational story. It’s in the DNA of the piece—the sonata allegro form, if you will. Of course it’s derivative of itself. Mix one Skywalker family soap opera, one superweapon threat, some Force mumbo-jumbo and a few lightsabers, and: boom! You’ve got a Star Wars movie.

(Sorry LA Times, I meant to write, Star Wars “anchoring element of a vast commercial program, painstakingly factory-made for maximal audience appeal, which means maximal inoffensiveness,” but jeez, that’s a mouthful—can we just admit it’s a movie now?)

Let’s cut the crap. The movie is a media darling. Critics are largely delighted for the fresh blood, as evidenced by its oh-so-coveted Rotten Tomatoes score. So what gives with the LA Times piece? I should think it’s obvious: some folks won’t be happy until popular art strives to be “high art,” no exceptions. And no, I don’t believe them when they say, “It doesn’t have to be ‘high art,’ it just has to be…[insert something else, anything else].” That’s a non-argument. Constructive criticism evaluates a piece for what it is, not what you want it to be. If your argument is that we can’t call a movie a movie, you’ve essentially tossed reason out the window.

But wait—it gets worse.

Criticism of this kind isn’t just annoying, it’s harmful—it damages the perception of high art in general. And there’s an important place for high art in our society, arguably now more than ever. High art elevates human experience. High art is set above the average fare for good reason—it’s set above by virtue of how damn good it is. Period.

And it ain’t easy to produce. What’s more, it pays shit, and, in any given audience, at any given time, a sizeable portion of the crowd isn’t capable of even giving a damn. Maybe they’re tired. Maybe they’re under the weather. I don’t know, but anyone who has served their time in the arts sector knows what I’m talking about. The fat cat who’d rather write a check to the New York Phil than be forced to sit through a concert. The guy in the third row who thought it was cool to leave his cellphone on while Daniel Barenboim was conducting Mahler. The world is full of people like this, and I don’t say that to judge; I say it to speak to the balance of power between performer and consumer in the realm of art.

What percentage of people who came to hear Bruckner (bad example?) will leave the hall invigorated by the experience? Does the LA Philharmonic—which seems by all accounts to be a vibrant organization—command the same visceral response from its patrons as My Chemical Romance? Is that a bad comparison? You tell me.

Point is, popular art asks very little of its consumer, while high art asks a lot. For most average folks, Art with a capital A tastes like medicine. This isn’t news.

Enter the snobby critic. Having established that it’s hard enough for the average consumer to “appreciate” that low-budget art-house film (and let’s not hold back; let’s say the goal is to really appreciate it), now the snobby critic wants to “snob” me into seeing it? What does s/he suppose my disposition will be, going in?

Fact is, snobby critics drive audiences away from the very art form they claim to cherish. No one fun wants to hang out with a snob, not even other snobs. And why would they? Snobs don’t enjoy much of anything. Nothing’s ever good enough for a snob. The wine is never the right vintage, the risotto always sucks, and there’s always something, barely detectable, wrong with everything. Snobs are the spoiled brats of the art world. Is it any wonder that Hollywood gives us 99 percent superheroes and lightsabers and 1 percent art-house films?

If you’re a snob, and you truly care about the arts, it’s time you recognized that the arts are all one big family. There’s nothing wrong with having good taste. There’s nothing wrong with criticizing popular art. Star Wars, Adele, James Patterson, you name it. Chop away at plot holes, bad intonation, or hired help writing novels with your name on them. Have at it. But there is something wrong with the brand of criticism that says: this book ain’t a book, this album ain’t an album, and this movie ain’t a movie. That’s the definition of intellectual dishonesty, and it ain’t furthering your cause.

Oh—and Adele, if you’re reading—hello.




Know Less, Know More

In Arts and Education, mastery, Polemics, politicians, Politics on March 30, 2015 at 4:15 pm

It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way—in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.

Mr. Dickens, eat your heart out.

We’re a nation of experts, we’re a nation of idiots, we know what we know, except to not know, or know less. Comparison loves antitheses, despises uncertainty.

But we need uncertainty.

Counterintuitive? Maybe, but uncertainty doesn’t always mean indecision.

We’ve come to expect sharky debate. Wait-and-see gets exploited, so it’s avoided, and not having answers looks dumb, especially on national TV. But for the person practicing a principled uncertainty, these are learning moments. Data-gathering missions. Necessary episodes in forming a more complete picture, a picture unknowable until a singular moment in time.

It’s known as the master’s journey.

Mastery resists foolishness. Most of us want results, want them now, but mastery pushes back. Its trajectory isn’t always straight, and it chucks sandbars at you when all you want to do is stretch out and swim. Mastery won’t play by your rules, dang it, but that’s okay, you can play by his, and that’s healthier.

Practice. When we practice, we get better (as my son’s Taekwondo instructor is fond of saying), but practice also teaches us what to value: product or process. Too bad, so sad: Mastery happens to believe in process, believes in it so much he likes to knock us on our butts when we don’t hup two.

Socially, we’re expected to practice (there’s that word again) clearly defined political ideologies, and for the most part, we conform. We consume media reinforcing our world views, we practice anger at those who believe different things, live different lifestyles, vote for different candidates, we’re comfortable (usually) knowing what we know, feeling what we feel, and believing the world would be better off if everyone else could just get with the program.

In short, we’re all too dang sure.

Knowing what we don’t know. We’ve all heard that phrase, or some version of it, and maybe you’re someone who’s a natural at humility. I’m not. I’m a natural blowhard. (Can you tell?) Worse, I must have some kind of internal mechanism, a monthly quota for getting myself stuck out on a limb (at least once, or forty-three times, depends), and then feeling like an idiot. There, I said it. Problem is: I’m a closet theistically-inclined agnostic, a liberal-minded conservative, a superstitious rationalist. In short, I’m a hot mess of contradictions.

But I don’t think I’m the only one. In fact, I think it’s our internal contradictions shaping the world into right and left, believers and infidels, good and evil, because that’s easier to process, and we enjoy practicing easy, because easy. But what happens when we practice?

We get better.

Deep down, we know the world doesn’t compartmentalize so easily. Life is a spectrum of experiences. Sometimes we’re reminded of this, get slapped down, maybe a well-intentioned friend challenging our ideas, maybe we resent looking wrong. (No, not being wrong—are you crazy?) Anyway, we’re not prepared for it. We forgot to practice for this moment. We don’t know what to do. Maybe we get angry—but hey—no sweat: We practiced that, remember?

Uncertainty really does have its virtues. It closes fewer doors, for one thing. Recall earlier I made a distinction between uncertainty and indecision. In music, for instance, that might translate into retaining options; a jazz improviser will have practiced multiple riffs, spontaneously choosing A, B, C, or D, as context informs which way to go. A concert cellist (something I know more about) practices similarly. A passage might go faster or slower when played with others, with one fingering or bowing working better in this context or that, depending. In practicing flexibility, we get better.

Mastery is about (among other things) principled uncertainty. That’s not navel-gazing; it’s simply admitting we don’t know everything. And what’s wrong with that? We can’t know the future. We can’t be right one hundred percent of the time. Uncertainty is a thing to include in practice. It helps us retain options and a facile mind. It promotes a world with fewer experts, and who wouldn’t want that?

Our Republican Congress

In Politics on March 10, 2015 at 6:38 pm

Imagine a Tom Cotton letter during the Cuban Missile Crisis. Imagine a Tom Cotton letter going to Khrushchev.

This really honks me off.

Is it treason? Maybe, if words meant things. Hyperbole notwithstanding, treason is the crime of betraying one’s country. Sen. Tom Cotton (R-Ark.) admits that undermining the President’s negotiations is a “feature, not a bug.” Looks clear to me.

I’ll grant you, betraying the President isn’t necessarily the same thing as betraying the country. But, like it or not, we elected Obama, and his administration is charged with crafting our nation’s foreign policy, representing us at the United Nations, and negotiating (or not) with Iran.

Tom Cotton isn’t.

In fact, there is a Foreign Relations Committee, led by Bob Corker (R-Tenn.). Sen. Corker is trying to cut a deal too: a bipartisan bill ensuring Congress has the chance to approve the President’s deal (really the United Nations’ deal) with Iran before it takes effect. If Sen. Cotton were serious, he’d work with his own government (and/or his own party) before moving on to Tehran’s.

Much has been written about Obama’s so-called “Imperial Presidency,” but perhaps little as revealing of the GOP mentality as this, Sen. Cotton’s short-sighted undermining of the executive branch as a whole:

…the offices of our Constitution have different characteristics. For example, the president may serve only two 4-year terms, whereas senators may serve an unlimited number of 6-year terms. As applied today, for instance, President Obama will leave office in January 2017, while most of us will remain in office well beyond then—perhaps decades.

What these two constitutional provisions mean is that we will consider any agreement regarding your nuclear-weapons program that is not approved by the Congress as nothing more than an executive agreement between President Obama and Ayatollah Khamenei. The next president could revoke such an executive agreement with the stroke of a pen and future Congresses could modify the terms of the agreement at any time.

Most senators stay on for decades. Sadly, that’s just one of the incontrovertible failures we Americans have chosen to live with, along with the Kardashianization of our campaign financing. But the real power isn’t the scarecrow they hold up and call “dictator”; the real power is the loony minority fringe that keeps taking the country hostage, because drama.

On the world stage, this weakens, no, betrays, our country’s best interest. By Sen. Cotton’s logic, any foreign nation negotiating with a sitting president does so at its own peril; nothing from the United States of America can be trusted. Or, short of that, it can be trusted only as far as you can trust the craziest person in Congress.

