Category Archives: Topical Decoctions

Houdini Now and Then – Caught on the Web

This article originally appeared in The Mandala Magazine (2:5), April 2012


It’s tough being a fan of the Great Houdini. Your non-magician friends quickly grow tired of hearing you say “Watch me escape from this” or “Tie me up! Tighter!” The patience of your significant other wears thin as you beckon “Look at this photo of the fourth milk can!” And your magician friends who are not fans of HH (a defect we fans describe with the phrase “just doesn’t get it”) are likely to respond with “You know, he wasn’t really much of a magician” or “You know, Vernon fooled him with a double” or “You know, he was sort of an arrogant bastard to… well… everyone.”

Houdini, Germany, ca. 1902 (John Cox Collection)

OK. Yes, we know. Even so, there’s just something about Houdini the man and the myth. And being a fan is no longer about becoming Houdini (though for some it once was). Nor is it about defending Houdini. (Well, maybe a bit.) It’s about appreciating two interwoven themes in the life of Ehrich Weiss: a tragically imperfect pursuit of the American Dream and a splendidly perfect example of magical theatrics. The actor lived a life, not always well, but the character he played projected a fiction, always magnificent.

Weiss came as close as anyone to embodying the formula that Drive plus Opportunity plus Intelligence plus a dash of Charisma equals Success. Ehrich is the little guy, the underdog, the undereducated middle child of an impoverished immigrant family with no advantages. Unpolished, unsophisticated, and unpromising, he falls in love with magic (as each of us has done) and with the stage (as many of us have also done). He tolerates his miserable life in a New York sweatshop by dreaming big dreams and harboring unlikely ambitions. Finally, against all good judgment, he goes for broke and pursues a life in show business. And hundreds of odd engagements and thousands of days later, broke and broken is precisely where he ends up. Then, on the brink of failure and defeat, he’s discovered, coached, funded, and placed on a short path to unparalleled fortune and glory. By cultivating his uniqueness, working hard, and never giving up, Ehrich Weiss becomes the Great Houdini.

In the New York Times, Verlyn Klinkenborg Gets It Wrong

Verlyn Klinkenborg has written an op-ed called "The Decline and Fall of the English Major" in which he starts with his students' inability to write and winds up discerning a "literal-mindedness in the recent shift away from the humanities". The apparent goal of the article is to defend the value of the humanities. However, the editorial has two weaknesses that undermine that goal.

The first weakness arises in the attempt to define that value. The author reduces what the humanities offer to mere writing— to clear composition. "They don’t call that skill the humanities. They don’t call it literature. They call it writing," explains Klinkenborg, who also asserts that undergrads do not know "how valuable the most fundamental gift of the humanities will turn out to be. That gift is clear thinking, clear writing and a lifelong engagement with literature."

So the value proposition of the humanities is reducible to clear thinking, clear writing, and a literary hobby. If that's all the humanities can offer, then why not eliminate every humanistic discipline other than composition and informal logic?

The humanities must be defended, if at all, on a much broader and deeper basis than this. To defend them merely because they build communication skills is to provide a tacit argument for superseding them with more efficient means toward that goal.

This fault in the editorial is joined to another. Klinkenborg writes: "…a certain literal-mindedness in the recent shift away from the humanities… suggests a number of things. One, the rush to make education pay off presupposes that only the most immediately applicable skills are worth acquiring…. Two, the humanities often do a bad job of explaining why the humanities matter. And three, the humanities often do a bad job of teaching the humanities. You don’t have to choose only one of these explanations. All three apply."

Whether these are genuine faults or merely perceived ones hardly matters in view of one overriding concern: if the humanities are so excellent at developing clear thought and clear verbal expression, then why do "the humanities… do a bad job of explaining" their value, and why do "the humanities… do a bad job of teaching the humanities"?

It seems reasonable that if the value proposition of the humanities consists of "clear thought and expression", then explaining the value of, and teaching, the humanities should be a slam dunk (and should be perceived as such). But if "the humanities" do a poor job of explaining their value and communicating their methods, then why believe in the first place that effective communication is a likely outcome of humanistic education?

Note– I'm all in favor of the humanities. Because of my humanistic education, I look askance at weak arguments and outright contradictions. For this reason, I don't like to see the humanities defended by a reduction to "clear thinking and writing" on the one hand and, on the other, by a contradiction of their efficacy at precisely that juncture.

Dogged determination

I once heard Phil Leider say of Francisco Goya that he had only ever truly longed for two things: the career of Diego Velázquez and the love of the Duchess of Alba.

The Duchess of Alba in Mourning, 1797, collection of The Hispanic Society of America, New York

Maybe that's so.

His last duchess Goya depicted several times, most memorably in an enigmatic painting in which her defiant stance seems to contradict the connotations of her mourning apparel. She points down toward the dust at her feet, where some finger — his? hers? — has inscribed "solo Goya".

Seems like something's going on there. He kept this painting among his possessions from the time of its creation until his death in 1828.

However that may have gone, Leider was surely right about Velázquez, the greatest Spanish painter of the 17th century, and maybe the greatest of them all — the painter of whom Ruskin supposedly said that everything he does "may be regarded as absolutely right" and to whom he ascribed "the highest reach of technical perfection yet attained in art."

Why wouldn't Goya want to be Velázquez redux? The earlier artist had lived a charmed life as court painter to Philip IV, under whose auspices he cranked out not only a seemingly endless supply of stock portraiture, but also some of the most psychologically and intellectually compelling images in western art.

It didn't matter what Goya wanted, though. It was not to be. Living through the French Revolution and the Peninsular War, Goya was surrounded by destruction, corruption, incompetence, and folly. Sure, he became court painter — nominally the same position Velázquez had held. But Goya's monarch was an imbecile surrounded by monsters. Recognizing the sad irony of his plight, Goya pulled no punches when it came time to speak truth to power.

Steve Jobs and Machine Beauty

With the Facebook Timeline just around the corner, and with Steve Jobs shuffling off this mortal coil, I'd like to consider what makes some technologies so different, so appealing.

Last night I asked my art history students what was distinctive about the contribution of Steve Jobs. A few compared him to inventors such as Edison or Tesla. A few looked for an answer in his emphasis on design. I joined the second group and challenged the first by pointing out (as The Economist had already done with great clarity) that Jobs had invented none of the technologies or devices for which he's best known: the mouse-driven computer, the digital audio player, the smart phone, and the tablet. But I also pressed that second group with a follow-up question: if his contribution had to do with design, not invention, then just what was the nature of his contribution to design?

The ensuing discussion was brief and stimulating. After the students had shared their views, I shared mine: I think Steve Jobs emphasized machine beauty with such focus and force that he made the artificiality of devices disappear. Calling him "The Magician", The Economist ascribes to him the ability to connect emotion to technology:

"His great achievement was to combine an emotional spark with computer technology, and make the resulting product feel personal."

Almost. It is the relationship we have with ourselves and our own capabilities that is emotional and personal; Jobs introduced into this already extant feedback loop a device which amplifies our self-signal without getting in our way. Rather than wallow in the narcissism of self-admiration as we see our latent powers amplified, we call the device itself cool. But whenever we call a device cool, what we mean is that it can easily make us more powerful in a way we desire. And that's cool.

What is machine beauty? The clearest and most useful answer to this question comes from David Gelernter (innovator and former patent-holder of the Lifestream technology, which has been at the center of consequential litigation involving Apple). Many stakeholders have by now laid claim to this concept, and perhaps we'll have a post here someday on the idiocy of many software patents, the Peter/Paul problems in patent granting, and the incoherence of the very idea of a software patent. For now, though, I want to bracket out the question of Apple's possible employment of Microsoftian market practices. Gelernter is noteworthy here not just because of his technological innovation, but also because he thinks deeply about the usability of machines, about art, and about beauty.

In his terse, punchy book Machine Beauty, Gelernter proposes a simple definition of the factor that distinguishes great technologies: machine beauty is the well-balanced integration of simplicity and power. Consider technologies that consist of devices. A device may be powerful but not simple; it requires the user to learn, study, and practice. A device may be simple but not powerful; it's hardly worthy of attention, so weak is the signal it delivers. And a device may be neither. But the device that manages to empower the user with virtually no learning curve is machine-beautiful.

The iPhone exemplifies this delicate balance. One day there was no iPhone; the next day there was an iPhone. And on that next day, children and elders, techies and Luddites, the deft and the daft— these were all standing around Apple Store displays and using the iPhone, with no instruction, to do things they wanted to do that they had previously been unable to do so efficiently, transparently, and enjoyably. Machine beauty.

Here, then, is a third question: why do we value technologies that are machine-beautiful?

I think it's easier to frame an answer to this question if we think about technologies in the way I recommended in my earlier post on Rodin's The Burghers of Calais:

I prefer to emphasize that technology always stands in a certain relation to the people who use it: technology is anything that amplifies what the human body can already do. A club amplifies the ability to punch. A gun amplifies the ability to throw. A telephone amplifies the ability to shout. A motor vehicle amplifies the ability to run. Clothing amplifies the protective and insulating qualities of skin. Architecture, oddly enough, is large, static, communal clothing. Telecast media amplify vision or audition. The hard drive and RAM of a computer amplify the ability to remember and to calculate. And so on.

Any technology may be understood this way, and therefore anything that acts as a force multiplier on what humans in general can already do may be construed as a technology.

If we take technology in general as any means of converting our existing capabilities into superpowers, then the appeal of a machine-beautiful device is immediately apparent: the power of the device makes us harder, better, faster, stronger, and the simplicity of the device spares us from having to think too much about the device itself. The technology is a nearly transparent biomodification that empowers us to do with facility from now on what we could do only at great pains before.

The distinctive contribution of Steve Jobs, as I see it, is that he created a post-now class of consumer citizens: the Cybourgeoisie.

Inception: A meta-magical matryoshka

This review of Inception contains light spoilers.

We've just returned from seeing Inception at the local IMAX.

Christopher Nolan has created a masterpiece of communication. A sci-fi action drama about lucid dreaming, Inception is an expertly acted, character-driven tale about a man wracked with guilt and regret who wants nothing more than to be reunited with his family. While the special effects deliver in a variety of ways, the film's most satisfying feature is its self-referential plot structure. The plot itself is simple and conventional: a man who stands accused and has no prospect of exonerating himself has to find another way forward, so he assembles a motley team to pull off one last job. The pleasure in the plot structure lies not with the plot but with the structure, a meta-magical matryoshka.

Nolan proposes a nest of stories four layers deep in which the successful resolution of each layer's conflict depends on success in the next layer down. Since each layer operates on its own time scale — lower is slower — the film builds suspense by stretching the spring-loaded telescope as far as it will go and then allowing it to snap back all at once. A focus on the remote becomes insight into the immediate as the audience wonders whether the force of the retraction will shatter the lens that looks out onto reality.

Michael Caine makes a cameo, but the chemistry belongs to Leonardo DiCaprio and the splendid Ellen Page as sympathetic figures whose relationship is a gentle dance of developing friendship and trust. Inception is not a probing exploration of character and meaning, so most of the characters in the film lack depth and predicates. (Not by accident, the film lays a foundation for justifying thin characters and abbreviated context.) But there's enough between them– enough that resists exasperating conventions– to lend humanity to what is essentially a sci-fi contrivance in which the mise en abyme is what really matters.

Nolan's most remarkable achievement here is the clarity of the communication. Despite the complexity of the nested layers, their temporal differences, and the interconnections of plan and potential that motivate each plunge within a plunge, Nolan sustains a clarity of exposition that brings the audience along with enough understanding to appreciate the structure and texture of the journey. By giving each layer its own look and feel, and by providing dialogue that draws analogies to video games, childhood memories, and the art of M. C. Escher, Nolan renders his four-tiered Inferno intelligible and unforced.

Most admirably, Nolan does this without resorting to the pseudo-technical jargon or cheesy special effects that a less effective storyteller might have employed in a risky attempt to acquire buy-in. Instead, he launches a simpler craft and then never lets the win out of his sells. Nolan's simple exposition and clear visual differentiation make navigation a blast, if not a breeze. When the narrative unwinds, the wave is a thrill that tickles the brain. The plot contains no secrets, no twists, and no Shyamalanisms. It's not about guesses, but ingresses. The movie answers the first question it posed and then lands just where the viewer has been taught to expect and hope it will go.

The satisfaction is in the going.

I knew him, Horatio: a fellow of infinite pecs.

In his op-ed on Monday, David Brooks revisited the father of our country and paid wistful attention to the mythic figure's concern for dignity.

When George Washington was a young man, he copied out a list of 110 “Rules of Civility and Decent Behavior…."   They were designed to improve inner morals by shaping the outward man. Washington took them very seriously….  In so doing, he turned himself into a new kind of hero. He wasn’t primarily a military hero or a political hero.

What kind of hero was Washington?  Brooks adopts the words of a historian:

[Washington] "was acclaimed as a classical hero because of the way he conducted himself during times of temptation. It was his moral character that set him off from other men."

To be a political or military hero, one need only win; to be a moral hero, one must seem worthy of the victory.  By 1796, largely thanks to the efforts of Thomas Jefferson, the French neo-classical sculptor Jean-Antoine Houdon had captured this dignity in stone:
Here the gentleman farmer and surveyor, the commander and citizen, stands erect with chin up and rests his left arm on a fasces, a symbol of the Roman republic.  Washington's sword-bearing hand now guides a cane.  His weapon, the sheathed sword of state, hangs opposite on the symbolic post.  One can well envision this Washington declining to become emperor, as the story goes, and choosing instead to step down after his second term for the sake of this nascent democracy.
The conventional wisdom about George Washington is that he was all three: a great general, a beloved statesman, and a prudent, self-governing man.  Nowadays, we still have victorious generals and accomplished politicians.  But dignity, the quality that demonstrates wise self-regulation, has vanished from the scene:

Houdon, George Washington, Virginia State Capitol, 1796 (photo: http://www.flickr.com/photos/wolflawlibrary/, CC BY-NC-ND 2.0)

…the dignity code itself has been completely obliterated. The rules that guided Washington and generations of people after him are simply gone.

Brooks mentions a few politicians who have become all too familiar to us in ways George Washington never was.  He has a point; it is difficult to think of any figure in the public square who maintains that sort of dignity and commands that sort of respect.  To find a suitable analog, we have to turn to contemporary fiction.  Science fiction.  Interlarded with heavy doses of science fantasy.
We have to turn to Admiral Adama from Battlestar Galactica.  As the BattlestarWiki explains:

Adama has the rare combination of qualities that make up a good leader: insight, the ability to naturally command respect, a common touch that enables him to relate to the enlisted personnel under his command as well as his officers, intuition, intelligence, a strong belief in his own abilities, and the ability to take the advice of others. These qualities are reflected in the fact that personnel of all ranks aboard Galactica hold him in high regard….

 

Edward James Olmos as Admiral William Adama

Sure, Adama has his issues.  However, he keeps them in his quarters and always presents a dignified face to his people.  He believes that they deserve nothing less than a steady hand at the helm.  And sure, there are those in his fictional world who question Adama.  There are even some who rebel against him.  But most are fiercely loyal to him.  Even some sleeper agents planted in his crew by the enemy find his character so compelling that they choose to stand with him, come what may.  This loyalty attaches neither to Adama's military victories nor his political maneuvers, but to his virtue.  One close colleague explains the allegiance of Adama's people this way: "They're doing it for the old man!"

When it comes time to stir up dissent, Adama's insidious adversary, the community organizer Tom Zarek, compares Adama's return to that of a Greek god: "Zeus has returned to Olympus."  The comparison is cynical.  The gods are capricious, mad with power, and all too human; their dignity is a sham.  Of course, in the world of Battlestar Galactica, most humans believe in these gods.  The humans are Hellenistic polytheists, while the robots and cyborgs are monotheists– an intriguing domain for thematic development in the series.  So when Zarek compares Adama to Zeus, neither man believes in Zeus but both understand that most of Adama's followers do.  Aiming to offend, Zarek implies that Adama is imperial rather than democratic, the de facto god of his people.

Here, the comparison between perceptions of the real George Washington and projections of the fictional William Adama becomes strained.  For it was quite reasonable to present the founding fathers of the United States by way of Roman republican iconography that reinforces our most cherished political values, representative government and the rule of law.  Right?  But no crackpot would ever, ever compare Washington to Zeus.  Certainly not in earnest.  Certainly not in the form of a gigantic, fantastically expensive, state-commissioned sculpture intended for display in the nation's most hallowed halls.  Right?  RIGHT?!

Not so:

The King is in the Altogether!

Horatio Greenough, George Washington, 1840, National Museum of American History (photo: http://www.flickr.com/photos/wallyg/, CC BY-NC-ND 2.0)

The plot thickens, but I need a drink.  Let's continue in a separate post.

News, nihil obstatrics, and gynecommodity

In the gossip-driven feeding frenzy that keeps alive the tawdry tale of rising and declining wannabe John Edwards (now with video), the New York Daily News wins quip of the day:

Hunter had been hired by the Edwards campaign to videotape the candidate’s movements, but this one is said to have shown him taking positions that weren’t on his official platform.

The commodification of sexual scandal is nothing new, of course, and in times like these more than ever the media are motivated to regard as "news" whatever will maximize sales.  Thus, there's a regrettable tendency to spew rather than eschew.

What's cheapened in the yellowing press, beyond the players' tattered reputations, is a factor arguably worth conserving: the vitality of sexual allusion as a literary device.

For some of their puissance, these worthy tropes depend on indirection– a wink, a nod, a knowing glance.  But in a cultural milieu where everyone seems to say entirely too much altogether, and where even the king is in the altogether, it's hard for prose to play allusively without seeming turgid.

Hillary in Analog

About a week before Christmas, on a particularly slow news day, Drudge posted a photo of Hillary Clinton that had the blogosphere all abuzz. Ann Althouse gathered and summarized the relevant lines of commentary. It seems that some were shocked by Hillary’s weathered appearance, but some were shocked that others were shocked. Some, like Volokh, liked her look, but others didn’t and poked around in search of a double standard. Still others maintained that attention to unflattering photos is nothing new and not aimed only at candidates who are women. By way of contrast and context, Althouse also reproduces a photo provided by one of her readers.

Those lines of inquiry and speculation are interesting, but seeing this photo reminded me of an article by the art historian Sheldon Nodelman. In “How to Read a Roman Portrait”, Art in America 63 (Jan/Feb 1975), pp. 26-33, Nodelman turns to the heyday of Roman portrait sculpture and asks why some portraits seem strikingly naturalistic and unflattering while others seem notably idealized. What he has in mind is the contrast between portraits such as this anonymous bust from ca. 80 BC, now in the Palazzo Torlonia in Rome, and this bust of Caesar Augustus from 50 to 70 years later, now in the Herbert F. Johnson Museum of Art at Cornell University.

Roman Portrait Busts, Republican and Imperial

Nodelman argues that the unforgiving portrait style was strongly tied to Roman Republican values, while the idealizing mode was a mark of Julian, Imperial values. He writes:

Through emphasis on the marks of age, these men call attention to their long service to the state and their faithfulness to constitutional procedures, in intended contrast to the meteoric careers and dubious methods of the individualistic faction-leaders– men like Marius and Sulla, Pompey and Caesar, later Antony and Octavian– whose ambitions and rivalries in the quest for personal power were rending the fabric of the republic.

The notion here is that the naturalism of the Roman Republican portrait suggests the battle-scarred character and immanent service of the person thus portrayed, while the idealism of the Roman Imperial portrait hints at the superhuman character and transcendent origin of the person shown. Age well earned stands in contrast to perpetual, effortless youth.

We’re a far cry from ancient Rome, of course. Neither the cultural conventions nor the political circumstances make a ready match with the United States of the early 21st century. Still, I couldn’t help but wonder whether some such ideological mapping makes sense in the current political season’s wash of portrait imagery.

Among the currently viable candidates for the presidency, only McCain directly thematizes his worn and weathered condition. His self-description as more scarred than Frankenstein[’s monster] suggests a connection to values such as those the pre-imperial Roman elite chose to emphasize. In contrast, much has been made of Romney’s corporate polish, and the candidate himself has jokingly emphasized the importance of not mussing his carefully sprayed hair. On the Republican side, then, the McCain/Romney competition might be understood to break out on lines analogous to those that Nodelman defines.

If, on the Democratic side, Obama seems to fit the role of young, polished, and glowing, perhaps an emphasis on the somewhat wizened Hillary rather than the airbrushed, pore-free Hillary would well serve her goal of drawing a rich contrast.