Saturday, July 12, 2014

Doc Savage: His Apocalyptic Afterlife (and Somewhat Problematic Comic Book)



This continuous traveling through a Savage land enabled me to see what I might otherwise have missed. The Savage supersagas are apocalyptic.
Philip José Farmer, Doc Savage: His Apocalyptic Life


I have already commented on the first two issues of Dynamite’s Doc Savage adaptation by Chris Roberson and Bilquis Evely, in which I questioned whether the property could be adapted to any other medium; now I have seven of the eight issues in hand, and I will comment on this particular effort as constructively as I can. I will start with the covers by Alex Ross, which emulate those of James Bama, one of the great paperback illustrators of all time, whose paintings graced most of the Doc Savage paperbacks through the late 70s, as well as countless westerns and at least the first of the memorable James Blish Star Trek adaptations. I will then proceed to the story and interior art.

For the most part, Ross’s Bamaesque painted covers hew to the James Bama formula for the Bantam paperbacks of the 60s and 70s. Stylistically, they are all recognizably rendered in Ross’s trademark watercolor technique, which comes as close as humanly possible to mimicking Bama’s oils (perhaps a 9 out of 10); had they been reduced to paperback size the effect might have been greater still. As it is, however, at comic book size there is a certain roughness and sharpness to the technique that cannot be overcome (watercolor demands a spontaneity that does not survive overworking), and so they look like watercolors trying to be oil paintings. Compositionally, the better covers (that is to say, those that most effectively emulate Bama) show the single figure of Doc facing some seemingly insurmountable menace.

Among the weaker covers, from the standpoint of evoking Bama, are covers #1 and #8, which compress time in a mosaic of images or otherwise depart from the Bama formula. The second cover, one of the stronger ones, completely fetishizes Doc’s trademark torn shirt, transforming it into a swirling, flame-like maelstrom. Here Ross is at his most insightful, making explicit, almost parodically, the underlying metaphorical function the shirt served in so many Bama covers: not a literal shredded garment, utterly superfluous in its failure to protect or to conceal, but a motif of lightning-like energy clinging to a superbly well-developed physique. The third cover shows a nuclear explosion, an apocalyptic situation about which Doc can do little, and is therefore a bit more fatalistic than the Bama formula would ever allow. The fourth shows Doc carrying a young Brit punk to safety from a burning field of oil wells while splashing through puddles of spilt crude, the generational juxtaposition presumably providing the interest, but coming off more like a typical Don Pendleton Executioner cover. The fifth cover features Doc in a Sterankoesque pose as a Skylab-like orbiting satellite destroys the earth as if it were Krypton (again, like the nuke cover, a theme that would have been a little too fatalistic for a Bama cover). The seventh cover successfully evokes the cool color schemes Bama often used to such effect, but features a rather weak crowd composition reminiscent of some of the weaker covers that graced the Bantam paperbacks, whether by Bama or other artists.

The sixth cover, however, is clearly the most iconic of the Dynamite series, and perhaps one of the most arresting Doc Savage images ever created by any artist. It certainly ranks as the most memorable of any outside the Bama canon, and outdoes a number of Bama Savage covers as well. It is a metaphoric contemplation of Doc Savage facing a situation clearly distilled from 9/11, showing one horrific aspect of that event as nine airliners nosedive out of the skies at once, Doc powerless to save them. Fatalistic, yes, but not completely apocalyptic, and perhaps summing up the theme of the entire series.
 
Alex Ross, cover to Doc Savage #6, perhaps the most iconic image of the adventurer ever created.

[The alternate covers, most of which presumably are intended to evoke the various comic book iterations of the property, are not as successful, in my humble estimation. I haven’t purchased any of them and I won’t comment on them any further. Sorry to be so dismissive, but them’s the breaks.]

As for the story itself, in contradistinction to the covers, Doc ironically is almost never alone to face or solve a problem by himself. From the beginning, the emphasis is on the team. Just as The West Wing served as a narrative antidote to all those presidential histories in which one lone figure is the main protagonist, this Doc Savage seems bent on showing how reliant Clark Savage, Jr. is on his teams of experts, from the original Fabulous Five to the progressively younger and more racially, ethnically, culturally, and gender-diverse (and more numerous) aides that replace them as they age, wear out, and (off stage, as it were) quietly pass away. As things progress, even these nominally individualized characters (each is given a suitably corny nickname in the tradition of Monk, Ham, Long Tom, et al., but only perhaps the young Brit punk is more than one-dimensional) give way to impersonal cubicled call centers with 1-800 numbers and armies of anonymous analysts and coders, and finally to an automated smart-phone network susceptible to meddling. In fact, the general theme of the story would seem to be little more than a demonstration of how the world has become a more complicated place since the Street and Smith pulps came to an end, and, more explicitly, of how the scientific and technological systems put in place by Doc, as well as his moral philosophy, can be hijacked when put on autopilot.

This conception seems to owe something to Alan Moore’s Watchmen, which featured a Doc-like Ozymandias depicted as a bureaucratic capitalist presiding over an international corporation, who loses his moral perspective as the business structures he has built ostensibly to solve the world’s problems become more complex and unmanageable. In fact, Chris Roberson’s Doc only seems to appear in scenes in which he can moralize and defend his questionable practices, such as the secret Crime College, where criminals are medically “cured” of criminality through a deft brain incision, and Doc’s general practice of working on his scientific breakthroughs in secret and keeping them to himself. In other words, even as Doc’s security network becomes more corporate and bureaucratic, his intellectual property becomes increasingly proprietary, with disastrous results. A major plot element concerns the secret serum that Doc perfects that essentially makes him immortal, but is lost before it can benefit the world.

It is worth pointing out that both the immortality serum and the moral implications of the Crime College are ideas borrowed from Philip José Farmer, who suggests them in his pseudo-biography of Doc, Doc Savage: His Apocalyptic Life. Thus Roberson is, to a great extent, elaborating on a universe outlined by Farmer as much as, or more than, on one created by Kenneth Robeson, the original architect of Doc’s adventures.

It is perhaps less useful to point out that such ambitious themes as Roberson seems to have in mind might have been better explored in prose than in comic book form, and perhaps more freely in a satire such as Farmer wrote on several occasions with his more obsessively sexualized Doc Caliban (particularly in A Feast Unknown), or as Moore does rather succinctly with Ozymandias. In fact, given the “quick-read” mode of the day compared to more densely packed comics of past eras, the entire series so far reads more like a series of truncated scenes from one of the supersagas (Farmer’s term) than like one of the supersagas themselves, and not even particularly pulse-pounding highlights from an average one. The pulp form, after all, is nothing if not one breakneck cliffhanger after another that in retrospect bears little logical scrutiny, while the informational and action throughput of comics these days is little more than a smoke signal. While the series maintains readerly interest, little of the visceral, no-holds-barred pulp spirit of the original stories is in evidence. If one can imagine Doc’s hypothetical career since 1949 as being even half as rich as his monthly exploits of the 30s and 40s, one could certainly imagine distilling a richer and more exciting comic book therefrom. Instead, the Dynamite Doc Savage is a plodding, often slowly paced, and above all hyper-conscious cerebral exercise that reads less like a comic book than like a dry storyboard for what could be a more interesting feature film. One might wish at least for a text page per issue musing at length on some of these themes, along with some historical background on the property to serve as an introduction for new readers and a reminder to some of us old-timers who may not have read an actual Savage in quite a while.

What Roberson seems to have in mind instead is a meta-narrative of sorts that does not so much add onto or adapt the Doc Savage supersagas as take a step back from them to contemplate the more philosophical aspects of the superman-in-the-modern-metropolis theme, in which Doc’s own hallowed belief in inexorable progress becomes the ultimate evil and his adversaries are less and less freelance madmen bent on taking over the world and increasingly former aides who lose faith in Doc and his principles and turn traitor. Doc’s righteous crusade, instead of bringing the world to salvation, promulgates a self-fulfilling prophecy and induces a self-inflicted apocalypse (although we’ll have to wait for the final issue for the outcome). Again, these are great themes that could be better explored in prose, but given the problematics of licensing the Doc Savage property and the marketing prospects of publishing further text novels in that series, it is likely that such a philosophically tinged prose project would be unfeasible, and a comic book adaptation that wants to suggest a movie treatment is the best we can hope for.

Finally, the art of Bilquis Evely, which I commented on previously and which grows progressively more likeable. I have sympathy for the task she faces, evoking several periods of style and architecture, from 1933 to the present. Her Doc paradoxically never rips his shirt, although he looks as though he’s about to burst out of his suit on several occasions, particularly when he addresses JFK’s cabinet. One gets the impression that she would rather draw strapping, mostly naked superheroes (as would we all) than pedestrian fashions, quotidian props, and faithful portraits of famous buildings. Many of her panel and page compositions seem static, owing to the eye-level camera angles and vertical postures of most of her figures, and she would do well to revisit John Buscema’s How to Draw Comics the Marvel Way (the visual codification of the breakneck pulp prose style). Her Doc is hardly dynamic, let alone apocalyptic, but as the first professional effort this reportedly is, the Dynamite Doc Savage: The Man of Bronze is a respectable accomplishment.

Monday, July 7, 2014

The Fig Leaf of Cognitive Training: Navigating Our Mediated World by Conforming to Contemporaneity

In response to “The Crisis in Art History,” which I cited in my previous post, Amy K. Hamlin and Karen J. Leader, in “Art History That! A Manifesto for the Future of a Discipline,” characterize art historians as having “Highly developed visual discernment, a deep knowledge of history, [and] a nuanced understanding of cultural heritage.” They assert, “we need art historians because they are equipped to teach the skills urgently required by twenty-first century citizens to navigate the complexities of a visually driven information age.” The last point, that the study of art history is vital to navigating our mediated world, is related to the rationale I heard in my old department: that art history, unique among college offerings, provides an irreplaceable training in visual analysis. Any criticism of the curriculum, pedagogy, or methodologies currently trending in art history is, for Hamlin and Leader, an attack on the humanistic development of critical cognitive faculties; the real problem, as they see it, is rather the exorbitant expense of acquiring a college education. [1]

This is a disingenuous argument for two reasons. First, while art historians are equipped to teach some of the skills required for citizenship in the twenty-first century, they are neither uniquely equipped nor even the first choice for doing so. If we seriously want to prepare individuals to navigate modern visual media, if not inoculate them against the more subtle forms of visual manipulation deployed by advertising, political campaigns, and visceral entertainment, what is called for is a practical and theoretical course in film editing and theory, and it would be mandatory in every undergraduate curriculum.

For starters, the Odessa Steps sequence from The Battleship Potemkin would be analyzed frame by frame (preferably on an old Moviola), and students would have the opportunity to edit their own footage (and tell their own truths or fabrications) in Adobe Premiere. The theoretical writings of Vsevolod Pudovkin and Sergei Eisenstein would be read and debated (preferably vehemently, in a café), and Indiana Jones and the Temple of Doom, one of the most brilliantly edited movies I can think of, would be dissected for its fluid technical mastery and somewhat crude cultural and ideological assumptions. Static images would be studied as well, particularly in their juxtaposition as storyboards or comic strips. (I am, of course, describing my own early training in the study of comic book storytelling, except that I viewed the Odessa Steps on an 8mm film viewer, and spliced together a few shots on 16mm with pieces of adhesive tape, the old-fashioned way.) Still life and figure drawing would be optional but strongly encouraged, as would the basics of photography (composition, lighting).

To this course of study, art history could perhaps be added as an ancillary curriculum for those wishing to explore the ways in which manual images were made prior to the advent and inexorable conquest of photography and cinematography over the past 175 years, along with studio courses for those wishing to master manual image making (figurative drawing, painting, sculpture) for themselves (filmmakers such as Eisenstein and Fellini, and art theorists such as Meyer Schapiro, were skillful and gifted artists in their own right, and their practice informed their work).

Donald Simpson (American, b. 1961), Light Up! (With Apologies to Tony Smith), 1979. Photo-mechanical transfer on photo paper from found clip art, 8 1/2" x 11". Collection of the artist. © Donald Simpson, all rights reserved.


As for another of Hamlin and Leader’s points, that art historians possess “Highly developed visual discernment, a deep knowledge of history, [and] a nuanced understanding of cultural heritage,” Patricia Mainardi and Pepe Karmel have already disputed it in “The Crisis in Art History.” With the surge of contemporary art study threatening to overtake that of “historical art” (i.e., precontemporary art), the authors see an increasing neglect of historical study and a cheapening of the art history curriculum. Mainardi laments that “the vast amounts of wealth now moving through the world of contemporary art, in museums and auction houses, galleries, and international art fairs” are seducing art history students away from “the libraries and archives of previous generations.” She notes, “Wherever contemporary art studies have become dominant, the same results are apparent.” Students no longer study “the art of different periods and cultures,” but instead focus on the art of the twenty-first century, and almost exclusively on texts written in English.[2] Karmel notes that the average time to complete a dissertation in art history overall is 4.2 years; for premodern topics, the average is 5.5 to 6.3 years; for modern art, 3.9 years; but for contemporary art, only 2.6 years. Karmel remarks,
You interview the artist a few times, you persuade the artist’s gallery to let you see their files and their photo archive (the real-world equivalent of a catalogue raisonné), you read the published criticism, you follow up on the artist’s remarks about texts and ideas that influenced him or her. Then you sit down and write. The resulting text may be very good. It may become a terrific book or exhibition catalog. But it simply is not the same thing as a PhD dissertation in other fields of art history. And the degree it earns should not be a PhD.[3]
What Hamlin and Leader’s (and my old department’s) defense of art history ignores is the overwhelming expenditure of energy, not on training students to navigate our mediated world or even to visually analyze right-wing print propaganda, but on genuflecting before all-powerful art world institutions (including the academic discipline of art history itself). Why would one-of-a-kind treasured works roped off in a museum, or a Jeff Koons guarded by bouncers at London Frieze,[4] best serve as examples of visual phenomena for such study anyway? Art history involves all kinds of fascinating side trips into aesthetic theory, the chemical analysis of pigments, and internecine doctrinal fights, but very little of this is of any practical use to the college student trying to make critical sense of the media barrage emanating from her smart phone. As I said, justifying art history on such grounds is tantamount to advocating the study of the history of world religions as the surest remedy for high blood pressure since we all need a quiet, meditative break from the frenzy of our lives now and then. It is absurd.

Far from a useful training in the navigation of our highly mediated world, art history is currently little more than an indoctrination into the current world of art. It is crucial to make this explicit as the wealth of that art world increasingly seduces academic programs and obtains a stranglehold on them, turning them away from the study of what Mainardi calls “historical art” and toward contemporary product, about which we are urged to “think historically,” as Terry Smith puts it in his ubiquitous writings on contemporary art.[5] Of course, it is not impossible to consider the present from a historical perspective. Indeed, a historical sensibility is desirable; hence the study of history. However, rendering pseudo-art-historical judgments on what is valuable in our present visual culture, judgments that are immediately ratified and reified by institutions with the power, authority, and economic clout to make such choices forever fixed and unalterable by later generations (by the inclusion of certain works in public exhibitions if not permanent collections and in textbooks), is not a historical process at all. Such complicity in contemporaneity is not a critical function but corruption itself. It is scholarship shilling for the current art world.

From this view, the promise of cognitive training can only serve as a cynical fig leaf for what is really going on in art history programs today: the spread of conformity and complicity in the pseudo-cultural machinations of capital.

Notes
[1] Amy K. Hamlin and Karen J. Leader, “Art History That! A Manifesto for the Future of a Discipline,” Visual Resources: An International Journal of Documentation, vol. 30, no. 2 (2014), pp. 138-144; quote p. 139.
[2] Patricia Mainardi, “Art History: Research that ‘Matters’?” (pp. 305-307), in Patricia Mainardi, “The Crisis in Art History,” Visual Resources: An International Journal of Documentation, vol. 27, no. 4 (December 2011), pp. 303-343; quote p. 306.
[3] Pepe Karmel, “Just What Is It That Makes Contemporary Art So Different, So Appealing?” (pp. 318-327) in Patricia Mainardi, “The Crisis in Art History,” Visual Resources: An International Journal of Documentation, vol. 27, no. 4 (December 2011), pp. 303-343; quote p. 326.
[4] See A.A. Gill, “Frieze Until the Numbness Sets In,” Vanity Fair, January 2014, pp. 44-45; p. 45.
[5] Terry Smith, “Contemporaneity in the History of Art: A Clark Workshop 2009, Summaries of Papers and Notes on Discussions,” Contemporaneity: Historical Presence in Visual Culture [http://contemporaneity.pitt.edu], vol. 1 (2011), p. 13.

Saturday, July 5, 2014

Dead End: Why Art History is No Longer (and Perhaps Never Was) an Academic Discipline

When I decided to put my adult life on hold and return to college, I did so for many reasons, including the desire to widen my career options and a more general desire to satisfy my intellectual curiosity. I chose the history of art and architecture as my undergraduate major and subsequent area of graduate research specifically to answer a number of immediate questions I harbored as a life-long artist. On social media not too long ago, I remarked that I deeply regretted that decision from a professional as well as intellectual standpoint, for, while I had diligently finished what I started, and despite having satisfied my intellectual curiosity in regard to particular questions pertaining to art early on, I found the discipline dogmatic, constrained, and for all practical purposes exhausted (art history more so than architectural history, although for all intents and purposes my dissertation was in urban planning history), and wished that I had taken a more general course of study such as history, urban policy and planning studies, languages and literatures, or philosophy. The grass may not be any greener in those disciplines, but after eleven years of mid-life college, I felt entitled to indulge briefly in a bit of buyer’s remorse (although one of my advisors regarded my remarks as a personal and permanent betrayal); in any case the lawns certainly seem wider. I am of the growing conviction that art history as an academic discipline is a completely exhausted field of study that for all intents and purposes could be hermetically sealed, requiring only a few caretakers to tend to the classified, archived, and salted-away extant body of knowledge.

In one interdisciplinary conference I attended just prior to receiving my doctorate, it was claimed that what made art history an indispensable academic discipline was its unique emphasis on “visual analysis”: learning to describe objects verbally and to elicit increasingly probing questions about their facture and purpose, characterized as a vital and necessary skill in our increasingly mediated world. This I thought rather weak tea, a feeble rationale accounting for only a minuscule portion of the skills demanded by the field, and hardly a convincing justification for the immense energies expended on grasping theories, styles, and archives of key works (what used to be called canons). Besides, the same skills can be acquired in English 102 by describing a dried leaf, to say nothing of film or media studies. It is analogous to claiming that the study of the history of world religions is justified because, after all, we can all use a quiet moment of prayerful meditation now and then in this stressful, frenzied world. In other words, on that score, there is nothing about the study of art history that is not shared by many other academic disciplines.

Art history, as I have found it to be practiced, is a narrow and constricted discipline, an academic ghetto. Art history is to history, to paraphrase Mark Twain, what a lightning bug is to lightning. Although there are many histories and manifold interpretations thereof, there is only one art history, dogmatically dispensed and disciplinarily policed. The general narrative proceeds teleologically from the Venus of Willendorf and the caves of Lascaux to the Acropolis and the Sistine ceiling, ending with elephant dung paintings and a shark in a tank of formaldehyde. It is a story that, to say the least, does not have a happy ending.

Donald Simpson (American, b. 1961), Still Life with Bust of Venus de Milo from Pier One Imports, 2007. Charcoal on paper, 18" x 22". Collection of the artist.

Miss Helen Clay Frick, the American art historian and founder of the department in which I earned my BA, MA, and PhD in history of art and architecture, believed art history to have come to an end by the Civil War, or at least that there had been little art produced since that time worth serious study. The mid-nineteenth-century date for the end of art history is not far from a general consensus among figures as diverse as Hegel and Hans Sedlmayr, who, for different reasons, regarded art as having disappeared at some point since the Middle Ages. But one need not be an avowed anti-modernist like Miss Frick; the consensus even of modernists is that a certain tradition of art came to an end in the nineteenth century with the dawn of modernism. This admittedly loose periodization comports with a general view that in the nineteenth century, with the advent of the public art museum and the formation of the academic discipline of art history, art history itself, paradoxically, had come to an end.

Art, as it were, having become historically conscious of itself by collecting and studying its own past, at the same time was inherently incapable of collecting and judging the art of the present, or of adding any of it to art history, at least with the same authority as the unfolding of history itself. To claim certain works of living artists as of historical importance without the passage of time as proof of enduring value, it was clear, would have been to pick winners and losers with the imprimatur of history, a de facto illegitimate procedure. Museums and art history therefore created “museums of living artists,” quarantined holding tanks for new art, so that this new work could be viewed and appreciated by the public but also so that the final verdict of art history (inclusion in the encyclopedic museum) could be postponed at least until an artist’s death. Until at least that much time had passed, the jury was considered still out.[1]

This self-imposed restriction on including living artists in the newly forming encyclopedic art museum was short-lived, lasting little more than a generation or two; but it was sufficient time to create a permanent rupture in art history. Paradoxically, anti-modernist collectors such as Henry Clay Frick, his daughter Helen Clay Frick, Henry E. Huntington, and J. Pierpont Morgan, whose private collections became the founding permanent collections in a network of museums in the United States, essentially starved out a generation of traditional figurative painters by denying them commissions while sinking millions into Old Masters, antiques, and rare manuscripts. This perverse neglect, along with the inexorable conquest of photography, removed any incentive for artists to master such representational skills as perspective and anatomy. John White Alexander, a painter once as famous as Sargent, spent the final 15 years of his life on a mural that, while on public view to this day, languishes in art historical obscurity: The Apotheosis of Pittsburgh, for the Carnegie Institute in Pittsburgh. For this monumental masterpiece, Alexander received $175,000, a large sum for a living artist, to be sure, but a mere pittance approximating the amount a Frick or a Morgan habitually spent on a Houdon bust or a Gobelins-manufactured piece of Louis XIV furniture, which they in fact bought by the carload, sometimes from each other.[2] Artists with representational skills and inclinations naturally migrated to where the money was, namely illustration and other forms of commercial art; the galleries that served collectors and the museums they fed were surrendered to the avant-garde. With friends of traditional art like Frick and Morgan, who needed enemies? Anti-modernism produced its opposite, modernism, almost without any help. In any case, academic art history faced a choice: follow the classically trained figurative artists into new media (print), with its alien editorial-advertising model of patronage, or remain with the easel painters and the familiar system of galleries and collectors (and ultimately museum patrons). And the changing stylistic tastes? Progressivism.

By the mid-twentieth century, museums had gotten over their reluctance and begun adding works of art from the late nineteenth century to the present to their collections and narratives, only somewhat belatedly incorporating modernism into art history. This work was all of a certain ideological character, namely socially progressive in terms of content or avant-garde in terms of form, and its inclusion was strictly on the basis of adding something to the constructed narrative of art history that had never been seen before, like adding newly discovered atoms to the table of elements. What is significant about this move, as Boris Groys has pointed out, is not so much what was added to the art historical narrative but what was excluded: works that were deemed visually repetitive in that they carried on traditional representational practices and/or were created for a popular audience or served a commercial purpose. Groys calls this the “museum taboo”: after the mid-nineteenth-century formation of the museum, new additions to art history could not look like anything that had come before. A new work had to be something apparently different, novel, that in some way expanded the meaning of, or our understanding of, art. Art history is thus reduced to a chronology of scientific breakthroughs, or, as Gabriel Tarde put it, “history above all is a record of inventions.”[3] But this is to make of art history nothing more than an inventory of neologisms compiled in a dictionary for their own sake, regardless of whether or not anyone ever puts these new expressions or idioms to practical use. Indeed, as Groys formulates it (although he does not necessarily endorse this view), the museum taboo virtually prohibits artists from adopting the language or style of another artist.[4] Once something has been invented, like penicillin, the invention cannot be repeated. But while penicillin is still manufactured and in use, who can possibly drip after Pollock?

If such a taboo of apparent repetition, of art not being allowed to “look” like previous art, were extended back in time, it would eliminate much of the art prior to modernism. The Renaissance, the Baroque, and neoclassicism, to say nothing of the classicizing tendency of later Bauhaus architecture, are all reiterations or reinterpretations of classical antiquity that on some level “look” like ancient art. To eliminate works such as Soufflot’s Ste. Geneviève, a work that consciously tried to “look” Greco-Roman, from the canon of works deemed worthy of art historical study would be to ignore how Soufflot sought to outdo the Greeks and Romans in terms of structural engineering and scale. In other words, there is always more going on in art than meets the eye, and the exclusion of decades of representational work from art history on the grounds that it “looks” like the art of the past is more than an irrational taboo; it is intellectual laziness.

But this is only one instance of the double standard that pertains to art created since the advent of art history. Another example would be the grounds upon which commercial illustration has traditionally been excluded from the museum: because it was not created with the gallery wall in mind, but rather for reproduction on a printing press, and not for the delectation of an elite audience, but for a broad public.[5] This denies the fact that Norman Rockwell, trained as a painter alongside legitimate “gallery” artists of his generation, and an assiduous museum-goer, certainly was acutely aware of the gallery wall at his easel, whether he was painting a Saturday Evening Post cover or a coffee advertisement, and nursed a barely concealed ambition that at least some portion of his work would one day grace the gallery wall, while a great many artists of the past, such as icon painters, never had the least intention that their sacred works would ever be exhibited as purely aesthetic objects in a profanely secular space. Indeed, most modern and contemporary work that has been included in art history has the distinction of having been explicitly intended for the gallery wall. If works not so intended were to be expelled from art history, major museums around the world would have to deaccession much of their holdings and would sit emptied and bereft of sizeable portions of their permanent collections.

There is no greater divide in art history than that marked by the rise of art history itself. Premodern art, the only kind thought valuable by Miss Frick, is held to an altogether different set of standards than art since the late nineteenth century, the kind of work that is implicitly subject to Groys’ museum taboo. From this view, premodern or what might be termed precritical art forms a sort of primordial unconscious to the more acutely self-conscious modern and contemporary period. Modern and contemporary art is nothing if not conscious (and critical) of itself and previous art history, positioning itself against the past or freely (and usually without a trace of cleverness) appropriating it. Anti-modernists like Frick saw art history coming to an end with the advent of modernism, while modern and contemporary theorists see art history beginning with the same moment of rupture.

For Arthur C. Danto, modernism and contemporaneity are the two eras surrounding “the end of art,” distinguished by their attitude toward the premodern art of the past. For Danto, modernism is characterized by “a repudiation of the art of the past,” while “Contemporary art, by contrast, has no brief against the art of the past, no sense that the past is something from which liberation must be won, no sense even that it is at all different as art from modern art generally. It is part of what defines contemporary art that the art of the past is available for such use as artists care to give it.”[6] To use a theological metaphor, modernism is the Old Testament and contemporaneity the New Testament in a new dispensation that has transformed the current art history Master Narrative. Within this dogma slight denominational and doctrinal differences can occur, but no great deviations of dissent or heresy.

To return to the example of commercial illustration, art that persists in “looking” like the art of the past, e.g., representational art, is relegated to Visual Culture Studies, and art historians who choose a topic like the posters of Alphonse Mucha are not so much permitted to pursue such research as discouraged from pursuing it, in that they are encumbered by additional, superfluous methodologies pertaining to visual culture. That art historians cannot simply consider such material as a part of art history, with methodologies acquired through the study of premodern art, demonstrates how thoroughly the boundaries of the discipline are ideologically policed.[7]

The problem is not what is included in art history (the elephant dung paintings, the shark in the tank of formaldehyde) so much as what is excluded: mountains of creative visual material that do not suit a preordained set of ideological assumptions and scholarly methodologies. As Raymond Williams writes,
There is great danger in the assumption that art serves only on the frontiers of knowledge. It serves on those frontiers, particularly in disturbed and rapidly changing societies. Yet it serves, also, at the very center of societies. It is often through the art that the society expresses its sense of being a society. The artist, in this case, is not the lonely explorer, but the voice of his community. Even in our own complex society, certain artists seem near the center of common experience while others seem out on the frontiers, and it would be wrong to assume that this difference is the difference between ‘mediocre art’ and ‘great art.’
For Williams, the notion that “‘creative’ equals ‘new’ […] is a really disabling idea, in that it forces the exclusion of a large amount of art which it is clearly our business to understand.”[8]

In such a case, as someone once said (I think it was Mark Kingwell), art history becomes little more than a chronological listing of works whose sole interest lies in the fact that at one time they were considered authentically modern. One has to explain to the undergraduate that Duchamp’s urinal or Joseph Kosuth’s One and Three Chairs was once cutting edge, and that Andrew Loomis was not an “artist” because his work was published in magazines, all of which is dutifully noted without question (other than, “Is this going to be on the test?”).

A substantial survey in the journal Visual Resources entitled “The Crisis in Art History” paints an even more dire picture. Patricia Mainardi points out “the skewing of academic art history and museum exhibitions toward contemporary and away from historical art,” and that “The tail of contemporary art is now wagging the dog of art history,” resulting in a neglect of archival research in favor of superficial gallery-hopping. Pepe Karmel reports that “Quietly but rapidly, there has been a broad loss of interest in older art—meaning art made before 1980.” Both Mainardi and Karmel note the huge amounts of money and students gravitating toward the study of contemporary art, to the neglect of “historical art” and its methodologies, resulting in the loss of a sense of history as well as of tenured positions in pre-contemporary art, and a general dilution and cheapening, if not dumbing down, of the discipline. Still, Karmel sees little choice but for the discipline to increasingly serve this growing market:
“It seems likely that, in years to come, there will be more and more money available for the study of contemporary art, and less and less for the study of everything else,” but if art history does not service this market, “another department will.” Still, Karmel argues forcefully that the study of contemporary art should be relegated to a certificate program in which “There would be a capstone project requiring research and writing on a particular artist or movement. However, this would not be a doctoral dissertation, and the resulting degree would be a certificate in contemporary art, not a PhD. Such a degree would not qualify graduates to teach at a university level.”[9]
But even if the field of art history were to eschew contemporary art for a return to hardcore “historical art,” the question remains how it can be justified as an autonomous academic discipline.

In his early (1939) essay “The History of Art as a Humanistic Discipline,” Erwin Panofsky describes the work of the art historian as a fusion of rational archeology and sensitive connoisseurship. Or as he puts it, “the art historian subjects his ‘material’ to a rational archaeological analysis […] but he constitutes his ‘material’ by means of an intuitive aesthetic re-creation,” rather like “a loquacious connoisseur.” In other words, the art historian “constitutes his object” of art historical study by first recognizing in it its “demand to be experienced aesthetically.”[10] (Panofsky recognized that every art historian may be limited in terms of aesthetic sensitivity by experience and “cultural equipment,” but that this can be broadened by erudition.) As an ideal, this is perfectly plausible—although no example is given, one imagines an art historian going out into the world, digging up a find from a tomb or rummaging through an attic, and recognizing a work of art that both arrests her aesthetic sensibility and demands further documentary investigation.

However, even in 1939, when Panofsky was writing, it was becoming increasingly unlikely that many hitherto undifferentiated objects remained in the natural world, awaiting art historians, uniquely qualified by virtue of their sensitivity and training, to come along and declare them works of art. This is even more so in the twenty-first century, especially for the undergraduate student in a typical survey course, in which any objects or works to come under discussion have already been declared works of art by virtue of having hung in museums for decades or having been included in the latest contemporary art texts. The material of art history already comes pre-constituted, as it were, under what Jonathan Culler calls “the hyper-protected cooperative principle.” In simple communication, we assume that our interlocutors are trying to cooperate with us, that is, make sense, even if we don’t immediately understand their vocabulary or idiomatic phrasing. In the case of literary texts, especially obscure or difficult ones, the assumption is that they are worth study, if only by virtue of being on a course reading list.[11] When a work of art comes to us already in an art history textbook, a museum, or another consecrated artworld space such as an art fair, biennial, or remote location (e.g., Robert Smithson’s Spiral Jetty or James Turrell’s Roden Crater), Panofsky’s aesthetic re-creation never comes into play; indeed, aesthetic sensitivity is not required at all. Art history is merely a form of archival archeology (in the case of historic art) or of gallery- or biennial-going archeology (in the case of contemporary art).

This is crucial particularly in the area of contemporary art, where aesthetic criteria are not even a consideration, and the status of the work of art as such is beyond question. How is one to know whether the object differentiated from the natural world is even a work of art? The answer is that one cannot, and the student who studies this area has no recourse but to simply accept the dogma of their professors and their textbook materials. As Mark Kingwell writes, “Art is simply whatever the art world talks about.”[12] Under such dogma, the study of contemporary art operates not so much under a hyper-protected cooperative principle as under a hyper-protected conformity principle.

Like the American frontier, declared closed in 1890, art history is a closed book.

[More on "The Crisis in Art History" here.]

[1] For the firewalling of living artists from Old Masters in a kind of farm-club system of museums in Paris in the nineteenth century, see Terry Smith, What is Contemporary Art? (University of Chicago, 2009), pp. 39-40.
[2] On the mania for Old Masters and the formation of collections that served as the basis for several large public museums in the U.S., see Meryle Secrest, Duveen: A Life in Art (University of Chicago Press, 2004).
[3] Quoted in Jean-Philippe Antoine, “The History of the Contemporary is Now!” Contemporary Art: 1989 to the Present, ed. Alexander Dumbadze and Suzanne Hudson (Wiley-Blackwell, 2013), p. 32.
[4] On the “museum taboo,” see Boris Groys, “On the New,” RES: Anthropology and Aesthetics, no. 38 (Autumn 2000), pp. 5-17, reprinted with minor modifications in Boris Groys, Art Power (Cambridge MA: MIT Press, 2008), pp. 23-42.
[5] On the unsuccessful campaign to persuade the Metropolitan Museum of Art to collect and exhibit illustration art, see Michele H. Bogart, Artists, Advertising, and the Borders of Art (University of Chicago Press, 1995), pp. 43-47.
[6] Arthur C. Danto, After the End of Art: Contemporary Art and the Pale of History (Princeton University Press, 1997), p. 5.
[7] See a previous blog post.
[8] Raymond Williams, “The Creative Mind,” The Long Revolution (Columbia University Press, 1961), pp.
[9] See Patricia Mainardi, “Art History: Research that ‘Matters’?” (pp. 305-307) and Pepe Karmel, “Just What Is It That Makes Contemporary Art So Different, So Appealing?” (pp. 318-327), in Patricia Mainardi, “The Crisis in Art History,” Visual Resources, vol. 27, no. 4 (December 2011), pp. 303-343; quotes from pp. 305, 306, 320, 323, and 326.
[10] Erwin Panofsky, “The History of Art as a Humanistic Discipline,” in Meaning in the Visual Arts (Doubleday Anchor, 1955), pp. 1-25; quotes from pp. 14, 20, 16 and 12, respectively.
[11] See Jonathan Culler, Literary Theory: A Very Short Introduction (Oxford University Press, 1997), pp. 25-26.
[12] Mark Kingwell, “Art Will Eat Itself,” Harper’s (August 2003), pp. 80-85; quote p. 82.

Thursday, July 3, 2014

Independence Day: Celebrating Non-Conformity

Names have been omitted to maintain an aura of confidentiality. If you can connect the dots, you know too much. Plausible deniability is truth. If the shoe fits, for God's sake, don't continue running around barefoot over sharp tacks.

At the beginning of this past spring break, I posted a series of remarks on social media about some of my quite recent graduate school and college teaching experiences, as well as some general observations on academia and my chosen discipline. By their very nature, these random and in some cases nearly incoherent remarks neglected to dwell on the many wonderful and positive experiences I have enjoyed over more than a decade of college, and my deep appreciation and gratitude for the opportunity—and amounted to little more than letting off a portion of the steam that had built up over various irritations and perceived injustices I felt during that period of time. Initially, I had only intended to make a single snarky quip or two concerning a recent development that had stuck in my craw; but one remark led to another, and another, and another, and within no time I had compiled myself quite a little diatribe. Since no more than a handful of social media acquaintances (none from my immediate academic environment) had offered their comments on this thread, I convinced myself that the conversation had remained private and of no interest to anyone besides those who had directly participated. In any case my remarks would have made little sense to anyone outside of an immediate workplace circle, since after all no names had been used, and the situations described could have only been recognized by a handful of coworkers (and perhaps in the abstract by a few outsiders who were in some way acquainted with analogous stresses and irritations of university life). Still, in reading back the postings the following day, I decided that, in their rough, stream-of-consciousness form, replete with certain rhetorical exaggerations and more than a few unkind characterizations of the name-withheld variety, they were not fit to be left dangling in cyberspace, so I deleted them completely. Having successfully purged myself of a good bit of pent-up negative energy, I promptly forgot the entire incident and enjoyed the rest of my spring break, relaxing and preparing for the final month of school. No harm done—or so I thought.

An irrelevant cartoon from seventeen years ago (my lucky number).


Alas—the following Monday, much to my horror, I learned that some person or persons (whose identity remains both completely unknown and utterly irrelevant to me) had observed the thread, cut, pasted, and converted it into a PDF, and circulated it (reportedly) to “everyone” in my department. My remarks, in other words, had gone “viral” among an inconceivably small and inbred group that included friends, colleagues, advisors, and even a teacher’s aide, few of whom hitherto had ever so much as “liked” any of my other social media postings, and none of whom apparently held me in high enough regard to tip me off that my remarks had become the subject of departmental scrutiny and discussion. Imagine Martin Luther’s embarrassment had his rough and nearly incoherent notes for the ninety-five theses been leaked before he could realize a more refined, final draft and you’ll understand something of my chagrin on stylistic grounds alone. But of course, the story does not end there.

As classes resumed the following week, I was summoned to the department office to be called on the carpet. My ersatz social media musings—which had been made virtually, off campus, on my own computer and utilizing my own internet connection—had been deemed, quite arbitrarily, an appropriate and material workplace issue (this after a long history of ignoring the complaints I had made concerning actual, non-virtual workplace behaviors—ones observed with my own eyes and experienced first-hand). Options such as mandated therapy, the withholding of future letters of recommendation (just as I was beginning my crucial post-departmental job search), the launching of personal defamation lawsuits, and even summary firing were all discussed matter-of-factly as very real possibilities, as though any or all of the above (or threats of retaliation in general) would have been perfectly reasonable and understandable coming from those who presume to call themselves scholars. Indeed, the only reason I agreed to meet was out of a genuine concern that my 130 undergraduate students would have had to suffer replacement instructors for the last few weeks of class—a reckless and destructive action I was convinced the powers-that-be were in a spiteful enough mood to take.

“This stuff is out there,” I was told repeatedly, as if the deleted thread contained nuclear secrets that would eventually and inevitably fall into terrorist hands (ironic that those who profess an admiration for Edward Snowden or Julian Assange have a very different take on the free flow of opinion when it concerns far more mundane matters closer to home). My remarks, only briefly posted on the internet, had a life of their own, or so the reasoning went—a trope conveniently denying the willful agency of those who had, for whatever motives, consciously cut, pasted, and circulated those postings to colleagues, who had indeed “pushed” them even to those who were neither customarily online nor otherwise paying any attention to my social media persona. To underscore the gravity of my predicament, I was reminded of certain policies prohibiting the use of university equipment and networks to circulate materials that could potentially be, among other things, defamatory or in violation of copyright—a complete irrelevancy since, as I said above, I had used neither university equipment nor networks to offer my remarks, but posted at home using my own personal computer with my own internet connection. On the other hand, the person or persons who had cut and pasted my copyrighted postings, probably with the goal of defaming me (it would not have been the first time such a thing had happened in my experience in this happy, collegial environment) and almost certainly by utilizing university equipment, networks, and email addresses to distribute them, had violated this particular policy on several counts—an irony no doubt completely lost on the powers-that-be.

In point of fact, my original remarks were no longer “out there” at all. Of my own volition, long before I was even aware that any colleague had seen them, I had deleted the thread from my social media page and expunged it entirely from the internet, although presumably the PDF still resides on several offline personal hard drives (I have in my possession only a blurred print-out passed along to me by a fourth or fifth party). Little of the contents of my original remarks bears repeating, least of all verbatim, and I am not going to do so now. Consequently, unless you participated in or happened to have seen the thread when it was live online, you will have to take my word for it when I characterize them as the stuff of typical profane griping of the sort commonly overheard in any after-working-hours bar, all but meaningless outside the context of the long-running private conversations in which they originated. In other words, it is material that could only cause harm if intentionally pirated by tattle-tales and magnified by malice, and then only to the extent to which any given remark may have hit a truthful nerve or two (in other words, if the shoe fits, wear it). On the other hand, if you only happen to have read the pirated PDF or learned of its contents through third parties, please be advised that you are culpable in an extremely tawdry conspiracy and paradoxically cannot admit any acquaintance at all with what I’m talking about—let alone feign outrage—without confessing to your own monumental lapse of ethics. In any case, you hardly count as my friend or colleague any more—but only you would know that, not I. Nota bene.

This Stasi-like behavior—the secret surveillance, the informing, the scrutiny of furtively obtained materials, the “telling mommy” and bidding her to take action, to say nothing of the subsequent shunning and other career repercussions I have had to endure since—needless to say, is in itself far worse than anything I could have asserted in my initial posted remarks or even have dreamt of alleging. The entire incident—from the oppressive environment prohibiting any form of criticism to the repressive actions taken as a consequence—speaks to the dysfunctional and poisonous culture that prompted their expression in an uncontrollable and unconstructive outburst in the first place. Although these actions do not retroactively affirm any of my individual complaints explicitly, they certainly do nothing to dispel them, and tend rather to confirm their validity in general—at least as topics that should be discussed and debated among fair-minded friends. In any case, it is utterly reprehensible behavior, especially coming from a group of individuals who presume to call themselves educators, and most of whom, individually if not collectively, I continue to hold in high esteem (including those who, it pains me to think, conferred upon me my degrees). It speaks volumes about the curious phenomenon in which erudite and enlightened individuals—in this case those who in their own scholarship, classroom teaching, and committee advising demonstrate openness, honesty, collaboration, and a willingness to transgress almost any boundary for the sake of critical inquiry—can devolve collectively into an expedient affiliation based on little more than careerism and self-seeking: autocratic, authoritarian, intolerant of dissent, demanding of absolute conformity at all costs, to say nothing of the blatant violations of university policy, principles of academic freedom, and simple human decency that are in abundant evidence here.

I alluded to this matter in another posting on this blog, and was reminded of it recently when I came across the commentary on I Ching reading number 13 (a fateful number for me), T’ung Jên/Fellowship with Men, in which the six in the second place reads, “Fellowship with men in the clan. Humiliation.” Of this, the Wilhelm/Baynes translation remarks,

There is a danger here of formation of a separate faction on the basis of personal and egotistic interests. Such factions, which are exclusive and, instead of welcoming all men, must condemn one group in order to unite the others, originate from low motives and therefore lead in the course of time to humiliation.*

The “one group” in this case are the dissenters, non-conformists, and heretics who cannot keep their mouths shut, some of whom have been among our most valuable educators.

*Richard Wilhelm, The I Ching or Book of Changes, trans. Cary F. Baynes (Princeton University Press, 1950/1967), p. 57.