Showing posts with label education.

Saturday, August 20, 2022

The Three Rs: Rat, Race, Writing

In a 2000 interview, Doris Piserchia (1928-2021) makes a number of quite revealing statements about the life of a mid-century, mostly straight-to-paperback SF and fantasy author:

Saturday, November 14, 2020

Other People’s Plato’s Caves

The Allegory of the Cave is told in Plato’s Republic 514a–520a. Briefly, it says,

In a cave deep underground, a group of prisoners are chained to a bench in such a way that they face a wall. The only light from the tunnel leading up to the surface behind them projects murky shadows on the wall; these shadows the prisoners mistake for substance, reality.

Friday, October 16, 2020

Friday, September 18, 2020

J.K. Rowling, Aging Face of a Zealous and Growing Ignorance Movement

It’s a shame seeing J.K. Rowling become the face of a hate movement, and worse, an anti-intellectual movement of sycophants whose rhetorical tactics (fallacious reasoning) are on the level of snotty seven-year-olds. “Back up your assertions!” they squeal, while liking people with TERF in their handles.

Sunday, March 24, 2019

Too Secure for Words: Academia's Plain-Language Problem

I recently heard a news story on WESA-FM, the National Public Radio affiliate in Pittsburgh, about a coding program at the University of Pittsburgh. It seems that some zillions and zillions of jobs are going unfilled nationwide, and some eight thousand in the Pittsburgh area alone. The story said that mid-career professionals contemplating a career change were the ideal applicants the program was looking for.

Saturday, February 9, 2019

PCAM: 21st C. "Arts" .org Too Ashamed to Mention Drawing, Painting, or Sculpture by Name

If you want another sign of how completely debased the word "art" has become in our twenty-first century civilization (not to mention the intellectually corrosive effects of an MFA in the visual arts), herewith the Friday, February 8, 2019 email announcing a new local arts .org (note the words drawing, painting, and sculpture are completely absent):

Friday, February 8, 2019

Spectrum Disorder: Whither Drawing? Part 2

Another sign that drawing is withering away from our culture: The newly-rebranded Pittsburgh Center for Arts and Media, ostensibly a fine arts .org composed of the ashes of Pittsburgh Filmmakers and the Pittsburgh Center for the Arts, issued a press release today touting its "agenda for advancing excellence in film, digital video, photography and the spectrum of visual arts." Drawing, painting, and sculpture, once mainstays of classes at the old PCA, are never mentioned by name, presumably falling under the "spectrum" category.

Presumably, such quaint traditional arts are now too insignificant to break out individually.

Update: read the entirety of their press release here

Full disclosure: I took all three kinds of classes and taught several cartooning workshops there myself over the decades.

Some latter-day student work from the Carnegie Museum of Art adult studio program, before 2014.

This demotion of actual art in favor of recording media follows the news of the closing of the Art Institute of Pittsburgh, flagship for a chain of design schools that abandoned traditional art in favor of digital animation and other newfangled media at the turn of the millennium. (I once attended and taught there as well.)

Less than a year ago, the Toonseum shuttered its downtown gallery location and entered what was described as a year-long "curtains drawn" hiatus. Whether it will ever draw anything again besides curtains remains to be seen. (I was shown there and participated in a drawing workshop.)

Less than six years ago, Pittsburgh's Carnegie Museum of Art discontinued its adult studio art classes, including drawing. (I taught several cartooning, drawing, and sketching workshops there.)

As I remarked on Facebook, Pittsburgh, once a haven of culture, is becoming a drawing desert.

More: Whither Drawing? Part I

Saturday, December 8, 2018

"Lets Me Do My Thing!": The Mystery of Alyssa G. and Her (Un)Broken English

It's been obvious for a long time that the internet, and social media in particular, have brought out every form of kook, conspiracy theorist, and beyond-the-fringe nutjob with their own idiotic take on the world. On my Facebook page, for example, fans are thrilled to have located the creator of Megaton Man, a comic book series they enjoyed as a teenager, but their very next post is about how I'm a libtard for not caring about John Podesta's emails.

But I wasn't quite aware how far this mass insanity has spread until last week, when I came across one self-styled social commentator bold enough and ignorant enough to have made up his own grammatical rules to fit his conspiratorial world view, one in which evil corporations are not only taking over his personal Matrix but trying to staff fast-food restaurants with grammar-challenged immigrants.

What sparked his ire was a particular job recruitment poster he saw at a McDonald's somewhere in the northeastern corridor of the United States. In it, a young woman of presumably Latino ethnicity named "Alyssa G.," clearly enjoying her day off in a pink tanktop and blue yoga pants (and presumably listening to an NPR podcast on her device), declares, "Today my job let me do my thing."

"Today my job let me do my thing." It's called the past tense. But to Our Social Commentator, it's a conspiracy not only to recruit and exploit minorities, but also a sinister plot to spread "broken English." (Also Spanish, since this version of the poster is bilingual; so much for McDonald's secret scheme to appeal to Latinos, which Our Social Commentator is convinced he has single-handedly uncovered!)

Our social commentator created a video of this, with a very creative three-minute handheld shot of the poster affixed inside the glass door of a busy McDonald's. While we get seasick watching this cinema verite image, he reads the tagline from the poster over and over again, slowly, in a mock-Hispanic cadence, convincing himself that "my job LET me do my thing" is "broken English." Not only is McDonald's Corporation, in his view, intentionally appealing to Latinos from "down south" to come north and work for them for less than a liveable wage (and take away gainful employment from "legal" American citizens, as he repeatedly asserts), but they are also encouraging bad grammar.

Of course, "Today my job let me do my thing" is perfectly correct written English, grammatically speaking. It's called the past tense. "Today my boss let me have the day off; I went for a jog and listened to iTunes; I did my thing rather than salt french fries or stand for eight hours at the take-out window. Today my job let me do my thing."

(One could, arguably, insert a comma after "Today." "Today, my job let me do my thing." But why quibble?)

Not only is Our Social Commentator an illiterate stooge, he's also an inept cameraman with no sense of artistic composition. Technology enables such mental mediocrities to seem reasonable and "part of the conversation."

Presumably her job didn't let her do her thing yesterday; maybe tomorrow it won't either. Maybe she'll have to go in to work tonight and perform oral sex on her (white male) boss to get the day off she wants next week (the comments on Our Social Commentator's video posting make even worse misogynistic, racist, and hateful remarks about "Alyssa G.," believe me). But today, her job let her do her thing.

Our social commentator, however, insists that his willful misreading of the phrase amounts to "broken English," and demands for the sake of Civilization that the word "let" be corrected with an "s" on the end, so as to read "lets." "Today my job lets me do my thing" would be his amended phrase.

However, "Today my job lets me do my thing" makes no grammatical sense whatsoever. In the simple present tense, which is what "lets" is, her job would have to let her do her thing every day, not just today. "My job lets me do my thing every day." In fact, McDonald's already has a variation of this recruitment poster that reads, "My McJob lets me do my thing."

"My McJob lets me do my thing." Since the letting isn't confined to just today, it's also perfectly correct grammar. It's called the simple present tense.

Presumably this applies not only to today but to every day.

The only way "Today my job lets me do my thing" would make grammatical sense is if the person speaking were a senior citizen. "Fifty years ago, I had to work sixteen hours a day in a coal mine. But today my job lets me do my thing. That's because I'm basically retired and sit around all day sipping coffee in a McDonald's." In other words, "today" would have to mean "nowadays." And it is hard to imagine how a young woman going for a jog on her day off would be using the word "today" in that sense.

McDonald's also has a recruitment poster with two other imaginary employees. "Join our team," it announces. One employee, a woman in a blazer, chirps, "Today my job got me promoted to general manager." A second, a hardworking student, says, "Today my job got me two credits closer to my degree." The third, our lovely Alyssa G., repeats her familiar tagline, "Today my job let me do my thing."

Which of these phrases is "broken English"? That's a trick question: none of them. All three are perfectly grammatically correct. It's called the past tense. Written communication never ceases to amaze!

I commented on Our Social Commentator's handheld video clip. I wrote, "You're quite the grammarian. The phrase is perfectly correct as is."

His response was, "No it wasn't, asshole."

Now, a phrase is either grammatically correct or it isn't; it's not a question of is or wasn't.

Which leads me to think not only that Our Social Commentator (who is a self-professed Right-Wing bigot, I should mention, in case that wasn't already clear) is inventing "broken English" in commercial messages where none exists to suit his conspiratorial world view, but also that he seems to have a serious learning disability (possibly dyslexia), one that prevents him from recognizing and distinguishing verb tenses in written English.

No doubt, McDonald's knows who they want to appeal to with their recruitment posters. And maybe they do want to staff their counters and drive-through windows with underpaid illegal immigrants just to fuck up my man Commentator's Extra Value Meal order. But I think it's safe to say that McDonald's Corporation, or its advertising creators, at least know the correct usage of present and past tenses.

In a world of ignoramuses with smart phones, subscriber channels, and silo thinking that is impervious even to objective Standard English usage, that is some reassurance at least.

Time was when hate-mongers, crazies, and other morons who shouldn't be let out on their own recognizance had to resort to cutting letters out of magazines (to compose ransom notes), or had to type out their ramblings (chain letters and other documents of their delusion) on portable typewriters, replete with misaligned text and worn-out ribbons. Such communication, on its face, looked amateurish; it was invalidated and dismissed by minds of average intelligence a priori.

Nowadays, slick technology comes with designer fonts, automatic alignments, and reasonably professional results, even if the operator doesn't know how to hold their smartphone still long enough to make their ignorant assertions. To discern the lies and insanity from legitimate communication requires of us, more than ever, critical thinking. That, and a sharp eye for detail. Luckily, the shitheads still give themselves away because the elements of basic grammar will always elude them.

"Leaves me alone and lets me do my thing!" Okay, pal.

Monday, November 19, 2018

Bleak House, Part II

I finally finished Bleak House - although much of the final third was a rough slog. As warm, direct, and unassuming as I found Esther Summerson's first-person narrative, I found Dickens's objective, cynical, sardonic present-tense narrator at times impenetrable. The syntax was garbled, not so much because of the present tense as because Dickens was trying too hard to be cynical and sardonic.

The only function of this narrator seemed to be to tell portions of the story that Esther herself could not have witnessed - and to remind the reader, heavy-handedly, that Dickens is after all a satirist. These passages would have been better had Dickens not tried so hard. (This objective narrator is at his best at such moments as when, late in the book, the steel manufacturer Rouncewell converses with his long-lost trooper brother, George.)



Esther Summerson as narrator, for all her warmth, is every bit as penetrating and insightful - of such characters as Skimpole, Mrs. Jellyby, and Mr. Turveydrop - as is Dickens's presumably "objective" narrator, without the bite, and without seeming to be aware of her often sarcastic and critical transcriptions. The characterization of the seemingly roundabout but in fact relentless, Columbo-like Inspector Bucket, for example, is completely consistent between the two narrators, offering no difference in point of view. Bleak House would have been a better book if told completely from Esther's generous (but far from uncritical, as it turns out) perspective, rather than being shared with the intrusive and too-snarky "objective" narrator.

Still, the book finishes strong, and is quite moving, particularly in the reunion of the two brothers and Esther's corrected matrimony to the philanthropic Dr. Allan Woodcourt. In many respects, Bleak House is every bit as panoramic as Vanity Fair, albeit with a forced taciturn quality in the former that pulls in the negative direction as much as the latter pulls in a faux-comic upbeat direction.

Bleak House is not a novel to begin when I did (in 1985, at the age of 23), but it is a novel to read when you're almost 57. It is a middle-age novel, best read when one can appreciate the passing of time, look back with some objectivity on foolish life choices, and value the wisdom of experience.
_______
Vanity Fair and Bleak House, Part I.

Tuesday, July 3, 2018

Anatomical Archeology: Sketchbook Studies circa 1998

Here is a selection of anatomical studies I made in my sketchbook around 1998. I had been studying anatomy, principally from the books of Burne Hogarth, for some twenty years at that point (since 1978), and would periodically brush up on anatomy every eighteen months or couple of years. The studies here are unusually elaborate, and I recall at this point trying to "get it" once and for all (an unachievable goal!).
 


They are based on, in no particular order, Louise Gordon, How to Draw the Human Figure: An Anatomical Approach (Viking, 1979); Stephen Rogers Peck, Atlas of Human Anatomy for the Artist (Oxford, 1951); David K. Rubins, The Human Figure: An Anatomy for Artists (Viking, 1953); and Jack Hamm, Drawing the Head and Figure (Putnam, 1963). There may also be a stray bit of Andrew Loomis, culled from a Walter T. Foster edition, but I don't think so.



The fact that I was no longer interested in Burne Hogarth's Dynamic Anatomy or Dynamic Figure Drawing (Watson Guptill, 1958 and 1970, respectively) suggests I was looking for more realistic instruction than the kind that fueled the superhero studies of my teenage years (and which I tried to rid myself of in the early, exaggerated Megaton Man comics). But the oldies are the goodies, and nearly all of the titles here are old chestnuts in the genre.



As I used to tell my workshop and drawing class students, while software programs become outdated every three weeks, human anatomy hasn't changed significantly in tens of thousands of years; there is no better investment of time and effort than studying the subject.



Many newfangled anatomy books, especially the execrable Human Anatomy Made Amazingly Easy, by that cynical charlatan Christopher Hart, are worse than no instruction at all. Stick to the classics.

Thursday, October 26, 2017

Whither Drawing?

Here's what the 2016-2017 Handbook of the National Association of Schools of Art and Design has to say about a Bachelor of Fine Arts Degree in drawing (pp. 103-104):
a. Understanding of basic design principles, concepts, media, and formats. The ability to place organization of design elements and the effective use of drawing media at the service of producing a specific aesthetic intent and a conceptual position. The development of solutions to aesthetic and design problems should continue throughout the degree program.
b. Understanding of the possibilities and limitations of the drawing medium.
c. Knowledge and skills in the use of basic tools and techniques sufficient to work from concept to finished product. This includes mastery of the traditional technical and conceptual approaches to drawing.
d. Functional knowledge of the history of drawing.
e. Extensive exploration of the many possibilities for innovative imagery and the manipulation of techniques available to the draftsman.
f. The completion of a final project related to the exhibition of original work.
Note that there is no mention of human anatomy, figure drawing, or manual perspective drawing (although computer-aided perspective is an advised competency).

From "Teaching Cartooning" in Streetwise (Two Morrows, 2000).

Here's what the handbook says about computers in general (p. 101):
Digital Media. The Bachelor of Fine Arts is appropriate as the undergraduate degree in which digital technology serves as the primary tool, medium, or environment for visual work. Titles of majors for these degrees include, but are not limited to: digital media, media arts, media design, multimedia, computer arts, digital arts, digital design, interactive design, Web design, and computer animation.
No mention of mastery of traditional fundamental drawing principles, and digital technology is the "primary tool."

This is why I am a self-taught figurative artist, and why I advise students to make the most of their college tuition pursuing a well-rounded "book-learning" liberal arts curriculum (English, languages, history, philosophy, sociology, etc.), and skip the BFA.

Art school in the broadest sense only makes sense for a profession that requires actual accreditation, such as architecture or interior design.

See also: The Withering Away of Drawing

Wednesday, August 26, 2015

Chain Culture: The Loss of Borders and the End of a World


When the Borders brothers sold their budding bookstore chain, the company was well known for its impeccable customer service, top-notch inventory system and large-format approach that uprooted the way the books were sold.

But the Borders shopping experience eroded over the years as the chain grew in size, management became unwieldy, the Internet encroached on sales and electronic books emerged as an alternative for avid book readers.[1]
A number of reasons have been given as to why Borders, a used bookstore founded in Ann Arbor in 1971 that became a retail chain in 1992, ended in bankruptcy in 2011. Among the most prevalent are: the rise of the ebook, competition with Amazon, overexpansion of retail locations, overinvestment in music sales, and various mismanagement decisions. Slate.com quipped, “It died by a thousand—OK, maybe just four or five—self-inflicted paper cuts.”[2]

But Nathan Bomey is right when he places the erosion of the Borders shopping experience at the head of the list.

A shopping experience may be a more difficult thing to quantify than the ubiquitous assertion of mismanagement, but it is very real. In the case of Borders, the erosion of the shopping experience was deadly.

I grew up in suburban Detroit in the 1970s, about 40 minutes from Ann Arbor. Two youth counselors at my church had been students at the University of Michigan, were well acquainted with the first Borders store on State Street, and took us there on an expedition. This was not its very first location, but it was already a fully mature destination of wonder. Large, with brick walls and multiple levels, it seemed to have every coffee table art book under the sun, scholarly titles, mystical new age books, books on world cinema, and cultural journals. I never had any money in those days, but in the early 80s, when my friends and I haunted the art film houses ensconced all over campus, Borders was a place to explore before or between screenings. (Undoubtedly, the mystique of Borders influenced the naming of the 1980s science fiction comic book saga Border Worlds.)

When store #9 appeared in the South Hills of Pittsburgh in the early 90s, I did have money, and I spent a lot of it there. I can’t remember if I saw the store logo driving past, or heard about it from a friend, but as soon as I learned that a Borders store had opened, I realized that the world had become a better place. It was not as great as the Ann Arbor location, but it was still a destination and a treasure house. I spent many a rainy Saturday night there, sipping coffee and coming home with Neil Forsyth’s The Old Enemy: Satan and the Combat Myth, or Joseph Campbell, or many a coffee table book that I still have in my library.

When store #174 opened in the North Hills, it was not as great as store #9, but it was still good. From 2000 to 2005, I worked there part time on and off. It was there that I was inspired to go back to school, finally earning my PhD in art history in 2013. This was during the heyday of Harry Potter and Chicken Soup, and one of my own freelance illustration jobs, for Al Franken's Lies and the Lying Liars Who Tell Them: A Fair and Balanced Look at the Right, appeared during that time. It was only slightly absurd that the book for which I had drawn The Adventures of Supply Side Jesus was one of the innumerable items I rang up as a cashier, or helped people to locate as a bookseller. (No, I never mentioned, by the way, that I was the cartoonist!)

But I was not that unusual in having an example of my work on sale at Borders. A number of the staff were highly creative, particularly in music but also in theater. The manager recorded a smooth country album produced by another employee that played on the store sound system for several weeks, and other employees often had publications and creative offerings of one sort or another featured in the store.

Life without Culture: Undoubtedly, the mystique of Borders influenced the naming of 1980s science fiction comic book saga Border Worlds. An unpublished panel.
But during my time at Borders, the shopping or customer experience did erode noticeably, along with the employee experience, at the end of my time there quite precipitously. At the beginning, each store had its own CRC or Community Relations Coordinator, a person responsible for scheduling events such as folk singers in the café, local author signings, or weekly or monthly meetings of the poetry group; it had a rack of free brochures and local independent newsweeklies; a plethora of scholarly titles; and still a wide selection of off-beat magazines. Most importantly, it had knowledgeable employees who cared about culture in its manifold forms.

But quickly the CRCs were replaced by regional staffers overseeing multiple stores, and finally by event planners in the corporate headquarters. The quirky folk singers were routed out, and events were stripped down to a few big-label music releases. Author signings followed suit, with local authors eliminated in favor of fewer, bigger national names. Groups that were once given coupons for free cups of coffee and announced over the store sound system were quietly eliminated. The sofas and chairs strewn about the store for customers were removed, as were (maliciously) the stools for employees manning the service desk. The brochure rack disappeared.

None of these clunky, handmade aspects of Borders were profit centers in and of themselves, and many of them were inefficient and bothersome to employees. I personally found the local iteration of the Socrates Café, a meeting of overly loud bullshitters named after the book, extremely fatuous. But they all contributed to the atmosphere of Borders as a unique, even sometimes bizarre experience, and their loss contributed to the erosion of the shopping experience and, guess what, the bottom line.

A word about those knowledgeable employees: a typical Borders bookseller was college educated, perhaps changed majors too many times to complete a degree, maybe had even dropped out of grad school, or was by temperament or otherwise unsuited either to academia or the corporate business world. For these sensitive souls, work at a chain bookstore at slightly above minimum wage might not have amounted to a career, but it allowed them to utilize their minds and earn an employee discount, and to be among some of the rich cultural resources that they loved.

Such a labor pool certainly existed in Ann Arbor in the 1970s, and nearly every major city and college town into which the Borders chain initially expanded had a ready supply of such employees. In more than one way, the growth of the chain eventually outstripped this labor pool, and by the early 2000s (myself notwithstanding), such knowledgeable, geeky, cultured, and book-loving employees were in increasingly short supply. (College, it seemed, had become too expensive for humanities majors, or at least too expensive for them to drop out before completing their degrees and getting a real job to pay back their student loans.) New employees could have been working in any kind of retail or fast food business, and manifestly could not have cared less about books or culture. Indeed, many of the older, knowledgeable employees of the type that built the Borders brand were consciously being routed out by management as the 2000s wore on, along with the free weekly newspapers, the quirky folk singers, and the pompous poetry groups.

While ringing up a Schaum's Algebra workbook in 2002, I had a serendipitous moment (serendipity being one of my church youth counselors' favorite words), and realized I should go back to college. I started part-time in January 2003 at the Community College of Allegheny County, and was full-time by the fall. I earned 60 gen ed transfer credits and started at Pitt in 2005. During this time I phased out my part-time employment at Borders, which finally concluded with the end of the 2005 Christmas season (a notoriously bullying manager who had been transferred to our store was summarily fired after the holidays). By this time, the chain had already cultivated a corporate feel virtually indistinguishable from Barnes & Noble.

It is important to note that even as store stock contracted and the notorious Categories scheme was implemented (turning the de facto control of entire genres over to the highest-bidding publishers), it was still useful to work part-time at Borders even and especially as I returned to school full-time. Familiar with the ordering system, I could make SPOs (special purchase orders) of virtually any title in print and quite a few out of print (particularly those I needed for school), usually at the highest employee discount rate, and virtually risk-free, making it more convenient than Amazon. At some point, however, working at Borders became not worth it, and ordering through Amazon became the preferred mode of acquiring necessary books during grad school.

I still occasionally shopped there, but my own shopping experience was noticeably less enjoyable than in the past. Selection was curtailed, bland bestsellers dominated, games and gifts replaced scholarly titles, and it became easier to order books for school online. It was no longer a destination or a treasure house, but a cold, unfeeling, alienating experience.

The shopping experience had eroded over the years. Was nobody watching?

I still miss Borders every rainy Saturday night, like one sometimes yearns for a bygone lover.

Friday, October 3, 2014

Fun With Texture: Demo from a Cartooning Workshop

This sheet was drawn on Strathmore medium drawing 400 series 9" x 12" creme paper as a demonstration for a cartooning sketchbook workshop at the Carnegie Museum of Art in 2008. I enjoyed those workshops immensely. They were usually held in summer, although in recent years I became too busy with graduate school to be able to offer them. For years the museum refused to offer cartooning instruction, insisting by policy that educational offerings coincide with works on view in the museum galleries. Finally, in 2004, with the R. Crumb retrospective as part of the Carnegie International that year, I was invited to give instruction.

 
Since then the museum has canceled adult education workshops in drawing, painting, ceramics and other traditional media in favor of lectures relating to contemporary works of art. It is nothing short of tragic to see the museum art world forsake interactive drawing, the basis of all the visual arts (including architecture, cinematic storytelling/storyboarding, theatrical set and costume design, etc.) for passive dispensation of theory. The proper response to art is artmaking, not idle attendance at a lecture.

Two CMAs and the Second Commandment: A Digression

The current artworld, centered in public museums housed in large, monumental neoclassical buildings, has run the risk of succumbing to an ideology centered on its own self-importance as elite palaces of culture rather than democratic institutions of municipal and civic engagement. Cleveland's museum early in its history built a palace but emphasized education for all classes of Clevelanders, and despite the impulse to move to the right, has managed to balance the two successfully; Pittsburgh, unfortunately, has not. Under its current leadership, Pittsburgh's CMA (as opposed to Cleveland's CMA) has embraced the ideology of contemporaneity, in which various pseudo-Dada practices form the basis of high-flown intellectual discourse. But such pseudo-political conversations as can result from the contemplation of found objects, installations, performance, and the like, while often interesting and verbally challenging, are rarely as rich as the contemplation of visual works that are works of the mind, as manually generated images, by the very means of their origins, almost inherently are.

The mistake that over-educated, verbally adept critics, curators, theorists, and art historians continually make is to disregard visual composition such as only the hand produces as thoughtless, or at least not as thinking on a level comparable with words. Old-fashioned craft, according to this ideology, is reserved only for the wordsmith and never the image maker, who is invariably regarded as a capitalist sell-out for rendering illusions corresponding to apparent reality, or at the very least as mechanical and uncritical as a camera. Likewise, such honorifics as thinker and genius are reserved for the writer of texts, and even the title of artist, when bestowed upon the maker of conversation pieces, is not bestowed without the most arch and patronizing irony. The bias for text over image runs very deep in our culture, going back at least to the Judeo-Christian second commandment, which Max Horkheimer claimed as the basis and justification for contemporary critical theory.*

In any case, one hopes that the ascendance of logos and the iconoclastic impulse that has subtended much enthusiasm for modern and contemporary art over the past century or more will prove to be only a temporary aberration in our culture, and for a return of drawing to the educational environment of the city of Pittsburgh, and to the artworld nationally and internationally, in the very near future.

*See Max Horkheimer, letter to Otto O. Herz, September 1, 1969, in Gesammelte Schriften, volume 18, Briefwechsel 1949-1973 (Frankfurt am Main: Fischer, 1996), p. 743; cited in Sven Lüttken, "Monotheism à la Mode," in Alexander Dumbadze and Suzanne Hudson, Contemporary Art: 1989 to the Present (Wiley-Blackwell, 2013), pp. 304, 310, note 11. Lüttken attempts to make the rather unconvincing argument that a total ban on representational art is a valid form of criticism of the image and the proper role of critical inquiry, which suggests something about the temperament of critical theorists.

For more on drawing, see The Withering Away of Drawing. For more on the Dumbadze anthology, see After Critical Thinking.

Monday, July 7, 2014

The Fig Leaf of Cognitive Training: Navigating Our Mediated World by Conforming to Contemporaneity

In response to "The Crisis in Art History," which I cited in my previous post, Amy K. Hamlin and Karen J. Leader, in "Art History That! A Manifesto for the Future of a Discipline," characterize art historians as having "Highly developed visual discernment, a deep knowledge of history, [and] a nuanced understanding of cultural heritage." They assert, "we need art historians because they are equipped to teach the skills urgently required by twenty-first century citizens to navigate the complexities of a visually driven information age." The last point, that the study of art history is vital to navigating our mediated world, is related to the rationale I heard in my old department: that art history, unique among college offerings, provides an irreplaceable training in visual analysis. Any criticism of the curriculum, pedagogy, or methodologies currently trending in art history is, for Hamlin and Leader, an attack on the humanistic development of critical cognitive faculties; the real problem, as they see it, is rather the exorbitant expense of acquiring a college education.[1]

This is a disingenuous argument for two reasons. First, while art historians are equipped to teach some of the skills required for citizenship in the twenty-first century, they are neither uniquely equipped nor even the first choice for doing so. If we seriously want to prepare individuals to navigate modern visual media, if not inoculate them against the more subtle forms of visual manipulation deployed by advertising, political campaigns, and visceral entertainment, what is called for would be a practical and theoretical course in film editing and theory, and it would be mandatory in every undergraduate curriculum.

For starters, the Odessa Steps sequence from The Battleship Potemkin would be analyzed frame by frame (preferably on an old Moviola), and students would have the opportunity to edit their own footage (and tell their own truths or fabrications) in Adobe Premiere. The theoretical writings of Vsevolod Pudovkin and Sergei Eisenstein would be read and debated (preferably vehemently, in a café), and Indiana Jones and the Temple of Doom, one of the most brilliantly edited movies I can think of, would be dissected for its fluid technical mastery and somewhat crude cultural and ideological assumptions. Static images would be studied as well, particularly in their juxtaposition as storyboards or comic strips. (I am, of course, describing my own early training in the study of comic book storytelling, except that I viewed the Odessa Steps on an 8mm film viewer, and spliced a few shots together on 16mm with pieces of adhesive tape, the old-fashioned way.) Still life and figure drawing would be optional but strongly encouraged, as would the basics of photography (composition, lighting).

To this course of study, art history could perhaps be offered as an ancillary curriculum for those wishing to explore the ways in which manual images were made prior to the advent and inexorable conquest of photography and cinematography over the past 175 years, as well as studio courses for those wishing to master manual image making (figurative drawing, painting, sculpture) for themselves (filmmakers such as Eisenstein and Fellini, and art theorists such as Meyer Schapiro were all skillful and gifted artists in their own right, practices that informed their work).

Donald Simpson (American, b. 1961), Light Up! (With Apologies to Tony Smith), 1979. Photo-mechanical transfer on photo paper from found clip art, 8 1/2" x 11". Collection of the artist. © Donald Simpson, all rights reserved.


As for another of Hamlin and Leader's points, that art historians have "Highly developed visual discernment, a deep knowledge of history, [and] a nuanced understanding of cultural heritage," Patricia Mainardi and Pepe Karmel, in "The Crisis in Art History," have already disputed it. With the surge of contemporary art study threatening to overtake that of "historical art" (i.e., precontemporary art), the authors see an increasing neglect of historical study and a cheapening of the art history curriculum. Mainardi laments that "the vast amounts of wealth now moving through the world of contemporary art, in museums and auction houses, galleries, and international art fairs" are seducing art history students away from "the libraries and archives of previous generations." She notes, "Wherever contemporary art studies have become dominant, the same results are apparent." Students no longer study "the art of different periods and cultures," but instead focus on the art of the twenty-first century, and almost exclusively on texts written in English.[2] Karmel notes that the average time to complete a dissertation in art history overall is 4.2 years; for premodern topics, the average is 5.5 to 6.3 years; for modern art, 3.9 years; but for contemporary art, only 2.6 years. Karmel remarks,
You interview the artist a few times, you persuade the artist’s gallery to let you see their files and their photo archive (the real-world equivalent of a catalogue raisonné), you read the published criticism, you follow up on the artist’s remarks about texts and ideas that influenced him or her. Then you sit down and write. The resulting text may be very good. It may become a terrific book or exhibition catalog. But it simply is not the same thing as a PhD dissertation in other fields of art history. And the degree it earns should not be a PhD.[3]
What Hamlin and Leader’s (and my old department’s) defense of art history ignores is the overwhelming expenditure of energy, not on training students to navigate our mediated world or even to visually analyze right-wing print propaganda, but on genuflecting before all-powerful art world institutions (including the academic discipline of art history itself). Why would one-of-a-kind treasured works roped off in a museum, or a Jeff Koons guarded by bouncers at London Frieze,[4] best serve as examples of visual phenomena for such study anyway? Art history involves all kinds of fascinating side trips into aesthetic theory, the chemical analysis of pigments, and internecine doctrinal fights, but very little of this is of any practical use to the college student trying to make critical sense of the media barrage emanating from her smart phone. As I said, justifying art history on such grounds is tantamount to advocating the study of the history of world religions as the surest remedy for high blood pressure since we all need a quiet, meditative break from the frenzy of our lives now and then. It is absurd.

Far from a useful training in the navigation of our highly mediated world, art history is currently little more than an indoctrination into the current world of art. It is crucial to make this explicit as the wealth of that art world increasingly seduces and obtains a stranglehold on academic programs, pulling them away from the study of what Mainardi calls "historical art" toward contemporary product, of which we are urged to "think historically," as Terry Smith puts it in his ubiquitous writings on contemporary art.[5] Of course, it is not impossible to consider the present from an historical perspective. Indeed, an historical sensibility is desirable; hence the study of history. However, rendering pseudo-art-historical judgments on what is valuable in our present visual culture, judgments that are immediately ratified and reified by institutions with the power, authority, and economic clout to make such choices forever fixed and unalterable by later generations (by the inclusion of certain works in public exhibitions if not permanent collections and in textbooks), is not a historical process at all. Such complicity in contemporaneity is not a critical function but corruption itself. It is scholarship shilling for the current art world.

From this view, the promise of cognitive training can only serve as a cynical fig leaf to what is really going on in art history programs today: the spread of conformity and complicity in the pseudo-cultural machinations of capital.

Notes
[1] Amy K. Hamlin and Karen J. Leader, “Art History That! A Manifesto for the Future of a Discipline,” Visual Resources: An International Journal of Documentation, vol. 30, no. 2, pp. 138-144; quote p. 139.
[2] Patricia Mainardi, "Art History: Research that 'Matters'?" (pp. 305-307), in Patricia Mainardi, "The Crisis in Art History," Visual Resources: An International Journal of Documentation, vol. 27, no. 4 (December 2011), pp. 303-343; quote p. 306.
[3] Pepe Karmel, “Just What Is It That Makes Contemporary Art So Different, So Appealing?” (pp. 318-327) in Patricia Mainardi, “The Crisis in Art History,” Visual Resources: An International Journal of Documentation, vol. 27, no. 4 (December 2011), pp. 303-343; quote p. 326.
[4] See A.A. Gill, “Frieze Until the Numbness Sets In,” Vanity Fair, January 2014, pp. 44-45; p. 45.
[5] Terry Smith, “Contemporaneity in the History of Art: A Clark Workshop 2009, Summaries of Papers and Notes on Discussions,” Contemporaneity: Historical Presence in Visual Culture [http://contemporaneity.pitt.edu], vol 1 (2011), p. 13.

Saturday, July 5, 2014

Dead End: Why Art History is No Longer (and Perhaps Never Was) an Academic Discipline

When I decided to stop my adult life and return to college, I did so for many reasons, including the desire to widen my career options and a more general desire to satisfy my intellectual curiosity. I chose the history of art and architecture as my undergraduate major and subsequent area of graduate research specifically to answer a number of immediate questions I harbored as a life-long artist. On social media not too long ago, I remarked that I deeply regretted that decision from a professional as well as intellectual standpoint, for, while I had diligently finished what I started and despite having satisfied my intellectual curiosity in regard to particular questions pertaining to art early on, I found the discipline dogmatic, constrained, and for all practical purposes exhausted (art history more so than architectural history, although for all intents and purposes my dissertation was in urban planning history), and wished that I had taken a more general course of study such as history, urban policy and planning studies, languages and literatures, or philosophy. The grass may not be any greener in those disciplines, but after eleven years of mid-life college, I felt entitled to indulge briefly in a bit of buyer's remorse (although one of my advisors regarded my remarks as a personal and permanent betrayal); in any case the lawns certainly seem wider. I am of the growing conviction that art history as an academic discipline is a completely exhausted field of study that for all intents and purposes could be hermetically sealed, requiring only a few caretakers to tend to the classified, archived, and salted away extant body of knowledge.

In one interdisciplinary conference I attended just prior to receiving my doctorate, it was claimed that what made art history an indispensable academic discipline was its unique emphasis on "visual analysis": learning to describe objects verbally and elicit increasingly probing questions about their facture and purpose, characterized as a vital and necessary skill in our increasingly mediated world. This I thought rather weak tea, a feeble rationale accounting for only a minuscule portion of the skills demanded by the field, and hardly a convincing justification for the immense energies expended on grasping theories, styles, and archives of key works (what used to be called canons). Besides, the same skills can be acquired in English 102 by describing a dried leaf, to say nothing of film or media studies. It is analogous to claiming that the study of the history of world religions is justified because, after all, we can all use a quiet moment of prayerful meditation now and then in this stressful, frenzied world. In other words, on that score, there is nothing about the study of art history that is not shared by many other academic disciplines.

Art history, as I have found it to be practiced, is a narrow and constricted discipline, an academic ghetto. Art history is to history, to paraphrase Mark Twain, what a lightning bug is to lightning. Although there are many histories and manifold interpretations thereof, there is only one art history, dogmatically dispensed and disciplinarily policed. The general narrative proceeds teleologically from the Venus of Willendorf and the caves of Lascaux to the Acropolis and the Sistine ceiling, ending with elephant dung paintings and a shark in a tank of formaldehyde. It is a story that, to say the least, does not have a happy ending.

Donald Simpson (American, b. 1961), Still Life with Bust of Venus de Milo from Pier One Imports, 2007. Charcoal on paper, 18" x 22". Collection of the artist.

Miss Helen Clay Frick, the American art historian and founder of the department in which I earned my BA, MA, and PhD in history of art and architecture, believed art history to have come to an end by the Civil War, or at least that there had been little art produced since that time worth serious study. The mid-nineteenth-century date for the end of art history is not far from a general consensus among figures as diverse as Hegel and Hans Sedlmayr, who, for different reasons, regarded art as having disappeared some time since the middle ages. But one need not be an avowed anti-modernist like Miss Frick; the consensus even of modernists is that a certain tradition of art came to an end in the nineteenth century with the dawn of modernism. This admittedly loose periodization comports with a general view that in the nineteenth century, with the advent of the public art museum and the formation of the academic discipline of art history, art history itself, paradoxically, had come to an end.

Art, as it were, having become historically conscious of itself by collecting and studying its own past, was at the same time inherently incapable of collecting and judging the art of the present, or of adding any of it to art history, at least with the same authority as the unfolding of history itself. To claim certain works of living artists as of historical importance without the passage of time as proof of enduring value, it was clear, would have been to pick winners and losers with the imprimatur of history, a de facto illegitimate procedure. Museums and art history therefore created "museums of living artists," quarantined holding tanks for new art, so that this new work could be viewed and appreciated by the public but also so that the final verdict of art history (inclusion in the encyclopedic museum) could be postponed at least until an artist's death. Until at least that much time had passed, the jury was considered still out.[1]

This self-imposed restriction on including living artists in the newly-forming encyclopedic art museum was short-lived, lasting little more than a generation or two; but it was sufficient time to create a permanent rupture in art history. Paradoxically, anti-modernist collectors such as Henry Clay Frick, his daughter Helen Clay Frick, Samuel P. Huntington, and J. Pierpont Morgan, whose private collections became the founding permanent collections in a network of museums in the United States, essentially starved out a generation of traditional figurative painters by denying them commissions while sinking millions into Old Masters, antiques, and rare manuscripts. This perverse neglect, along with the inexorable conquest of photography, removed any incentive for artists to master such representational skills as perspective and anatomy. John White Alexander, a painter once as famous as Sargent, spent the final 15 years of his life on a mural that, while on public view to this day, languishes in art historical obscurity: The Apotheosis of Pittsburgh, for the Carnegie Institute in Pittsburgh. For this monumental masterpiece, Alexander received $175,000, a large sum for a living artist, to be sure, but a mere pittance approximating the amount a Frick or a Morgan habitually spent on an Houdon bust or a Gobelin-manufactured piece of Louis XIV furniture, which they in fact bought by the carload, sometimes from each other.[2] Artists with representational skills and inclinations naturally migrated to where the money was: illustration and other forms of commercial art; galleries that served collectors and the museums they fed were surrendered to the avant-garde. With friends of traditional art like Frick and Morgan, who needed enemies? Anti-modernism produced its opposite, modernism, almost without any help. In any case, academic art history faced a choice: follow classically-trained figurative artists into new media (print) with its alien editorial-advertising model of patronage, or remain with the easel painters and the familiar system of galleries and collectors (and ultimately museum patrons). And the changing stylistic tastes? Progressivism.

By the mid-twentieth century, museums had gotten over their reluctance and begun adding works of art from the late nineteenth century to the present into their collections and narratives, only somewhat belatedly incorporating modernism into art history. This work was all of a certain ideological character, namely socially progressive in terms of content or avant-garde in terms of form, and its inclusion was strictly on the basis of adding something to the constructed narrative of art history that had never been seen before, like adding newly discovered atoms to the table of elements. What is significant about this move, as Boris Groys has pointed out, is not so much what was added to the art historical narrative but what was excluded: works that were deemed visually repetitive in that they carried on traditional representational practices and/or were created for a popular audience or served a commercial purpose. Groys calls this the "museum taboo": after the mid-nineteenth-century formation of the museum, new additions to art history could not look like anything that had come before. A new work had to be something apparently different, novel, that in some way expanded the meaning of, or our understanding of, art. Art history is thus reduced to a chronology of scientific breakthroughs, or as Gabriel Tarde put it, "history above all is a record of inventions."[3] But this is to make of art history nothing more than an inventory of neologisms compiled in a dictionary quite for their own sake, regardless of whether or not anyone ever puts these new expressions or idioms to practical use. Indeed, as Groys formulates it (although he does not necessarily endorse this view), the museum taboo virtually prohibits artists from adopting the language or style of another artist.[4] Once something has been invented, like penicillin, the invention cannot be repeated. But while penicillin is still manufactured and in use, who can possibly drip after Pollock?

If such a taboo of apparent repetition, of art not being allowed to “look” like previous art, were extended back in time, it would eliminate much of the art prior to modernism. The Renaissance, the Baroque, and neoclassicism, to say nothing of the classicizing tendency of later Bauhaus architecture, are all reiterations or reinterpretations of classical antiquity that on some level “look” like ancient art. To eliminate works such as Soufflot’s Ste. Geneviève, a work that consciously tried to “look” Greco-Roman, from the canon of works deemed worthy of art historical study would be to ignore how Soufflot sought to outdo the Greeks and Romans in terms of structural engineering and scale. In other words, there is always more going on in art than meets the eye, and the exclusion of decades of representational work from art history on the grounds that it “looks” like the art of the past is more than an irrational taboo; it is intellectual laziness.

But this is only one instance of the double standard that pertains to art created since the advent of art history. Another example would be the grounds upon which commercial illustration has traditionally been excluded from the museum: because it was not created with the gallery wall in mind, but rather for reproduction on a printing press, and not for the delectation of an elite audience, but for a broad public.[5] This denies the fact that Norman Rockwell, who trained as a painter alongside legitimate "gallery" artists of his generation and was an assiduous museum-goer, certainly was acutely aware of the gallery wall at his easel, whether he was painting a Saturday Evening Post cover or a coffee advertisement, and nursed a barely concealed ambition that at least some portion of his work would one day grace the gallery wall, while a great many artists of the past, such as icon painters, never had the least intention that their sacred works would ever be exhibited as purely aesthetic objects in a profanely secular space. Indeed, most modern and contemporary work that has been included in art history has the distinction of having been explicitly intended for the gallery wall. If works not so intended were to be expelled from art history, major museums around the world would have to deaccession much of their holdings and would sit emptied and bereft of sizeable portions of their permanent collections.

There is no greater divide in art history than that marked by the rise of art history itself. Premodern art, the only kind thought valuable by Miss Frick, is held to an altogether different set of standards than art since the late nineteenth century, the kind of work that is implicitly subject to Groys’ museum taboo. From this view, premodern or what might be termed precritical art forms a sort of primordial unconscious to the more acutely self-conscious modern and contemporary period. Modern and contemporary art is nothing if not conscious (and critical) of itself and previous art history, positioning itself against the past or freely (and usually without a trace of cleverness) appropriating it. Anti-modernists like Frick saw art history coming to an end with the advent of modernism, while modern and contemporary theorists see art history beginning with the same moment of rupture.

For Arthur C. Danto, modernism and contemporaneity are the two eras surrounding "the end of art," distinguished by their attitude toward the premodern art of the past. For Danto, modernism is characterized by "a repudiation of the art of the past," while "Contemporary art, by contrast, has no brief against the art of the past, no sense that the past is something from which liberation must be won, no sense even that it is at all different as art from modern art generally. It is part of what defines contemporary art that the art of the past is available for such use as artists care to give it."[6] To use a theological metaphor, modernism is the Old Testament and contemporaneity is the New Testament in a new dispensation that has transformed the current art history Master Narrative. Within this dogma slight denominational and doctrinal differences can occur, but no great deviations of dissent or heresy.

To return to the example of commercial illustration, art that persists in "looking" like the art of the past, e.g., representational art, is relegated to Visual Culture Studies, and art historians who choose a topic like the posters of Alphonse Mucha are not so much permitted to pursue such research as discouraged from pursuing it, in that they are encumbered by the additional superfluous methodologies pertaining to visual culture. That art historians cannot simply consider such material as a part of art history, with methodologies acquired by the study of premodern art, demonstrates how thoroughly the boundaries of the discipline are ideologically policed.[7]

The problem is not what is included in art history (the elephant dung paintings, the shark in the tank of formaldehyde) so much as what is excluded: mountains of creative visual material that do not suit a preordained set of ideological assumptions and scholarly methodologies. As Raymond Williams writes,
There is great danger in the assumption that art serves only on the frontiers of knowledge. It serves on those frontiers, particularly in disturbed and rapidly changing societies. Yet it serves, also, at the very center of societies. It is often through the art that the society expresses its sense of being a society. The artist, in this case, is not the lonely explorer, but the voice of his community. Even in our own complex society, certain artists seem near the center of common experience while others seem out on the frontiers, and it would be wrong to assume that this difference is the difference between ‘mediocre art’ and ‘great art.’
For Williams, the notion that “‘creative’ equals ‘new’ […] is a really disabling idea, in that it forces the exclusion of a large amount of art which it is clearly our business to understand.”[8]

In such a case, as someone once said (I think it was Mark Kingwell), art history becomes little more than a chronological listing of works whose sole interest lies in the fact that at one time they were considered authentically modern. One has to explain to the undergraduate that Duchamp’s urinal or Joseph Kosuth’s One and Three Chairs was once cutting edge, and that Andrew Loomis was not an “artist” because his work was published in magazines, all of which is dutifully noted without question (other than, “Is this going to be on the test?”).

A substantial survey in the journal Visual Resources entitled “The Crisis in Art History” paints an even more dire picture. Patricia Mainardi points out “the skewing of academic art history and museum exhibitions toward contemporary and away from historical art,” and that “The tail of contemporary art is now wagging the dog of art history,” resulting in a neglect of archival research in favor of superficial gallery-hopping. Pepe Karmel reports that “Quietly but rapidly, there has been a broad loss of interest in older art—meaning art made before 1980.” Both Mainardi and Karmel note the huge amounts of money and students gravitating toward the study of contemporary art, to the neglect of “historical art” and its methodologies, resulting in the loss of a sense of history as well as of tenured positions in pre-contemporary art, and in a general dilution and cheapening, if not dumbing down, of the discipline. Still, Karmel sees little choice but for the discipline to increasingly serve this growing market: “It seems likely that, in years to come, there will be more and more money available for the study of contemporary art, and less and less for the study of everything else,” but if art history does not service this market, “another department will.” Karmel nevertheless argues forcefully that the study of contemporary art should be relegated to a certificate program in which “There would be a capstone project requiring research and writing on a particular artist or movement. However, this would not be a doctoral dissertation, and the resulting degree would be a certificate in contemporary art, not a PhD. Such a degree would not qualify graduates to teach at a university level.”[9]
But even if the field of art history were to eschew contemporary art for a return to hardcore “historical art,” this raises the question of how it can be justified as an autonomous academic discipline.

In his early (1939) essay “The History of Art as a Humanistic Discipline,” Erwin Panofsky describes the work of the art historian as a fusion of rational archeology and sensitive connoisseurship. Or as he puts it, “the art historian subjects his ‘material’ to a rational archaeological analysis […] but he constitutes his ‘material’ by means of an intuitive aesthetic re-creation,” rather like “a loquacious connoisseur.” In other words, the art historian “constitutes his object” of art historical study by first recognizing in it its “demand to be experienced aesthetically.”[10] (Panofsky recognized that every art historian may be limited in aesthetic sensitivity by experience and “cultural equipment,” but that this can be broadened by erudition.) As an ideal, this is perfectly plausible. Although no example is given by way of illustration, one imagines an art historian going out into the world, digging up a find from a tomb or rummaging through an attic, and recognizing a work that both arrests her aesthetic sensibility and demands further documentary investigation.

However, even in 1939, when Panofsky was writing, it was becoming increasingly unlikely that many hitherto undifferentiated objects awaited in the natural world for art historians, uniquely qualified by virtue of their sensitivity and training, to come along and declare them works of art. This is even more the case in the twenty-first century, especially for the undergraduate student in a typical survey course, in which any objects or works that come under discussion have already been declared works of art by virtue of having hung in museums for decades or having been included in the latest contemporary art texts. The material of art history already comes pre-constituted, as it were, under what Jonathan Culler calls “the hyper-protected cooperative principle.” In simple communication, we assume that our interlocutors are trying to cooperate with us, that is, make sense, even if we don’t immediately understand their vocabulary or idiomatic phrasing. In the case of literary texts, especially obscure or difficult ones, the assumption is that they are worth study, if only by virtue of being on a course reading list.[11] When a work of art comes to us already in an art history textbook, a museum, or another consecrated artworld space such as an art fair, biennial, or remote location (e.g., Robert Smithson’s Spiral Jetty or James Turrell’s Roden Crater), Panofsky’s aesthetic re-creation never comes into play; indeed, aesthetic sensitivity is not required at all. Art history becomes merely a form of archeology, whether archival (in the case of historic art) or gallery- and biennial-going (in the case of contemporary art).

This is particularly crucial in the area of contemporary art, where aesthetic criteria are not even a consideration, and the status of the work of art as such is beyond question. How is one to know whether an object differentiated from the natural world is even a work of art? The answer is that one cannot, and the student who studies this area has no recourse but to accept the dogma of their professors and their textbook materials. As Mark Kingwell writes, “Art is simply whatever the art world talks about.”[12] Under such dogma, the study of contemporary art operates not so much under a hyper-protected cooperative principle as under a hyper-protected conformity principle.

Like the American frontier, declared closed in 1890, art history is a closed book.

[More on "The Crisis in Art History" here.]

[1] For the firewalling of living artists from Old Masters in a kind of farm-club system of museums in Paris in the nineteenth century, see Terry Smith, What is Contemporary Art? (University of Chicago Press, 2009), pp. 39-40.
[2] On the mania for Old Masters and the formation of collections that served as the basis for several large public museums in the U.S., see Meryle Secrest, Duveen: A Life in Art (University of Chicago Press, 2004).
[3] Quoted in Jean-Philippe Antoine, “The History of the Contemporary is Now!” Contemporary Art: 1989 to the Present, ed. Alexander Dumbadze and Suzanne Hudson (Wiley-Blackwell, 2013), p. 32.
[4] On the “museum taboo,” see Boris Groys, “On the New,” RES: Anthropology and Aesthetics, no. 38 (Autumn 2000), pp. 5-17, reprinted with minor modifications in Boris Groys, Art Power (Cambridge MA: MIT Press, 2008), pp. 23-42.
[5] On the unsuccessful campaign to persuade the Metropolitan Museum of Art to collect and exhibit illustration art, see Michele H. Bogart, Artists, Advertising, and the Borders of Art (University of Chicago Press, 1995), pp. 43-47.
[6] Arthur C. Danto, After the End of Art: Contemporary Art and the Pale of History (Princeton University Press, 1997), p. 5.
[7] See a previous blog post.
[8] Raymond Williams, “The Creative Mind,” The Long Revolution (Columbia University Press, 1961), pp.
[9] See Patricia Mainardi, “Art History: Research that ‘Matters’?” (pp. 305-307) and Pepe Karmel, “Just What Is It That Makes Contemporary Art So Different, So Appealing?” (pp. 318-327), in Patricia Mainardi, “The Crisis in Art History,” Visual Resources, vol. 27, no. 4 (December 2011), pp. 303-343; quotes from pp. 305, 306, 320, 323, and 326.
[10] Erwin Panofsky, “The History of Art as a Humanistic Discipline,” in Meaning in the Visual Arts (Doubleday Anchor, 1955), pp. 1-25; quotes from pp. 14, 20, 16, and 12, respectively.
[11] See Jonathan Culler, Literary Theory: A Very Short Introduction (Oxford University Press, 1997), pp. 25-26.
[12] Mark Kingwell, “Art Will Eat Itself,” Harper’s (August 2003), pp. 80-85; quote p. 82.