iPhone cover, Oak Park Mall, Overland Park, KS
My parents left the South for good in the mid-1950s, having grown up, met, and married in Georgia, then having followed my father's education and work to Tennessee, where my older brother was born, then heading West and ultimately North for the rest of their lives.  We drove from Illinois down to Georgia almost every summer when I was a child to visit family (only much later as an adult did I realize that most Yankee snowbirds only head South in the winter).

I remember seeing, in the 1970s, little roadside tourist tchotchke shacks along the small two-lane highway, starting somewhere around Tennessee, if memory serves. We never stopped at these places, which looked like a strong wind would blow them down. But from the car you could see their wares proudly displayed, including the ubiquitous confederate flag beach towels that flapped on makeshift clotheslines. Those towels felt like the totem of a border crossing to me: they marked your entry into The South. My folks certainly had enough connection to the confederate past to feel either shame or closeted pride about it (my great-great-grandfather was a lieutenant in the confederacy who survived to go home and become mayor of the town where my father--along with baseball legend Ty Cobb--was born). But they never talked about that past with us, and they never, ever owned or displayed any images of the confederate flag. Those beach towels were meant for so-called "redneck," lower-class whites, I suspect. And my parents had grown up somewhat solidly middle class (despite the intervention of the Great Depression). But I don't mean to sell my parents short; while it's perhaps impossible to avoid imbibing raced thinking growing up in the US, my parents did a pretty good job of keeping us open-minded, and we all had diverse friends growing up. My father sometimes felt a post-civil rights era defensiveness about slams against the South (he used to say he found Boston more segregated than anything he had seen back home after we lived there in the early 1960s). But my parents--and especially my liberal mother--fostered pro-civil rights values in us.

It's been a while since I've seen the confederate flag used to adorn an object openly for sale, but in the above photo my husband holds an iPhone cover we saw in a mall kiosk in Kansas this week. It was the sole one, sandwiched between the rhinestone-encrusted ones, the polka-dotted ones, and the brightly colored ones. To many folks, that flag is a US swastika and should be banned. And I wouldn't argue against that approach. Clearly states should be forbidden from displaying flags that incorporate the confederate flag. But I take a more skeptical "free speech" approach when considering an average person displaying it. However wrongheaded and ahistorical it may be, I think the flag is sometimes wielded by white Southerners as a relatively amorphous sign of (somewhat defensive) "Pro-South" sentiment. That is, not as "yay, slavery and genocidal racism!" but more as a sign that has come to mean the South qua the South. Context is clearly important in reading the deployment of a sign. But maybe the sentiment at stake in displaying the confederate flag truly is a "Pro WHITE South" sentiment. Which is why I could never fully defend the flag as "free speech" rather than hate speech.

So how to read the odd conflation of this cover for an expensive, highly sophisticated phone, with global reach both in its manufacture and its renown, and a shamed, redneck-identified, burlap sack of a symbol? I was shocked to see it innocently nestled in an innocuous mall kiosk. In part, it made me worry about this place my husband has moved to, where I now spend half my time. The demographics skew heavily white, and the political climate is scary conservative: KS is one of those states that tried to keep President Obama off the reelection ballot in 2012, using the "Birther" smear that he wasn't born in America (which is pretty clearly code for "he ain't white"). While a private citizen filed the complaint, KS Secretary of State Kobach said: "I don't think it's a frivolous objection. I do think the factual record could be supplemented." That's enough context to make me shudder when I see a piece of plastic for sale with a confederate flag on it.

 
 
Scribner's Original Cover
One of the dubious pleasures of aging is that you get to watch the return of (purportedly unique) cultural trends and their attendant hype.  I saw the last hoopla-ed (and maligned) film version of "The Great Gatsby" in 1974, and today I saw the similarly hoopla-ed (and maligned) Baz Luhrmann version.

Aside from the fact that "Gatsby" has long been required reading for any American who undergoes institutionalized education, it was always my text. I had read it 5 times by the age of 22, and revisited it at least once again. There's some truth to Kathryn Schulz's provocative trashing of the book in New York Magazine. There is a kind of priggish moralism on display (though part of Fitzgerald's point is to show how Midwesterner Carraway is forced to reassess that type of moralism), and yes, a shallowness. But I read and loved the book as poetry. (To wit, Fitzgerald's description of Gatsby & Daisy's first kiss: "he wed his unutterable vision to her perishable breath.") When I saw the Elevator Repair Service theatrical version, "Gatz," in which they read the entire book over a 6-hour performance, I was surprised at how much of it I remembered by heart. And I was surprised that the gorgeous language of the novel could be made plausible again as it staged the characters' actions.

I wanted to be Fitzgerald when I was a teenager: a Midwesterner who headed East, writing and drinking his way to Europe, madness unavoidably at his heels. More than the flapper dress, I loved the white flannel suit and the Leyendecker Arrow collar ads, whose images I collected from old magazines. The first novel I began at 16 (and predictably abandoned) took place in the 1920s, with girls jumping into fountains, etc. Write what you know, they say; and for literature, the Lost Generation was what I knew.



Paramount Pictures, 1974
So when the 1974 version came out, I was ecstatic, especially since I had a mad crush on Robert Redford (who had replaced Humphrey Bogart and Clark Gable in my affections, before being subsequently replaced by Jack Nicholson). My stern father, normally a stickler for abiding by the movie rating codes, graciously agreed to take me to see it since it was rated R and I was underage. I remember being swoony over Redford in the pink suit, but I think the movie disappointed me ultimately, as all movies based on books do. It simply couldn't match the poetic suggestion of written language. (And I didn't get Mia Farrow, since I hadn't seen "Rosemary's Baby" yet.) I was only 13, but bookish enough to realize that I had given my father a Freudian scare when, at the sight of Redford's profile on the screen, I leaned over and said, "He looks like you." (His response: "Don't say that.") And like the abounding Brooks Brothers and Tiffany's tie-in ads for the current Gatsby, there were tie-in ads then, too: art deco-esque pictures of Redford and Farrow for Ballantine's Scotch, which I clipped and kept, and a cover story in Time magazine.

'20s images floated around in the '70s more generally. Cloche hats returned; you could get t-shirts printed with images of flapper girls and Betty Boop. And 1974--with its psychic devastation of prolonged war, its economic confusion of inflation, stagflation, expensive fossil fuel, etc. (the audience gasped when Redford-Gatsby got his gas bill at the Ash Heaps pumps: 10 cents)--resonates with 2013. Perhaps times like these crave their Gatsby--survivor of the Great War and indefatigable striver; crave the rehashing of the bittersweet narrative of the, you know, "American Dream."


Warner Bros., Village Roadshow Production (2013)
So what of Baz Luhrmann's iteration? When I first saw the trailer last December, I felt a proprietary shudder. THIS is what is to be made of the American ur-text: an impossibly shiny, special-effects-studded musical (in spirit, if not in actuality)? It looked too bright to me. The best part of Maureen Dowd's recent (and somewhat unexpected) column on the film was her quote from New Republic editor Leon Wieseltier, who astutely noted that the problem with "Gatsby" movies "is that they look like they were made by Gatsby. The trick is to make a Gatsby movie that couldn't have been made by Gatsby--an unglossy portrait of gloss." But Luhrmann announces this pitfall by embracing it upfront: he opens the film with a black-and-white, silent-movie-looking version of the gold deco-framed titles, with the initials "JG" liberally inscribed throughout, suggesting a production of Gatsby, by Gatsby. What follows isn't completely uncritical of Gatsby, but the film lends its cinematic persuasions to Gatsby's wizardry of self-construction. Gatsby's legendary first appearance at his own party is unabashedly announced by fireworks, which frame his wry smile. At the same time, the movie frames the story as clearly textual: Nick Carraway is shown in a sanitarium, typing the novel, ostensibly for therapeutic purposes, as he recovers from witnessing Gatsby's dream turn to nightmare. (While this framing device feels a bit creaky, metaphorically it's actually quite rich: Nick, our everyman, our naive Virgil to Gatsby's inferno, goes to rehab to get over his--and, by extension, our--American Dream hangover.)

The sanitarium, like all the exteriors in the film, is clearly computer-generated. While these constructed exteriors lack the suggestive grace of Hitchcock's beloved matte-painted backdrops and soundstage exteriors, they function similarly: to emphasize the constructed, indeed Disney-cartoonish nature of Gatsby's world. And not just Gatsby's world, but also Nick's version of New York City, with its rocket-speed motorcars and utopic skyline. I watched the regular version, but this is one of the few movies I've seen that made me wish I had seen it in 3-D instead, in part to see the digital towers of Gatsby's Xanadu fully "real"-ized.

Race and racism are key to the original text of "The Great Gatsby," both condemned by Fitzgerald (e.g., in Tom Buchanan's embrace of scare narratives about the rise of "the colored empire") and performed by Fitzgerald (e.g., the caricatured language in the description of a car carrying African-Americans across the bridge to NYC). Luhrmann's version makes some interesting choices in this regard, for example in casting a Meyer Wolfshiem so transgressively Jewish that he is in fact Indian (played by legendary heartthrob of Indian cinema Amitabh Bachchan). The Jay-Z soundtrack, which mines hip-hop narratives of the nouveau riche to celebrate Gatsby (including Beyoncé remixes à la Josephine Baker), is another interesting choice. But Luhrmann misses the opportunity to expand the vision when he keeps Gatsby's parties lily white (but for servants and the jazz band). It would have been masterful, and completely consistent with the book, to people Gatsby's parties with a truly diverse crowd, but for some reason Luhrmann sticks to the formula of lending black "cool" to whites through the soundtrack.

Ultimately, the movie is surprisingly literary. Much of the dialogue is taken verbatim from the novel. And while the device of Nick writing the film's story as we watch it unfold is a bit gimmicky, especially when occasional lines of text appear on the screen, we are repeatedly reminded that this is a story being made up. Even when a line from the book isn't used--like the purplish prose of Gatsby's kiss quoted above--it is suggested in the behavior of the characters, a kind of insider's nudge to the Gatsby obsessive. What rather successfully and entertainingly emerges from this mix of textual faithfulness and Disneyfication is a recognition of the essentially constructed fantasy that is Gatsby, and of his role as skipper of the (not so) good ship, The American Dream.

 
 
George Jones on my iPhone
While I had Southern parents, I didn't grow up with George Jones. My parents didn't listen to country music, as far as I heard. The only real Southern music traditions they brought into the house were my father's love of Dixieland jazz and the hymns my mother used to sing at the upright piano we had for most of my early childhood (though it was the act of her singing hymns in the living room that probably felt more Southern to me than the hymns themselves, which were Methodist). Growing up with literary aspirations in a Midwestern college town, I avoided country music. It seemed hick and corny to me, something the farmers' sons and daughters I went to school with must listen to on their tractors and combines. I felt surrounded by "countryness" in those days, and longed to be on the East Coast or in a bustling city. I listened to old-timey crooners in my early teenage years--Julie London, early Barbra Streisand covers of jazz standards, Nat King Cole. And later Billie Holiday, whose records I used to play at night to lull me to sleep.

George Jones found me in my late 20s, when I was living in upstate NY. I had been invited to some friends' party in the snowy surrounding countryside. They asked if I could pick up another of their friends who didn't have a car. I was glad to, as I was an uneasy driver. It turned out the person they wanted me to pick up was a man I'd had a crush on for years. He was from rural Kansas, with a wheat shock of hair that waved above huge blue eyes and a cartoon nose, a facial flaw that made him seem all the more handsome. He had a genuine twang and seemed always to be laughing at some private joke, one that maybe had you at its center. We headed out to the party and played nice, tried to act like strangers, but in our small town we knew a lot about each other already without having met. I don't really remember the party, but I do remember we took off early and headed back to town. He invited me up for a drink, and I said yes, floating in the surreal feeling of unspoken dreams answered.

For the next few hours, we sat in the dark, our feet gingerly propped on the open door of the electric stove, its red-glowing heating element a makeshift fire, while George Jones played on the boombox. "The Grand Tour" was my favorite that night, and remains so. The elaborate domestic narrative; the impossibly smooth, agile, yet wiry voice of Jones, able to work more syllables into a word than anyone. Later I would discover his early scratchy recordings, where he sounds all nasal cavity and Hank Williams (e.g. "Why Baby Why"), the consummate heartbroke hillbilly. Those early recordings are beautiful in their own austere way. But the peak-career recordings of "Tender Years," "From the Window Up Above," "Take Me," etc. transported me most. When I say his name, my voice lapses into an echo of my mother's Georgia accent. (And I may be the only person on Earth who retains a copy of his autobiography.) Has ever a man sung pain as openly as George Jones? Not the angry broken heart of rockers, but the self-abasing lament of a self-acknowledged sucker. A well-meaning but beer-soaked loser. Hank Williams always seems to have his tongue more fully in cheek about lovelorn matters (e.g. "Move It on Over") or a more purely nihilistic kind of loneliness ("I'm So Lonesome I Could Cry"), but Jones' suffering always feels right here on earth, the next bar stool over. Self-pity carved into a gorgeous-throated tool by that elegant twang.

My relationship with the Kansas boy never worked out; his attachment to Jones perhaps should have been a clue to his own broken nature. (The NY Times obit devotes almost 3 inches to Jones' addictions and wild days. The trouble started early: he was born with a broken arm.) But George Jones has stayed in my life ever since. When I got the NY Times alert on my phone today, I was sad to see that Jones was dead. But I was happy to see the Times at least knew his passing was well worth a news alert.

 
 
The Star Captain Luchadores of Tamale Spaceship
When I first visited New York City as a teenager in the mid-1970s, one of the things that fascinated me was the street food.  I think I had my first knish out of a street vendor's dubiously hygienic steam drawer. (And maybe my first Yoo-Hoo). Street food seemed at once sophisticated and practical to me--how romantic to grab a bite on the streets/heartbeat of NYC; how impossibly convenient to get lunch in under a minute.

Chicago is not a street-food kind of town, in part due to city regulations, but more recently food trucks have been allowed to operate and have been embraced pretty enthusiastically. As I was on the bus headed downtown a couple of weeks ago, I spied the signature silver-bullet-like "Tamale Spaceship." Out of a barely adorned, utilitarian-looking truck, they serve the kind of food that makes you walk that extra mile, so I jumped off the bus two stops early. My pick is typically the simple rajas con queso. The guys are tolerant of my pidgin Spanish; I remain their "amiga" regardless. I'm always conscious of having a great city moment when walking away from the Spaceship with a bag of hot tamales.
 
 
Headed down to work in the Loop the other day, waiting to cross at a light on Dearborn Street, I looked North and for a moment felt like I was in an urban bike paradise like Copenhagen. While Chicago has a great lakeshore bike path, biking in the city itself can be pretty hairy. But da Mayor has been backing an initiative to support bike commuters who work downtown. Accordingly, some of the nicest European-looking bike lanes in the city have sprung up on Dearborn.
Dearborn & Van Buren, looking North
I like the way this two-way design allows some car parking (not previously permitted on Dearborn) and uses the parked cars to protect bicyclists. There are also dedicated lights for bicyclists. You can read more about the "cycle track" here. And check out this cool POV video of a ride on Dearborn's protected path. (Contrast the pro-bike tone of these Grid Chicago articles with the Tribune's predictable "city of big shoulders" negative coverage. Favorite man-on-the-street quote: "I wish I had time to pedal around, but I have kids at home to feed.")

I'm mostly a weekend, fair-weather biker. I take public transportation to work. But I spent a couple of weeks in Stockholm years ago with a bike as my main form of transportation. I biked all over that city (and mostly alone, as the friend I was visiting was in medical school at the time). It took me a while to get used to riding in a city that treats bicycles like a legitimate form of transportation. While waiting at a stoplight in a major hub of the city, Östermalmstorg, I tried to take off early, against the red light, to get the jump on the cars behind me. A pedestrian crossing the path nearby shook his fist at me, muttering a Swedish epithet. I was nowhere near the guy, but I instantly realized this was a place where bikes received the same respect (and thus disdain) as cars. Car is King in the U.S., and Chicago is a shockingly car-friendly city, parking-wise. It's great to see the city innovate like this, using good design.
 
 
from www.Izismile.com
Count yourself one of the last American pop culture luddites if you have not heard of "Honey Boo Boo Child" (AKA Alana Thompson). Honey Boo Boo is the 6-year-old star of the TLC network's "Here Comes Honey Boo Boo," as well as the butt of seemingly every joke of 2012. Maroon 5's Adam Levine called her "the decay of Western Civilization" (unlike, you know, pablum singing-competition shows). She is frequently spoofed on Saturday Night Live. Even actors I respect, Christopher Walken and Sam Rockwell, have gotten into the act, reading lines from the reality show like theatrical dialogue (one hopes they didn't know where the gag was headed when the script was thrust into their hands). The mean-spirited nature of some of the comments posted online about this child is truly shocking. And hypocritical: how can you claim the moral high ground on the exploitation of a 6-year-old when you're calling for the death of her and her family?

As a cultural critic (and, OK, fan) of reality TV, I've been watching the furor with some interest. Honey Boo Boo: The Phenomenon arguably encapsulates a hierarchical hypocrisy in commercial mass media. It's not surprising that other reality stars like Adam Levine desperately try to separate what THEY do on TV from the Southern-white-trash-low-class-roadkill-eating-pageant-attending folks of Honey Boo Boo's world. Honey Boo Boo is an apt tool for drawing what appears to be a new line between "high culture" reality TV and "low culture" reality TV. And, no surprise, that line is partly drawn based on class. The solidly middle-class, aspiring-to-celebrity world of shows like The Voice and American Idol is, apparently, good wholesome fun. (And the Richie Riches of "The Real Housewives" series don't arouse as much virulence as Honey Boo Boo.) But somehow spending time with a poor white family with modest pageant aspirations is exploitative and decadent.

Reality check: television in the US is a commercial medium and has always been one, long before reality TV. Other nations developed their television systems with more room for non-profit and public broadcasters, so it didn't have to go that way here. But it did. Are "non-scripted" reality shows really any more commercial and exploitative than any other shows?

What finally got me to post on this was Jodie Foster's Honey Boo Boo slam in her meandering speech at the Golden Globes, where she received a lifetime achievement award. I've always been a big fan of Foster, and maybe feel a bit connected since she's about my age. (Long ago I was slated to give her a tour of the Cornell University campus, where she was thinking of going to grad school after Yale, to follow my then professor and boss, Henry Louis Gates, Jr. She skipped Cornell and stuck with Hollywood.) Anyway, while it's difficult to know what all was going on in the crucible of Foster's soul when she was speaking, in sketching her long career (and understandably pleading for privacy), she defiantly proclaimed: "I am not Honey Boo Boo child." The point, I gather, was: I am not going to exploit my personal life for economic gain. Yet it came off sounding simply superior. Of COURSE Foster and Hollywood glitterati like her are atmospheres above the hoi polloi of reality TV. Honey Boo Boo probably won't get cast in groundbreaking cinema; she won't get the chance to go to Yale and then turn down an Ivy graduate degree (frankly, her closest shot at that is probably by being on a reality show). She's not living in a family that knows how (or perhaps can afford) to feed her in a healthy way so she can be acceptably skinny by today's standards. Etc. etc. etc.

from www.izismile.com
Underlying the sense of exploitation that some people feel about reality TV is perhaps the fact that these are real people on TV; they are not employed as actors, so somehow they are a) more vulnerable and b) more pimped. Yet how can we say this is the case any more than with actors, whose performances and public perception can also damage them and their private lives? For better or worse, reality personalities are performers.

And for better and worse, the internet, DIY culture, and reality TV have all helped foster a deeply performative, representation-obsessed society. Who doesn't have a YouTube channel nowadays, on which they broadcast their low-budget productions and kids' antics? Our new dictum is, "I perform, therefore I am." A self-conscious sense of the performative pervades our lives. If you have any doubts, check out the elaborately staged performances in the wedding-proposal videos on HuffPo.

When I think of the reaction to "Here Comes Honey Boo Boo," I remember my mother, Georgia-born with a Southern lilt she never shed, who was offended by the casual use of the word "hillbilly" in her day, and by shows like "The Beverly Hillbillies," which she felt extrapolated to make all Southerners look like idiots. Or my father, also Georgia-born, who was so gratified by the election of President Carter, in part because it put an articulate Southerner in the public imagination. Maybe "Here Comes Honey Boo Boo" contributes to negative views of white Southerners, but how much of that reaction is the responsibility of viewers' own negative perceptions of the South? If you think there is any value in reality TV (and I think there is--that around the scripting and the screaming and the grotesquerie, something human often, though not always, emerges), why is this family any less worthy of being watched than the Kardashians? Why is this little girl any more exploited than any child actor on TV? Crucially, like all performers, she is also a human being.

 
 
"Our President" by Jim Moran on its way to ArtPrize (photo: Jim Ronan)
2012 is fast on its way out, and my backlog of unfinished blog posts is a bit daunting. (I smell a 2013 New Year's resolution: MORE TIMELY BLOGGING!) In any event, here's my belated post on the biggest art fair I went to in 2012: ArtPrize.

September 9, 2012

I headed to Grand Rapids, MI last weekend to check out the art fair that GQ magazine has dubbed "Art Idol": ArtPrize, a terrifyingly democratic indoor/outdoor art show where we, the people, choose the winner. And what a winner: first prize is $250,000, which is a hell of a lot of money to give one person, even a starving artist. ArtPrize is the golden calf sacrificed on the altar of the old saying "I don't know art, but I know what I like."

GQ's article on the fair, published last month and covering the 2011 ArtPrize, pissed off a lot of Michiganders and, er, Grand Rapidians. I imagine some of the negative reaction was the reverse snobbism often in evidence in the Midwest: "them hi-falutin' Eastern journalist folks: what do they know?" etc. etc. etc. In my mind, reverse snobbism is no different than regular ol' snobby snobbism. But the article pretty much gets ArtPrize right: a lot of really crappy, crafty, tacky, kitschy work shows up. Fairgoers ooo and ahh over silly virtuosity, like a "realistic" baby seal carved out of black wood that made it to the final 10. Ish.

ArtPrize Top 25: "The Nanny," Monica Walker
One of my favorite examples of the type of kitsch that gets ArtPrize votes is "The Nanny," which was voted into 2012's top 25 (but thankfully, did not make the top 10). The GQ article quite rightly made much of the fact that 2011's winner, Mia Tavonatti's "Crucifixion," was a kitschy,  luridly colored "surfer Jesus."

Like I said, the GQ article got it right, snarky or no. But what open-air "art" festival doesn't have a motherlode of schlock? I've been to plenty of well-respected ones in Chicago where what might qualify as real art is in short supply. The skills may be there, but the vision is tired, geared to selling something that fits into a living room color scheme. Of course those fairs are targeted at selling work; ArtPrize allows viewers to "buy" with a vote: democracy as consumerism, a thoroughly American concept. Overall, I'd say truly compelling art is rare, no matter where you look, while moderately engaging, technically virtuosic, crafty work is common.

What remains of "Captivity" by SinGH
The big controversy at ArtPrize this year was the desecration of a work of "art" by its businessman patron (basically, to submit work you have to have a venue sponsor). The piece in question consisted of an effigy of Saddam Hussein hanging from a noose in the cage at left. The patron left the cage, but cut down the effigy and threw it away. You can read about it here. While such behavior is of course appalling, it's hard to get worked up about it, as the piece itself was equally appallingly bad.

"UnNatural History," Blane de St. Croix
But what I really want to focus on is some of the incredible art that did make its way to ArtPrize, largely under the auspices of the extraordinary curatorial group Site:Lab and the Urban Institute of Contemporary Art (UICA). I saw some of the most interesting work I've seen in years at ArtPrize, thanks to Site:Lab (the well-deserved winner of the ArtPrize 2012 juried venue award) and UICA.

Site:Lab, which is dedicated to fostering site-specific work, took over the old Public Museum in Grand Rapids, offering its artists full access to the old natural history vitrines and taxidermied specimens. The work, occupying two floors of the museum, encompassed all media and was by artists from Detroit, Brooklyn, and everywhere beyond and in between. The thread that held the pieces together was an examination of the kind of presentation and display of the "natural" world that old-school natural history museums like the Public Museum represent. The entry to the hall was clouded over by the hanging land masses of St. Croix's piece (pictured left and below), with alternating lakes and scorched vegetation on both sides of the terrains.

UnNatural History, St. Croix
a vitrine from "The Reptile Room," Scott Hocking
Quite a few pieces took Detroit as their subject (some of the artists were from Detroit). Works re-imagined Detroit as re-subsumed by nature, with rusted cars and auto parts assuming their rightful place in vitrines that display the relics and species of the past. Scott Hocking's installation was presided over by a giant rusted 1950s Chevrolet, seemingly shipwrecked in a sea of salt in the middle of the room. He worked auto parts into display windows with old books and stuffed birds from the Public Museum collection.

A vitrine from "Displacement," Design 99
"Displacment," the winner of a juried prize, by design collective Design 99, was a series of vitrines displaying materials recovered from a house in Detroit, spanning 100 years.  The one pictured here contained garden tools.  There was a display devoted almost entirely to Pope calendars, among many others.

From Alois Kronschlaeger
By far my favorite piece at Site:Lab's Public Museum site was "Habitat" by Alois Kronschlaeger (which ended up winning the juried two-dimensional award).

Kronschlaeger utilized the Public Museum site brilliantly, interrupting, remaking, and otherwise fucking with the dioramas of flora and fauna. The image at left is a mountain lion whose rocky perch has been upholstered in yards and yards of industrial carpet.

Inside the Diorama, courtesy of A.K.
Kronschlaeger's installation allowed me to realize one of my life's ambitions: to be part of a diorama. He constructed a shiny runway into the belly of the naturalistic beast, which allowed viewers to enter and see the world from the POV of a stuffed bison. Very cool. (You can see art critic Jerry Saltz's diorama pose in the Instagram stream on the Site:Lab website.)

More pix from A.K.'s installation below.

from Habitat, by Alois Kronschlaeger
from Habitat, by Alois Kronschlaeger
Kronschlaeger intervened in the vitrines with metal netting, carpet, and other materials. In the above image, he built out a wooden grid: an orderly architecture emerging from a space that seeks to order nature.

The Site:Lab installations were just brilliant. Their revisionist archeologies, histories, and architectures felt like timely takes on material culture's decay into the natural, and nature's decay into something like the manmade, as well as the decay of the concept of the natural itself. If anything emboldens me to brave the giant president heads and carved hedgehogs next year, it'll be Site:Lab.
 
 
I don't often comment on internet articles or blogs, maybe because it's just a little too "brave new world" for those of us who grew up, and lived much of our adult lives, without the Internet. And for me, cybertext is really never quite the same as text: printed (or at least printable), physical, material. But I made the mistake of adding a reply comment to an academic blog recently. I won't bore you with the deets, but suffice it to say that the blog post was about a group of academics who had thought themselves quite righteous in firing off a letter to a dean at an elite institution. I didn't sign the letter when it circulated because I found the issue at stake trivial and the gesture of a letter overreaching and busybodyish. But when I checked out the blog post, I felt compelled to defend some anonymous posters who were taking heat for, like me, NOT signing the letter. I was surprised that the snark factor I associate with 14-year-old girls is now apparently standard issue, even among academics. The folks who didn't sign were posting anonymous comments, which made sense to me, as they were effectively in the minority in the comments and were commenting in a relatively tight community of academic specialists.

For someone who blogs, to decry comments on blogs is probably more than a little hypocritical.  I certainly enjoy getting comments here.  And while I don't comment often, I am fascinated by other people's comments.  Sometimes it's the content, but often I'm amazed by the postures people let themselves adopt, and by the sheer ugliness people reveal. (The comments on New York Magazine's "The Cut" fashion section are pretty much always good clean fun; allusively witty with the snap, camp, and fun of a fantasy cocktail party).

So what was wrong with my commenting? Predictably, in short order my comment drew a couple of replies ranging from self-righteous to snotty (what breadth, eh?). I had tried to keep my tone even and non-snarky in my original comment, but in reality it was a defensive comment, even if leveled at an offensive one. The replies it received were so annoyingly over-the-top that I just slumped back to my invisible lair, kicking myself a bit for imagining that I could enter the fray without being bruised, and wondering what my true intentions in commenting were. How much was I, just like everyone else in the comments, striving to be (ho-hum) right?

In theory, there's a lot to like about the possibility of an open forum for critique, sharing, etc. that online commenting allows. But of course it's old news that people often hide behind anonymous, disembodied postings and hurl epithets and hate therefrom. My experience the other day showed me that even when people are NOT anonymous--i.e., these professors who posted their names as they self-righteously flamed me--there is still a tendency to be unkind and self-convinced. Maybe there's more going on here than Internet anonymity. It occurred to me that another factor could be a kind of confusion of text and speech at the core of many of our new "social" (or "anti-social," as the case may be) technologies. There are plenty of things we might write down but would never say to an actual flesh-and-blood human being. In part, writing is not speaking: its structure is often too stiff to equate with slangy speech. And of course when you write something, you're usually without an audience, save for your own genius. Writing is solitary, even perhaps when ultimately meant for a knowable audience. So while you're drafting your brilliant blog comment, you may have in mind a "response" to someone, the brevity and immediacy of which feels speech-like. But in fact, of course, you're composing a text. Even when writing isn't unkind, can it ever really avoid pontificating? As many a brilliant 20th-century Frenchman has already established, writing marks an absent self, is an absence that would be presence, and so missteps of grace, and even morality, are perhaps inevitable.

"LOL" is a classic example of new technology's confusion of speech and text.  This textual symbol, used in writing, paradoxically evokes a speech act:  laughter.  "LOL" is just one of the affects of emails and text messages that apes speech, rather than text.  When marks like this first began, it struck me as odd that the first analogy people were reaching for in respect to email was speech rather than writing. And of course "text" messages are barely text at all: emoticons, acronyms, execrable grammar, ilegible spelling--many texts are a veritable cry of "phleebijeeeppurg" rather than actual language.  While that archaic technology, email, used to also consist largely of this kind of "speechlike" brief and inscrutably written missives, it seems that increasingly the email I get at last resembles letters, a more obvious analogy than speech. Maybe that's due to the primacy of text messaging, the latest "writerly" speech act.

I'm not one of those people who subscribes to the idea that "everything happens for a reason," but the other day, when I was pondering these speech/text slippages, I happened upon Jean-Luc Godard's early film "Vivre Sa Vie" on TV. In addition to Godard's characteristic political aura, alienating editing choices, and inscrutable character motivations, "Vivre Sa Vie" is also strikingly beautiful. While he's messing with classic Hollywood left and right, his compositions and light still feel under the sway of Hollywood's best black and white. Anyway, there's a scene where Anna Karina runs into "le philosophe" in a cafe, and a discussion of language ensues. The philosopher is played by Brice Parain, a lesser-known star of the crowded French philosophical firmament. Check out what he says about language and speech (it really gets going at 5:25):

This notion that "one learns to talk well only when one has renounced life for a time"--that one cannot speak well until viewing life from a state of "detachment," until going through a time of the death of speech--feels very instructive to me when contemplating the kind of casual brutality that takes place on legions of Internet blogs day after day. We have to speak to be human, Parain says; but speaking well, without harm, requires a kind of renunciation of speech, at least occasionally. The "ascetic rule" Parain mentions sounds very much like the Buddhist notion of "right speech"--speech, the Buddhists emphasize, not text. (Though Buddhists don't seem that big on texts, often focusing on a more immanent, internal kind of knowing--the "beyond language" thing I'm never quite comfortable with.) Parain, "le philosophe," chain-smoking Buddha, resigned to the endless, seductive babel, comforts us, counsels us on the best speech, breaking it down into complex poetry, in the great French tradition.
 
 
Macy's, State Street, Chicago
Less than a year after Occupy Wall Street and Occupy Chicago took to the streets, the commercial co-optation of the images of genuine grassroots protest that OWS put back on our televisions (when it got coverage, that is) is in full swing.

The masses envisioned as shoppers are, of course, nothing new. Clearly the marketing crew and window dressers at Macy's are citing, in this "back to school" display, the activity that was taking place just a few blocks west, on LaSalle St., last fall. Here, the colorfully preppy mannequins hoist their placards to boldly protest that they like stuff (you know, in a Facebook, thumbs-up kinda way): "vintage charm," "color theory" (that much-maligned ideology!), etc. My favorite: "we like uniforms"--undoubtedly a reference to school uniforms. It's some kind of watershed in late capitalist signification when the tropes of non-conformist social activism are brought in to embrace the uniform, emblem par excellence of conformity.

I was a huge diorama fan as a kid; I even used to build my own crude stagings of life-in-a-box at home. So I'm always keen on window displays. Macy's in Chicago (and, alas, the erstwhile primum mobile of Chicago mercantilism, Marshall Field's, which it supplanted) has always had pretty lame-o windows--especially around Christmas, when the lily whiteness of the vignettes extends way beyond snow to every person depicted. (If you can't figure out how to do an inclusive window display for Christmas, with elves, toys, animals, and magic at your disposal, you ain't ever gone figure it out.) As for Occupy Chicago, they remain in session: http://occupychi.org/


 
 
As I posted previously, I am in the arduous process of moving.  So it's finally time to face the music on all of the boxes and piles of deferred papers which preside over my study.  As part of this effort, I have been forced to revisit a few remaining boxes of my mother's things, which I've held on to since she died almost 5 years ago now.  I was my mother's executor, which was a lot of work for a modest amount of money--but good work, as it gave me something to do after her sudden death; a way to help with death post-death.  She made me her "archival" executor as well, placing me in charge of her journals and photographs, and the not insubstantial collection of paintings she had produced earlier in her life.

I think my mother saw herself as a potential world-historical personage: someone who almost was an opera singer, a famous painter, an art historian of note. But for her times, but for her controlling husband, but for her 5 children, etc. And undoubtedly the expectations for a woman born in 1931 were decidedly domestic. (Letters from her Southern mother to her in New York, 10 years after her divorce, still addressed her as "Mrs.") Yet she managed to accomplish a fair amount of what interested her: she traveled, she went to the opera, she painted and sketched and took pictures most of her life. But her aspirations for herself were undoubtedly not quite met. There was a detectable aura of frustration surrounding her for much of her life. At the same time, she maintained an extraordinary joie de vivre up until the moment she died.

I distributed her money and paintings, and the more significant items of her estate, to my siblings long ago. But I've been unable to fully resolve the 4-6 boxes of her papers and miscellany. Every once in a while I take a stab at going through them, but in the beginning it was very hard to throw away ANYthing that had her elegant, enviably legible handwriting on it. And just going through the boxes was enormously emotionally exacting. As time passes, it becomes a bit easier to let go of some of the material traces of what, after all, isn't really her or her essence. And yet, for someone who kept travel notebooks, sketchbooks, and diaries (albeit many quite partial), who hung on to every little trip memento (e.g., for a trip she took to the UK in the early '90s: a business card for a taxi, napkins from the Tate Gallery, as well as directories, a Tube schedule, wrappers from Cadbury bars, etc.), who somewhat obsessively kept material and textual scraps of her life for decades, there is a part of her that does in fact reside in these material fragments.

My mother may not have fully realized how wise her choice of me as archivist truly was. I am perhaps closer to her in intellectual and artistic sentiment than any of my siblings. In some ways, I undoubtedly represent something closer to the life she thought she wanted for herself: a full-blown career, a writer who has published poetry and hobnobbed with some well-known literati, a woman who held on to her single life until a ripe ol' age. But I don't know if she knew me as a fellow sentimentalist of the scrap, a self-historicizing keeper of the paper traces of life and its travels. She kept her high school (high school, mind you) papers; I kept my high school papers. (AND she kept her father's high school papers.) I have dutifully, nay, somewhat obsessively, sifted through each little piece of paper, newspaper clipping, sketchbook, address book, postcard, etc. For now, my goal is simply to separate what's truly relevant to her as a person, and thus to others, from what was relevant to her alone during her lifetime. (So those Cadbury wrappers have to go.)

If this process has taught me anything, it's that if you hold onto shit long enough, it loses the stink of narcissism and attains the stature of history.  My high school papers: infinitely silly.  My mother's high school papers: intriguing.  My grandfather's high school papers from 1914:  extraordinary.  (Is there anything that doesn't become fascinating seen from the vantage point of 50-100 years later?)

What of my mother's archive; this mother of all archives (for me); this archive that now materially is my mother?  How to handle it; how to disseminate it?  I've tried to adopt a historian's attitude about this "non-historical personage" who gave me life; retaining all writings of note, for example, and the occasional surrounding context (so some of those Cadbury wrappers might be relevant). I've tried to position myself objectively, winnowing out what might be of interest to her grandchildren down the road, as well as deciding what should be censored from the archive--contemplating what might be nobody's business but hers.

What's been perhaps most revealing and heartbreaking has been to find the frequent traces of her attempts to buoy up her deeply self-doubting self: the stirring quotes she had copied and clipped from newspapers, the notes to herself about self-acceptance, the articles on being single (she didn't remarry for almost 30 years after she divorced my father), the occasional frank diary entries. And in this mother archive, I find myself: similar self-doubts, years also of navigating singledom, and even some of the same quotes. When I came across a scrap on which she had written Emerson's famous encomium to "finish each day and be done with it...," the very same quote was posted on the wall of my study. The act of archiving, and the material archived, is my mother, my self.