How to get Filthy Rich on Rising Kalliope-I: Earn a Zathua-load of Credits and Become Iron Man!

As we discussed in class, How to Get Filthy Rich in Rising Asia by Mohsin Hamid is an interesting novel in many ways.  The one element that we kept circling back to in the discussion, however, was the second-person narrative.  We examined it from many different angles.  How does it affect the plot? How does it affect the reader’s feeling towards the narrator?  How does it work (or not work) with the pseudo “self help” portions of the novel?

There were varied points and opinions in answer to all of these questions, but one thing that we could all agree on is that the use of second person “you” did as Hamid intended: it stripped the narrative of many identifying features.  The reader never learns the name of the protagonist, the protagonist’s family, the pretty girl, or the city/country in which the story is set.  Hamid’s intention, we agreed, was to make the story ambiguous in this sense so that it could be applied to any number of developing cities in developing countries around the world.  It was also mentioned that this might be done in order to have the reader check their cultural prejudices and stereotypes at the door — there are always the intuitions and speculations that readers form by the end of the novel, but there is no way to know for sure if what you have deduced is accurate. 

There is another genre of literature that does the same thing, but in a different way.  Science fiction often takes the reader to a different universe or just far into the future in the same universe.  So, in that sense, it positions the reader in an unknown space (pun intended), just like How to Get Filthy Rich in Rising Asia.  In addition, the use of fabricated names for places and people has the potential to further disorient the reader.  Coming across characters called Scr’Vani, Sissk, or OYC-L can have the same obscurity as only calling characters “the pretty girl” or “your father.” 

In terms of offering up a social critique, sci-fi is no stranger to that whole ballgame.  Countless sci-fi authors have crafted distant, future worlds and characters that are meant to critique their local, current world and characters — and some parallels are more thinly veiled than others.  A classic example of this is George Orwell’s 1984.  Another example, perhaps less well known, is the origin story of Iron Man.  Not the character’s origin in the Marvel universe itself, but the idea that inspired Stan Lee to create him in the first place.  Of his initial inspiration, Stan Lee famously said:

I think I gave myself a dare. It was the height of the Cold War. The readers, the young readers, if there was one thing they hated, it was war, it was the military….So I got a hero who represented that to the hundredth degree. He was a weapons manufacturer, he was providing weapons for the Army, he was rich, he was an industrialist….I thought it would be fun to take the kind of character that nobody would like, none of our readers would like, and shove him down their throats and make them like him….And he became very popular.

Iron Man literally embodied the very war that Americans found so distasteful, and yet he was a success in the 1960s and is an even greater success today.

I don’t mean to get political in a blog post. I bring up Iron Man simply to demonstrate the way that sci-fi/fiction authors can subtly (or not so subtly) offer social commentary in their works by creating parallels to the present — in the same way that Hamid critiques through his created world.  So maybe How to Get Filthy Rich in Rising Asia is set in Lahore, or maybe it’s set on Kalliope-I.

How to Get Filthy Rich in Rising America

How to Get Filthy Rich in Rising Asia is Mohsin Hamid’s most recent novel and it is certainly causing a stir in the literary world and beyond.  This is Hamid’s third book, after his successes with Moth Smoke and The Reluctant Fundamentalist, and, in the spirit of a self-help book, it follows an unnamed protagonist as he moves from the country to the city while climbing the social ladder at the same time.  The novel depicts the protagonist and his love, the Pretty Girl, as they both use their respective skills — entrepreneurship and modeling — to further their social standing.

How to Get Filthy Rich in Rising Asia is unarguably critically acclaimed. The Washington Post called it “Extraordinarily clever” while Time described it as “marvelous and moving.”  The most intriguing review, however, comes from NPR with the statement that How to Get Filthy Rich in Rising Asia is “A globalized version of The Great Gatsby…[Hamid’s] book is nearly that good.”

I find myself inclined to agree with this statement.  An argument can certainly be made that there are parallels between How to Get Filthy Rich in Rising Asia and The Great Gatsby.  However, these parallels exist in a way that is perhaps not immediately apparent.  The crux of the issue is that, instead of the story opening with Nick Carraway and a Daisy and Gatsby who already have a lifetime behind them, it opens with a Gatsby-like character before he becomes Gatsby.  In other words, the nameless narrator of How to Get Filthy Rich in Rising Asia is a James Gatz as opposed to a Jay Gatsby.  In Fitzgerald’s novel, the reader only gets Gatsby’s story second-hand, through the eyes of Nick Carraway.  In Hamid’s novel, however, the reader is treated to a first-hand account of how James Gatz started from nothing, fell in love with a girl, and decided to make himself into a Jay Gatsby.

The number of parallels increases the more I think about it.  To begin, there is a similar air of mystery surrounding the main(ish) characters.  In Fitzgerald’s work, the reader doesn’t know a lot about Gatsby because he is not the narrator but also, and more importantly, he doesn’t want people to know about him.  Hamid similarly suppresses details about the unnamed protagonist.  In addition, the titular characters have both made their fortune in shady ways — Gatsby through his ties with the mob and Unnamed Protagonist through his dubious bottled water scheme.  Furthermore, the climate surrounding the love interests is very comparable.  In both The Great Gatsby and in How to Get Filthy Rich in Rising Asia, the lovers are separated due to issues of disparate wealth and maturity.  In addition, there are also themes of infidelity in both novels. 

In terms of the broader scope of the two novels, both contain rags-to-riches stories, women at the mercy of a patriarchal society, celebrity culture, violence, and poverty/privation.  NPR, it would seem, has hit the nail right on the head.  But something still doesn’t match up.  Readers seem to readily like and romanticize Gatsby’s tragic character (myself included), but have more trouble attributing the same connection and concern to the Unnamed Narrator (myself included).

Perhaps this is due to the second-person narrative and its Brechtian effect.  Perhaps it is due to the lack of particular details that would otherwise enhance the story and orient the reader.  But I still think that the idea of How to Get Filthy Rich in Rising Asia as “a globalized version of The Great Gatsby” is a fascinating one.  Perhaps this is just the grittier, bloodier backstory that is the natural projection of the initial story combined with the modern cultural climate.  Could “Gatsby” just as easily be substituted for “you” and the story still ring true?

Steal Like an Artist (With a Side of Digitus Impudicus)

“Immature poets imitate; mature poets steal; bad poets deface what they take, and good poets make it into something better, or at least something different.  The good poet welds his theft into a whole of feeling which is unique, utterly different from that from which it was torn.” -T.S. Eliot

“Nothing you can write is original.”

This is certainly a bold statement, and even more so if you are a creative writing professor.  Of course, truth is stranger than fiction and this is the sentiment that greeted me on my very first day of college at the beginning of my very first class.  It would be accurate to say I was more than slightly taken aback.  There I was, a bright-eyed, bushy-tailed, and, admittedly, extremely impressionable freshman, and this was the very first thing a college class ever taught me.

Honestly, I would like to say that I shook it off, gave that sentiment the old double gun salute (if you know what I mean), and continued to be my plucky creative self.  However, it really stuck in my craw.  Over and over, our professor would tell us that he would think up a great line (for an article or the horror novel he was working on) and then plug it into Google only to find that it had all been said before.  We ended up reading a fairly typical swathe of works for a creative writing class, which means that they were — comparatively — not typical at all.  The assigned readings kept getting stranger and stranger, until we reached the crowning work Marsupial: Our Mother For the Time Being by Derek White.  This novel definitely pushed the boundaries of what I was used to. No, actually, it stomped the boundaries to bits and then set fire to the rubble with its bizarre, meta-textual movements and its baffling ending.  The single review on Amazon pretty much describes my experience with this text: “It was in good condition when I received it. No pen marks, the cover was not beaten up. The book in and of itself was weird.”

So, what I was getting from this class was that nothing you can write is original — unless it’s weird as heck.

Looking back, this class had more of an impact on me than I originally thought.  At the risk of sounding trite, I’ll repeat the oft expounded catchphrase that college is the time to find yourself.  Moving from 15 years of conservative private Catholic school education to a liberal public university came with a certain amount of culture shock.  And there I was, already a bit adrift a state away from home, in a school of roughly 20,000 students, and the first thing I learned was that nothing I would produce in the next four years or after would have any impact because nothing you can write is original.

It was a little tough to swallow.

On a certain level, I can maybe see where this professor is coming from.  For example, it is really difficult to find an original argument about Shakespeare because it has probably, as they say, been done before, and done better to boot.  This I have no problem admitting.  There are unquestionably scholars who can write a considerably finer essay on Shakespeare than I can.

But on a broader level? The idea that I can contribute nothing to the literary world because it has all “been done before” is incredibly irking to me now, and I’m more inspired than ever to present that professor with the digitus impudicus.  Austin Kleon says it best in his book Steal Like an Artist: 10 Things Nobody Told You About Being Creative: “What a good artist understands is that nothing comes from nowhere. All creative work builds on what came before. Nothing is completely original…If we’re free from the burden of trying to be completely original, we can stop trying to make something about nothing, and we can embrace influence instead of running away from it.”

As I’ve developed as a writer, I’ve come to embrace this idea more and more.  How many creative works out there that people enjoy are actually based on something else? We couldn’t have The Lion King or Rosencrantz and Guildenstern Are Dead without Hamlet.  We couldn’t have Paradise Lost or Samson Agonistes without the Bible.  We couldn’t have Avatar without Pocahontas.  Heck, if you want to go there, we couldn’t have 50 Shades of Grey without Twilight.

I wish I could pass on to my younger freshman self the words of French writer André Gide: “Everything that needs to be said has already been said. But since no one was listening, everything must be said again.”

Apple Pie + George Washington + The Louisiana Purchase + Elvis + The Moon Landing = America?

Summer, 2004.

It was unbearably hot, we were unbearably cranky, so my mom loaded us all up in Big Brown and away we trundled to our Aunt Nancy’s house.  She, my uncle, and my cousins lived in Toledo’s Old West End —  a delightful trip back in time.  On the way to their house, you pass the hotel where Al Capone stayed when he was in town on…business.  You drive down Collingwood Avenue — a narrow street lined with tall, tall trees that used to be the central promenade of this Victorian neighborhood.  The interwoven branches high above you created a tunnel that vaulted you back in time. 

Collingwood is, oddly enough, covered in churches.  The Greek columns of a Christian church abut the more staid facade of a Seventh-Day Adventist temple.  My cousins lived on a street that boasted one modest church that presided across the street and three lots down.  The battered houses there had the same draw for me as ancient ruins — equal parts crumbling and stately, but still beautiful.

As a kid, I couldn’t quite comprehend this particular neighborhood.  On one side, there was the rough neighborhood around Bancroft Street with its huddled houses and huddled people.  On the other side, through the tunnel of Collingwood, was downtown Toledo with its hard-edged skyscrapers, solemn public offices, and sprawling baseball stadium.  In between sat the Old West End.

As a kid, I couldn’t quite comprehend this particular house.  My Aunt Nancy’s was a classic Victorian with indigo paint and a wide, sturdy front porch.  During the winter, when night fell on the snow that smoothed over the modern edges of the neighborhood and electric candles graced the windows, you would almost expect to hear the clack of hooves as a sleigh rounded the corner bearing a fur-swaddled and be-muffed family ready for bed.  The house came equipped with the two standard staircases — grand and servants’.  It still retained the segregated calling rooms — a ladies’ parlor and a gentlemen’s smoking room.  There was even a two-story stable in the back that had been converted into a two-car garage.  The unfinished basement showed signs of the original root cellar.  My cousins told me that a house down the street had a ballroom where the attic should have been. 

I had to process a lot of dissonance caused by what I saw, and in more ways than one.  On the one hand, this was nothing like what I was used to.  Where this house had darkly glistening wood floors and high plaster ceilings, my house had carpet and ceilings low enough that, when thrown, a bouncy ball would ricochet back and forth between floor and ceiling upwards of five times.  On the other hand, the clash of modern and historic made for an interesting tableau.  An exquisite stained glass rose embellished one ground floor window while on the adjacent sill squatted an air conditioner unit.

However, that air conditioning unit was like a single ice cube in a cup of tea.  We were boiling.  So we made the trip 20 minutes across town and a hundred years back in time to that house on Parkwood because it had the holiest of middle-class grails next door: the above-ground pool.  Oh yes.  Today was not the day for making a slip-n-slide out of a tarp weighted down by bricks in the corners (woe betide him who mindeth not the cinderblocks).  Today was the day for luxuriating in all 52 sweet inches of cool, refreshing, chlorinated-as-all-hell water.  Because it was deeply summer and exponentially hot, there was no way we were getting out of that water until we were falling asleep in our floaties and wrinkled as prunes.

We dove for rings.  We made a whirlpool.  We had splash fights.  We played King of the Raft.  We played Marco Polo.  We played Guess That Tune — but underwater.  We jumped, dove, belly-flopped in and rated the performances one to ten like it was the Olympics.  Somewhere in there, my cousin Tom decided to show off his best “matrix moves.”  This involved falling backwards into the water whilst pinwheeling the arms in slow motion.  One by one, we all tumbled slowly back into the water.  Even me.  Even though I had no idea who or what “the Matrix” was.

E.D. Hirsch would be ashamed of me. 

The Matrix is, of course, a 1999 blockbuster sci-fi film known for its slow-motion action sequences and the iconic image of Keanu Reeves’s Neo bending impossibly backward in order to avoid a spray of bullets.  It had certainly made its mark not only in the cinematic realm but in the cultural realm, to the point where even a kid who had never seen the movie knew that it involved slow-mo stunts.  Thus, it would certainly be included on an expanded/updated edition of Hirsch’s original list of Things the Culturally Literate Know.

However, part of me still balks at the idea of a culture being quantifiable.  It seems as if Hirsch is creating some sort of wizard formula that goes something like: apple pie + George Washington + The Louisiana Purchase + Elvis + the moon landing = America.  Or rather, knowing about apple pie, George Washington, The Louisiana Purchase, and the moon landing makes you an American.  My gut instinct is that culture should not be so easily definable, but on the other hand, how else do we describe it?  When you talk about visiting a foreign country to get a taste of the foreign culture, you mean sampling their food, art, architecture, and geography.  In the same way that apple pie + George Washington + The Louisiana Purchase + Elvis + the moon landing = America, haggis + sheep + bagpipes + Edinburgh Castle + William Wallace + Nessie = Scotland.  Again, I am torn between recognizing these things as undoubtedly Scottish and saying that they define what Scotland is.  It just seems too presumptuous to reduce an entire country/culture/people to a meager list.

And this list is only growing.  If it helps, let’s put it in the modern terminology of data.  Back in the Victorian era, the average person did not generate that much data.  If they were literate, they probably wrote a few letters or kept a small journal, but that was probably it.  If they had their picture taken or sat for a portrait, it definitely did not happen that often — maybe only twice in their lifetime.  Now think about modern times.  One college student could produce as much writing in one semester for one class as another person produced in their entire life.  One college student could take eight selfies in one trip to the bathroom instead of having to sit eight weeks for a portrait.  There is so. much. data. 

I’m not sure Hirsch saw the internet coming.

How can one possibly be expected to observe, process, and store so much information? 

In his New York Times opinion piece “Faking Cultural Literacy,” Karl Taro Greenfeld describes the cultural literacy version of faking it — this exact problem.  He talks about scrolling through Facebook but never clicking on any of the shared links, discussing a scientist’s paper even though he’s never read the paper, and forming opinions on books he’s never read.  He notes that even though you don’t necessarily care about the trending tags on Twitter, you’re expected to not only know about them but have an opinion about them.  As Greenfeld puts it, to know all is almost impossible, and to somehow not know all is to fall out of touch:

The information is everywhere, a constant feed in our hands, in our pockets, on our desktops, our cars, even in the cloud. The data stream can’t be shut off. It pours into our lives a rising tide of words, facts, jokes, GIFs, gossip and commentary that threatens to drown us.

For me, this sort of faking it is most expressed in the realm of geekiness.  Just as NHL aficionado-hopefuls might bluster their way through a conversation with a hardcore puckhead by mumbling about Gretzky, Lemieux, and Crosby — and how much the Oilers are struggling — I can talk to diehard Whovians with a certain amount of confidence about weeping angels and fish fingers with custard even though I’ve only seen exactly one episode of Doctor Who.  The same goes for Supernatural fans (I just mumble something about Destiel), Attack on Titan fans (just hum the theme song), and Big Hero 6 fans (gush about how cute Baymax is and you’ll be fine) even though I’ve only seen fractions of each source material.  I call this phenomenon “nerd osmosis” — where you just kind of absorb the information of the general group around you, because there is no way you could possibly know every single fandom that is represented on Tumblr or at Comic-Con.

So to E.D. Hirsch I would say: I don’t think Cultural Literacy exists. At least, not in the way he pictured it.  I don’t know if the idea of being culturally literate has changed or the culture to which we are literate has changed, but there seems to exist a disconnect nevertheless. 

Part of this is just me playing devil’s advocate.  Because, as my friends joke, being a Reamer means communicating almost exclusively in M*A*S*H, Psych,  School of Rock, and Disney movie quotes — with a good bit of misappropriated song lyrics thrown in.  So how do I reconcile  the part of me that instinctively rejects E.D. Hirsch’s assertions on the basis that they seem pretentious with the part of me that can recognize references to Romeo and Juliet and connect the suffix in Gaza-gate to the original Watergate scandal? 

I would offer an amendment to E.D. Hirsch and tell him that the loss of culture is natural.  Victorians, who produced a very finite amount of information and had a very finite idea of what an education or an educated person looked like, would have had a much easier time of mastering the culture.  Nowadays, the idea of a cultured person is much harder to define because the culture is harder to define and quantify.  This is the information age, but there is only so much information a person can reasonably be expected to remember.  And each new generation has it worse than the one before because they have even more details to deal with.  The globalization of culture via the internet and the increasing accessibility of international culture also further blend the idea of American culture with other cultures.  As Greenfeld puts it:

So here we are, desperately paddling, making observations about pop culture memes, because to admit that we’ve fallen behind, that we don’t know what anyone is talking about, that we have nothing to say about each passing blip on the screen, is to be dead.

The Matrix and Erupting Fountains…

Never trust the narrator.

This is the chorused maxim of literature and English classes starting in about junior high or early high school.  It’s at this age that teachers start baby-stepping their students out of the literal, superficial meanings of written works and start constructing shrewder readers.  A certain amount of this analytical training deals with an author’s imagery, syntax, and diction choices, but it all stems from questioning the obvious (I’ve also heard this called “letting the obvious amaze you” or “problematizing the text”).  By looking at the author’s image of a bird taking flight, their choppy sentences, and their repeated use of the color blue, a reader can develop their own reading/interpretation of the text that is potentially deeper and/or more meaningful (although it is wise to take such readings with a grain of salt).

Those forms/angles/strategies of textual exploration are all well and good, but they are all based on what the author chooses to give the reader, and sometimes the most important parts of a story are the bits the author pointedly leaves out.  For example, in 7th grade, my little sister faced a daunting task: the final creative writing assignment.  Being the sci-fi/fantasy enthusiast that she is, she decided to write a story about a gallant knight who storms the castle in order to rescue the kidnapped royalty.  However, her fledgling novelist skills were not quite up to the task of describing medieval combat in all its clanking, gory glory.  When I found her hunched over the kitchen table in frustration, I reminded her of the power of the unknown — that sometimes it’s okay to let the reader’s mind fill in the gaps.  Thus, when the Noble Kate finally burst into Crown Prince Phillip’s cell and he asked her how she could have possibly defeated all those evil vassals, Kate simply responded, “Never underestimate chunky peanut butter.”

Here, the author (my sister) has chosen to let the reader figure out how PB (and maybe J — who knows!) factored into Kate’s dashing rescue plan because — as moms and torturers know — the most powerful mystery is the mystery of the unknown.  Authors strategically let readers draw their own conclusions.  Sometimes for reader engagement, but other times because what they have to say is not, how do I say this, polite.  Such authors are the ones who discuss “erupting fountains” or just trail off with a “dot dot dot” like in Mamma Mia.

I apparently haven’t grown into my big-reader pants yet because I never seem to pick up on this particular brand of subtleties.  A fountain is just a fountain to me.  It seems I’ve never gotten into the habit of distrusting the narrator.

Orhan Pamuk’s My Name is Red offers a different spin on the “Never trust the narrator” proverb.  His chapters are each narrated by a different character or inanimate object.  For example, chapter four is literally titled “I Will Be Called a Murderer.”  No secrets there. Do not trust that guy. He’s a murderer.  Even I understood that.  And yet, the reader still doesn’t know who the murderer is until the very end of the book.

Somehow, Pamuk reveals everything but nothing.

The reader knows how Elegant Effendi was killed, why Elegant Effendi was killed, where Elegant Effendi was killed, but not who killed him.  His multiple narrators all at once explain and confuse.  By providing multiple narrators, Pamuk offers the reader — in a murder mystery sense — more data and evidence.  This is generally a good thing for any investigator, but also problematic because it is filtered through a different lens for each informant, who each has their own goals and motivations.  In fact, Shekure literally admits that she distorted certain facts in order to deceive the reader/audience.

Thus, Pamuk’s My Name is Red offers a peculiar contradiction to its readers that really complicates the matter of narrator reliability.  I think a reader naturally wants to sympathize with or relate to the narrator, and Pamuk’s characters reinforce this by beseeching the reader to understand their point of view.  However, they then actively admit that they are concealing things from the reader just as they conceal things from each other.  The characters only ever have one piece of the puzzle, the reader has them all, and yet still the ending was a surprise.


So now it all appears to boil down to “never trust the narrator” yet again — although My Name is Red certainly demonstrates that there is more to that concept than meets the eye.  And this is something that authors and generators of creative works seem to like to explore.  Indeed, Pamuk is just adding to the canon of unreliable narrators.  Shekure will join the ranks of Neo in The Matrix, Teddy Daniels in Shutter Island, and Dr. Sheppard in The Murder of Roger Ackroyd.

Is it just a gimmick, or is it something more?  Maybe Pamuk — and other authors, creators, and teachers — are trying to get us to pull a Neo and question the obvious.

My Name is Red: Coming Soon to a Theater Near You

There’s something I’d like to get off my chest.

I have a chronic condition.  It’s incurable and affects my daily life.  I caught it when I was little and haven’t been able to shake it since.

I’m a crafter.

It’s truly an illness — a crazy, creative sickness.  I usually have at least two projects going at the same time, with three more waiting in the wings and even more just percolating in my head.  I cannot be trusted alone in Jo-Ann’s or Michaels.

[Image: yarn meme about the size of your stash]

There is a dedicated table at home called the Craft Table.  I have a deep, deep love and appreciation for pipe cleaners. 

[Image: “license to carry my 9mm” meme]

My fingers have occasionally been super glued together.  I once thought making macaroni snowflake ornaments would be a good idea.  My most recent projects were making/assembling costumes for my little brothers for an anime convention. 

That being said, I often like to listen to audiobooks while I’m working on something.  I get into a sort of zen-like state where my hands are just running on autopilot and I am fully immersed in the storyteller’s tale. 

[Image: “whoops, I made a scarf” meme]

Whole scarves have been knitted and afghans crocheted in this way. My Name is Red was no different.  I listened to that 20+ hour beauty and knocked out 2+ feet of blanket.  And it was while listening to John Lee narrate, in a Zen-and-the-Art-of-Tunisian-Crochet state of mind, that I realized the inherent performability of Orhan Pamuk’s tale.

The more of the book I listened to, the more I warmed up to the idea.  Enishte Effendi, Olive, Esther, Master Osman, the corpse, the tree, the coin — John Lee brought them faithfully to life.  His performance was certainly a factor, but the style of the prose also lends itself to performance.  All the characters address the reader personally, and I see this as an equivalent to the Shakespearean aside to the audience.  In addition, there is the character of the storyteller himself embedded in the narrative, who isn’t fully revealed until the end of the novel.  I can practically see it staged: the main stage goes dark and the lights come up on a small side stage, revealing the storyteller in the coffeehouse relating the travels of the counterfeit coin.  Or perhaps the director would go full Cloud Atlas and have each of the main characters also perform the monologue of one of the non-human characters in order to preserve the storyteller’s identity as it was preserved in the book.  Furthermore, the tantalizing elements of a murder mystery are sure to drive the plot and keep the audience interested: “Whose nostrils are these? Did Olive, Butterfly, or Stork brutally bludgeon Elegant Effendi? Stick around for Act III and find out!” *cue enthusiastic silent film score piano player*

The scenes Pamuk illustrates are extremely suitable for a stage set adaptation.  For Black and Shekure’s domestic dramas: a cross section of a traditional Turkish two-story house that allows the audience to see all at once Black’s visits to his Enishte in one room and Shekure’s eavesdropping in the other room.  The coffeehouse could be reminiscent of the ABC Café in Les Misérables, but with a Turkish flair and mugs of liquid instead of bottles.

[Image: the ABC Café set from Les Misérables]

For the royal palace treasure room: a cross between the warehouse where the Ark of the Covenant was filed away, the Room of Requirement, and Smaug’s hoard (but with a less gloomy color scheme).

[Image: the Ark of the Covenant warehouse]

[Image: the Room of Requirement]

[Image: the treasure of Erebor]

Pamuk has provided such a rich visual and cultural background for this novel, and I think that it could be comfortably and faithfully translated to the stage or screen.

Whichever medium, it would undoubtedly be an artistic creation.  In fact, it would be an artistic creation based on an artistic creation about artistic creations — which would certainly add an interesting new dimension to the discussion, especially on the topic of signature and style.  

(And this song would play over the bows or credits)

Istanbul and Architecture: A Magnificent Mosaic of Life


“And the fleet of little boats moved off all at once, gliding across the lake, which was as smooth as glass. Everyone was silent, staring up at the great castle overhead. It towered over them as they sailed nearer and nearer to the cliff on which it stood.”

In literature, it’s commonly known that setting sets the mood.  If a novel opens on a dark, moonless night with a dilapidated old mansion in the foreground, surrounded by a misty moor with wolves howling in the background, it’s much more likely that Scooby and the Gang will be trundling up the lane with a flat tire to stumble in and solve a mystery than the Teletubbies. 

Orhan Pamuk’s My Name is Red is set against the intricate, vibrant cultural background of 16th century Constantinople (now called Istanbul).  Between the coffeehouses, the miniatures, the religion, and the cultural parables, this city in this particular moment in time is a lush backdrop for a murder mystery.  However, one of the most important elements is still missing — the literal background. 

While the miniaturists at this time may not have believed in the value of perspective, there’s no denying that their city had it.  Between the mosques, schools, inns, mausoleums, hospitals, bridges, castles and palaces, Istanbul had and has a very striking skyline. 

The Istanbul skyline

The architecture in Istanbul is heavily influenced by Ottoman architecture, so that will be the focus of my research and presentation.  This style of architecture first emerged in the 14th and 15th centuries in the cities of Edirne and Bursa.  It grew out of the earlier Seljuk Turk style of architecture.  In addition, there were the added influences of Byzantine, Persian, and Islamic Mamluk architectural traditions due to the conquest of Constantinople in 1453.  For example, the Hagia Sophia — the flagship example of Byzantine architecture — served as an inspirational model for Ottoman architects when they were planning the creation of mosques. 

Internal and external views of the Hagia Sophia

There are several different phases in the era of Ottoman architecture: the Early/First Ottoman period (1300-1453), the Bursa Period (1299-1437), the Classical period (1437-1703), the Tulip period (1703-1757), the Baroque period (1757- 1808), the Empire period (1808-1876), and the Late period (1876-1922). 

Ottoman architects, especially after the formative period of the style, primarily used stone.  In fact, one of the hallmarks of Ottoman architecture is its exquisite masonry.  Overall, stone was the most popular material, but brick was also used for some of the arches, domes, and vaults.  The architects used lead to cover the domes and minaret caps.  Internal wall and dome coverings often consisted of polychrome glazed ceramic tiles, especially the famed Iznik tiles with their strong blues and whites.  This technique would eventually become more common than using marble.  Wood was also used as both a decorative and structural material, and it was the chief component of houses in Constantinople. 

An example of the arches and columns typical of Ottoman architecture

Example of Ottoman tile wall and ceiling coverings

Traditional wooden Ottoman house

From the very beginning of the era, one of the distinctive features of Ottoman mosques has been a large dome that dominates the majority of the prayer hall.  As the architects perfected their techniques, the classical era of Ottoman architecture dawned and the size of the central dome increased.  Often, there would even be several half-domes or smaller domes surrounding the main dome.  Generally, these domes — large and small — have a flattened, semicircular profile. 

Examples of Ottoman mosques

Another trademark Ottoman mosque feature is a particular type of minaret.  These minarets are slender and ridged, with an extended, lead-covered conical cap.  Typically, they will have one to three balconies.  Finding a minaret such as this on a mosque is a clear indication of its Ottoman origins.  The number of minarets could also indicate the mosque’s status — mosques with royal patronage could have anywhere from two to six minarets.

Here is a diagram of the different types of minarets. Notice how all the mosques above have this distinctive minaret style.

Due to the influence of imperial patronage, Ottoman architects became particularly focused on the creation of kulliye.  A kulliye is the network of buildings centered around a mosque.  These were prioritized because they combined religious, financial, funerary, culinary, cleansing and educational institutions. 

An example of a kulliye complex

Thus, it is very common to see the hallmarks of Ottoman architecture in other buildings and structures.  For instance, Ottoman influence can be seen in schools, tombs, inns, hospitals, bridges, castles, and palaces. 

The Selimiye Mosque

A panoramic view of the courtyard of the Caferağa Medresseh — a madrasa (school)

The Yeşil Türbe (a mausoleum)

The Buyuk Han — a caravanserai (roadside inn)

An external view of the Beyazid II Külliye — a darüşşifa (hospital)

A view of the internal courtyard of the Beyazid II Külliye

The Stari Most bridge

The Topkapı Palace

Rumelian Castle

Building production peaked in the 16th century, by which time Ottoman architects had perfected the seemingly impossible technique of creating large indoor spaces that still somehow supported incredibly massive domes.  They had also refined the combinations of inner and outer spaces, as well as articulated chiaroscuro, in order to reach a balanced harmony.  Ottoman architects incorporated columns, domes, squared dome plans, slim corner minarets, and vaults into the plans for their grand mosques and other structures — which they managed to bring to life at the ideal intersection of aesthetic quality and technical brilliance.  Altogether, these unique buildings combine like the Ottoman tiles to form a magnificent mosaic of life.

An aerial shot of Istanbul

If you are interested in further reading or information, check out the links below:

Boundless.com: Ottoman Empire

The Metropolitan Museum of Art: The Art of the Ottomans before 1600

TheOttomans.org: Architecture

Museum with No Frontiers – DiscoverIslamicArt.org: The Ottomans

The Architects Apprentice: A Novel by Elif Shafak

Dear Humanists: Quit whining, analyzing, and panicking — and get a PR agent instead

Ask a random joe off the street to define the humanities and they probably won’t be able to give you an answer.  Ask a college student studying the humanities to give you a definition of the humanities and they’ll probably just mumble something before shuffling off in an under-caffeinated daze. 

Although stated in a glib manner, these scenarios are not outside of the realm of possibility and are, in fact, probably closer to approaching the norm.  There is a crisis in the humanities.  And that problem has to do with public perception. But not in the ways you’d expect.

First, there is the issue of what is perceived as the fatal crisis of the humanities — student enrollment in the humanities is falling and the job market is drying up for those who graduate with a degree in the humanities.  In his guest column in The Chronicle of Higher Education, Scott Sprenger handily debunks these widespread assumptions.  Sprenger demonstrates that the claims that have humanists squawking in fear are simply not true: there has been no sudden and/or unexpected drop in enrollments or the job market (and I’ve included the link in case you would like to take a closer look).  So what is the real crisis in the humanities?

The crisis, it would seem, is one (at least in part) of public perception.  In this blog post, I hope to trace some of the existing thought on this issue and offer up my own two cents.

To begin, there is a problem with the public perception of the purposes of colleges and universities.  In the column mentioned previously, Sprenger brings up the statistic that over 90% of modern college students said that they chose their major because it would lead to a job after college.  As a member of the millennial generation, I can wholeheartedly agree that this describes the general spirit and motivations of my peers.  While this is certainly a necessary factor to consider, this new spin on a college education is potentially alienating and demeaning for students who find an existential pleasure in the humanities, as Dan Edelstein points out in his article The humanities are an existentialism.  So, in this sense, the humanities departments on college campuses are facing a crisis of identity — how will they define themselves and, therefore, their students?

In addition, Gary Gutting points out in his opinion piece The Real Humanities Crisis that the real crisis in the humanities stems from a society that devalues them.  Unfortunately, this seems like a symptom of the anti-intellectual culture.  Gutting’s proposed solution is a rearrangement of contemporary schemas of worth until universities are as important and as supported as NFL teams. 

On the other hand, James Mulholland argues in his piece It’s Time to Stop Mourning the Humanities that change is inevitable and that humanists should take advantage of the system before they become victims to it.  To quote Corny Collins from the musical Hairspray, “Velma, isn’t this where it’s all heading anyway? Now you can fight it, or you can rock out to it!”  Mulholland very bluntly states that, “We are being forced to sell out to corporate models of higher education. Let’s at least be sure to sell high.” 

Perhaps the middle ground between these two ideas comes from Vimal Patel in the piece How to Make the Case for Graduate Education.  Patel argues that lawmakers and bureaucrats would be willing to support the humanities but don’t because they don’t understand their value.  Thus, Patel proposes that humanists liaise with the current system and meet somewhere in the middle, as opposed to a complete overhaul of the system or complete capitulation to it. 

Finally, it is relevant and worth noting that the public’s consumption of information and literature has changed.  As Kelly J. Baker points out in her column A Rallying Cry for the Humanities, there is no denying the popularity of modern social media sites such as Twitter and Facebook.  Baker proposes that, instead of deepening the chasm between pop literature and academic literature, humanists should instead use the tools of pop culture to encourage an updated, reinvigorated interest in the humanities a la Neil deGrasse Tyson and the wonders of the Cosmos.

So there you have it. The real “crisis” of the humanities.  As you can see, this evidence points to a problem within the realm of the humanities and in terms of how the humanities position themselves within the grand scheme of things. 

As for my own two cents on the issue, I always consider myself an optimistic realist — I hope for the best, but I also know the score.  In that vein, I don’t know how I feel about Gutting’s plan to redefine an entire culture.  Perhaps that’s a good end goal, but to me that seems too high a mark to reach.  Sports have been popular for thousands of years and they will continue to be so for the foreseeable future, and I don’t know if the humanities can ever reach that height of fame — but, darn it, we can at least lessen the gap. 

In reference to Mulholland’s idea to manipulate the corporate college scheme, I’d say I am also hesitant to completely agree.  It certainly is refreshing to have a reaction to the changing climate of academia that isn’t just wailing and gnashing of teeth.  Mulholland is, to put it lightly, frank about the problems humanities departments face today, but the optimist in me likes to think that the situation isn’t beyond saving.

Which brings us to Patel and Baker.  I think they both have the right idea.  I don’t think the answer is to completely change the system or to completely yield to the system.  Instead, I think the key is to meet somewhere in the middle. 

We millennials seem to have forgotten the idea of compromise and the power of the vote.  Indeed, Congress has been in a partisan deadlock for about half of our lives.  All the posturing and petty fighting between parties at the very center of our government sends a very clear message to us younger citizens — that our government is slow-moving and our voices won’t be heard.  So I think Patel and Baker bring up very good points.  We as a younger, up-and-coming generation need to rediscover the spirit of possibility that seems to have been lost before we become stuck in complacency. 

Now I know that sounds as grandiose as some of the other solutions I mentioned previously, but, again, I’m an optimistic realist.  So that’s where the baby steps come in.  In this case, it begins with literal baby steps.  One of the things I feel very strongly about is that parents should read to their children.  Its positive effects are innumerable, and it can only help to create an appreciation for the humanities from a very early age.

In addition, I agree with Baker when it comes to social media.  It’s time to take to the Facebook walls and message boards of the internet.  Perhaps something as simple as a hashtag (#HumanitiesMatter, maybe) or as elaborate as an online movement.  And it’s impossible for critics to scoff at these sites anymore when they have literally facilitated revolutions in other countries.  In addition, the internet facilitates civic engagement.  We the People is a website that solely houses petitions that are up for support or response — and all you need to sign them is a computer.  Here you can vote for the creation of our very own Death Star spaceship (yes, that actually happened) or perhaps weigh in on a topic that hits a little closer to home, like national healthcare.  There are also websites that host form letters in support or opposition of a particular issue.  All you have to do is copy and paste one into an email to your representative — in this case, making a difference is as easy as Ctrl+C, Ctrl+V. 

It’s time for a wake-up call to remind the younger population that they can make a difference.  It’s time for the humanities to hire a PR agent.  And that PR agent could just be a random joe off the street. 

Writing, “Gender,” and the Difference Words Make

Henry Louis Gates certainly sails one of the flagships in the literary canon wars.  His treatise on multiculturalism — Writing, “Race,” and the Difference It Makes — discusses the historic “dead, white, male” literary canon and the struggles minorities face when it comes to making their written voices heard.  Throughout the paper, Gates outlines how difficult it was and is for authors to be taken seriously in the world of academia if they do not come from and play into the established, dominating authorship conventions.  In addition, Gates also points out a troubling conundrum facing multicultural authors.  He points out that, in order to be taken seriously by capital-A Academia, an author often has to concede to the accepted etiquette of that body of scholarship — which has its positives and its negatives. 

Gates makes his case primarily in terms of race, but his arguments could easily be transplanted into the realm of gender representation.  Of course, on the surface, there is the problematic representation of women and female authors — or lack thereof — in much of the prominent literary canon.  However, there also exists an issue at the level of the language itself. 

Any linguist will readily agree with the statement that gender exists in language.  Spanish has it.  So does German.  Gender, in this context, refers to different categories of words and their subsequent patterns of conjugation.  Thus, gender in language is just another way to label and differentiate groups of words — groups that could just as easily be called something else.  Therefore, any linguist will also readily agree that although gender exists in language, there is no correlation between “gendered” words and biological sex.

This is true and it is also not true.  While there is nothing about la mesa that makes it inherently feminine, there are words that do indicate gender in a different way.  It seems like stating the obvious, but English pronouns are gendered.  For instance, take the sentence “He gives the book to her.”  This creates a very clear picture in the reader’s mind: a man is giving a book to a woman.  Again, this may seem like stating the obvious, but it is an accepted convention of standard English.  Thus, authors who don’t subscribe to these norms have had and will have difficulties getting their words to see the light of day.

German has gender. But German also has a third gender: neuter, or neutral. 

This is an important step in the re-codifying of the canon at the level of the language itself.  For those authors in the LGBTQ community especially, it will be of great benefit.  For these authors, the current protocols of standard English don’t always support what they want to express.  For a person who is transitioning or a person who is intersex, the pronouns he, she, and the like don’t necessarily apply.  How can they accurately relate their stories and be represented in the literary canon if the language itself is built in such a way that they, on a certain level, don’t exist? 

While Sweden successfully adopted a gender-neutral pronoun in 2013 for just this reason, English still struggles to find a set of words that works.  Numerous different ideas have been batted around in the U.S. alone — from ze to yo — but none have stuck.  If language is as important as Henry Louis Gates bids us to believe, then this is the next wall that has to be broken down in order to build a more comprehensive canon.   

E.D. Hirsch – Outdated or REALLY Outdated?

E.D. Hirsch, Jr. has a lot to say about cultural literacy.

238 pages’ worth, to be exact, because that is the length of his seminal book Cultural Literacy: What Every American Needs to Know.  In this literary/sociological/economic/anthropological treatise, Hirsch quickly establishes the idea that communities are held together by the bonds of common knowledge, and thus that the current up-and-coming American generation is at a disadvantage due to its seeming lack of proficiency in this area. 

Hirsch does not just stamp his metaphorical feet, expound some damning observation, and then march off in a huff — he also jabs a very pointed, comprehensive finger at the American education system.  The American education system is, of course, a vast and unwieldy thing that could be attacked from many different angles, and Hirsch does so with glee.  In particular, one of the facets he focuses on is the underlying philosophies around which the entire system has been structured.  Hirsch looks to several eminent philosophers for a succinct summary of the ideologies behind education.  Plato postulated that children should be privy to all culture — even adult culture — because he assumed that philosophy could refine culture to its ideal.  Jean-Jacques Rousseau conversely argued that adult culture is “unnatural” to children. 

Hirsch brings up an important and newly examined complication in the areas of children’s education and literature.  This issue can be encapsulated in one question: what is children’s literature?  The answer seems obvious — works written for a younger audience’s enjoyment.  For years, that is where the majority of prevailing thought stopped.  However, recent trends in feminism and women’s and gender studies have started to question this simplistic view of children’s literature.  Scholars in these fields point out that it is not children who define what children read, but adults.  Children end up reading it, but it is written, illustrated, edited, and published by a string of full-grown people. 

And everyone seems to have an idea about what this amorphous “child” should or should not read.  How many times has a book been protested or banned in a school district because it was deemed unsuitable for children?  But who makes these decisions?  Adults.  They wield this idea of an ideal child and what might affect them, even though anthropological data indicates that children are remarkably adaptable and can process a lot more than the general populace gives them credit for (up to a point, of course). 

So how does this relate to our friend Mr. Hirsch?  His entire premise is that the American education system needs to be reformed in a pretty drastic way.  However, he says this from the position of the educator, not the educated.  I’m not inclined to disagree with Hirsch’s assertions and proposed solutions, because he makes an interesting and well-reasoned argument, but I do question the fact that they come solely from the educator’s point of view.  Hirsch seems to be subscribing to this philosophy of the “ideal” child and using it as evidence to further justify his proposition.  Thus, I would caution readers of Hirsch’s work that his assertions are not necessarily as holistic as they could be.  Where does Hirsch step back from his “ideal” child and actually ask the younger generation what they want to learn or how they want to learn it?