Thursday, October 27, 2011

The Internet Will Not Save You

I spend a lot of time and space on this blog defending the Internet as a crowd-sourced, masterful network which can hold and make available all the information we as consumers, citizens, and people would ever want to know. But if I've given the impression that the Internet is solely a revolutionary land of gumdrops and kitten videos, you'll have to forgive me. As I've said before, the Internet is merely a tool to make your life easier. Sure, it allows us to see the evolution of ideas in real time, and its spread to autocratic dictatorships has proven information to be the most important weapon a people can have. But outside of philosophers and revolutionaries, it is merely a merger between the people who brought you Three's Company and your phone company. The Internet is not your existential savior. The Internet is not your friend. It will not provide meaning to your dull, repetitive life.

In Fahrenheit 451, Ray Bradbury asks us to imagine life without meaningful literature. Characters in the novel are lulled into a catatonic state of interaction when they turn on their "walls" to join their "families", a thin approximation of televisions and sitcoms. The Internet is fulfilling this role in a unique twist Bradbury could not have seen coming. Rather than letting our mental states wallow in the Brownian motion of bad fiction, the Internet has been turned inwards. Instead of analyzing the social circles of JR and Sue Ellen, we spend hours a day poring over our own friendships and connections. Take this graph from Nielsen, those who watch the watchers:

By nearly every measurement (the exception is bandwidth usage, which Netflix dominated pre-Qwikster), social networking is the number one use of the Internet. Social networking can mean a lot of things; these studies merely track what websites users spend their time on, not what they do on those sites. So according to Nielsen, there's little difference between flirting with a classmate and planning Occupy Maine. But what does it say about the Internet, the world's largest free library, that most of its users are too busy ogling their ex's photo albums to take advantage of the historically significant liberation of information?

Arguments about the eventual downside of artificial intelligence seem to focus on computers gaining independence from mankind. But even the highest forms of artificial intelligence need input from a human and output to another human to be of any real use. All computers are, essentially, telegraphs, receiving the message of one user and transmitting it to another. So it makes perfect sense that the Internet, the closest we have come to a "world brain", would be spent on the same banalities we use our phones for (and the merger of those two worlds was inevitable).

Facebook and, to an even greater extent, Twitter have earned quite the reputation in the Third World for allowing the easy and free spread of information. First recognized in 2009, Twitter's use as an activism tool gained a foothold during the Iranian protests against that country's most-certainly fraudulent elections. The Arab Spring of this year has also seen social networks put to use when organizing protests. Twitter was seen to have played such a large role that co-founder Biz Stone was in the running for this year's Nobel Peace Prize (the real prize will come if he ever finds a way to make money off of Twitter).

That said, the only revolutions most Facebook users are igniting are fake campaigns to end child abuse. This fairly recent phenomenon, known as "slacktivism", centers on pointless online efforts to "contribute" to social or political campaigns. No, changing your Twitter profile to a green background did not help Iranian protesters. However, I'm fairly certain most participants are aware of this. Signing an online petition (which, some people need to be told, is completely and utterly useless) is less a solid statement of activism than it is a sign of solidarity like, say, wearing a black armband to class to protest the Vietnam War.

Internet action, as I've said before, is not the same as real action, and it would appear most members of my generation know this. However, becoming too reliant on online tactics can make us forget which tactics really work. The Far Left has been trying to make economic justice an issue for years, and while their online activities have been numerous, the discussion didn't change until people began protesting in a real and noticeable manner. The Internet is great for the spread of information, but if that mindset closes when you leave the office chair, then it can all be for nothing.

Wednesday, October 26, 2011

Class

In 1984, my favorite love story, George Orwell's hyper-secure state of permanent warfare and delineated economic class is anchored at the bottom by the "proles" (short for proletariat), an underclass deemed too stupid to worry about the evil machinations of "the Party." In this dystopia, the slogan "proles and animals are free" dictates the government's attitude towards the poor; they, like animals, lack the decision-making skills to abuse their freedom in any meaningful way and therefore are not a worry of the Party. The novel's hero, Winston Smith, calls the proles the best hope for freedom: "Until they become conscious they will never rebel, and until after they have rebelled they cannot become conscious."

The idea of a clueless but easily influenced working class is not a new one. Like most of 1984, it is based on the ideology of 19th-century conflict theorists like Karl Marx and Max Weber. Decrying the liberalism of the bourgeoisie (the middle class), Marx stated "the proletariat alone is a really revolutionary class." Fear of the working class can be seen even earlier, amongst American intellectuals in the wake of the 1776 revolution. John Adams and other founders were concerned about "mob rule" in a democracy (hence Senators were chosen by state legislatures, not elected, until the passage of the 17th Amendment), leading to a property requirement for voting. The reason for this is largely the view of the poor as uneducated and easily influenced. If the poor can be guided towards a worldview that congratulates their hard work and romanticizes their poverty, they can lead movements of massive social change, for better or for worse.

The last 40 years have done quite a bit to change this dynamic in the Western world. College, after the 1950's, was no longer the elite club of intellectuals and rich offspring. In a post-Belushi era, college is a rite of passage in the United States. And as long as jobs and credit were flowing freely, this system could sustain itself.

However, as banks deny more and more loans and jobs remain stagnant, the "Lost Generation" of today is far from undereducated. They are, in fact, anchored by the debt they were told to take on in return for an education. The tools they were promised would let them win in this system have failed to prove effective, but those same tools allow us to understand why we're poor, why our choices are so limited. The proles are no longer rural simpletons; they are a people raised on middle-class dreams, educated by the best universities they were willing to go into debt for, and thrown into a reality which has no need or room for the education they were given.

The phrase "we are the 99%" reveals two things about Occupy Wall Street as a class movement. First is its effort to identify not just as a majority a la Nixon's "Silent Majority", but as a movement which means to represent nearly everyone. It is an "us-versus-them" game, and unless you own a bank (or two), they are on your side. Second, OWS is chiefly a movement about class consciousness. The power corporations have over the middle and lower classes is astonishingly staunch and far older than Citizens United or the 2008 collapse. By breaking away from these outdated models of upper, middle, and lower class (to say nothing of lower-middle and upper-middle class), OWS is seeking to change the way we view class structures. It says to the middle class, "you are merely luckier than I am. We face the same forces in a world designed to benefit the ultra-rich."

In the context of how we define class and the American Dream, this is an astonishing message to build a movement around. Protest movements tend to need individual targets, like Mubarak or Obama or Lyndon Johnson. But to focus less on individuals and more on the very method by which we diagnose our economy, the very looking glass we peer into, makes the popularity of OWS remarkable. OWS is expressing the worldview of most Americans, that our society is heavily bent towards old money and the well-connected. It is a populist message, and public perception does not equal reality, but a new generation of economists and historians is being raised and educated in this social climate.

It is possible, perhaps even likely, that social commentators are slapping a sticker on OWS too soon. But even if Zuccotti Park is emptied tomorrow, the measure of OWS' success will be the public debate. Sadly, the public debate is largely controlled by the attention-deficit media. Admittedly, it's hard to focus on the drum circles and smell of hemp when dictators are being thrown into the frozen food section. But the effect OWS has had, putting income inequality and financial justice on front pages around the world, was precisely its aim. They do not have specific legislative hopes or demand the resignation of any official (at least not collectively). This was a movement by the proles to educate the proles. OWS recognizes the limited effect they will have in the halls of Congress or on the trading floor. The real victory is educating the rest of us, the 99%, about our place.

Friday, October 21, 2011

Television

I just finished reading a 1994 Washington Post feature entitled "Group Portrait With Television", an anthropological review of a suburban family and their television habits. It's odd to remember that television, a dull stream of American consciousness compared to the netherworld of the Internet, was once the target of PTA meetings and parental organizations. The article neatly details the exact schedules of the family's viewing habits, down to the size of the consoles (19 inches?! Slow down, America) and the rooms in which they watch, seemingly for shock value. How could a family be okay with watching Maury Povich every day? The father of the Delmars, the family which serves as the subject of the portrait, has an interesting take on it:
"I just don't buy it that too much TV is bad for you," says Steve, 37, the chief financial officer of a company that makes automated telephone answering systems, who gets home from work around 7, eats dinner while watching Dan Rather and Connie Chung, settles down in the den by the 19-inch Sony, watches a few hours of sports, goes back to the bedroom, turns on the Hitachi and falls asleep with it on because Bonnie can't fall asleep if it's off. "Nobody wants to admit they watch television -- it's got the connotation: 'the boob tube' -- but all these people, what are they doing? I'm not sure if they have any more intellect. It's not like they're all going to the Smithsonian or anything."
America before OJ. America before Monica. America before Osama.

Indeed, television was subject to more scrutiny by parental and medical organizations than any media source before it. As the Nielsen rating system revealed Americans to watch 7.5 hours of TV a day (the number is closer to 6.5 currently) and specialized channels (CNN, ESPN, Nickelodeon, The History Channel, The Food Network) began to dominate more bandwidth, television was king of the 1990's. Bill Clinton, in his 1996 campaign, made a point of supporting V-chips, a now archaic technology which attempted to allow parents to block certain kinds of programming. Parents in the article are horrified when a child health specialist holds a seminar at the local elementary school to warn of the dangers of too much television:
She turns on the TV and shows a videotape, in which the announcer says that "in a typical television season, the average adolescent views 14,000 instances of sexual contact or sexual innuendo in TV ads and programs."

She turns on an opaque projector and shows a chart that says: "Most children will see 8,000 murders and 10,000 acts of violence before they finish elementary school."

"They won't do any other thing, other than eat or sleep, that many times," she says. "That's what we're teaching them. It's okay to kill 8,000 people. It's okay to hurt or maim 10,000 people. It's okay. TV does it, so it's okay."
While it's true that we have the advantage of hindsight, these assumptions about TV were misguided even then. In a post-Columbine world, we now know that media does not turn otherwise healthy children into killers, be it music, TV, or video games. But it was wrong to assume it was harmful to begin with. Charles Manson was inspired by The White Album, which isn't even the most violent Beatles album. Killers are inspired by their own psychosis, not what they saw on the evening news growing up. Plus, children are surprisingly capable of separating reality and fiction, as well as separating good and bad within fictional narratives. Not to mention the facts and figures given here are pathetically inconsequential when compared to the trove of data the Internet provides to children. The fact that somewhere in the world a child's first cultural memory is watching the bloodied body of Muammar Gaddafi be paraded around on the hood of a truck on YouTube is not, admittedly, comforting. But the medium is certainly not to blame. You might as well blame weather patterns for thunder scaring your children.

In the interest of full disclosure, my childhood was painted with the cold glow of a 13-inch Sanyo which followed me from house to house. My mother was skeptical of television, barring me from watching The Simpsons and, later on, South Park, but she never shied away from letting it hold my attention through one of her late-night shifts as an ER nurse and, later on, her bed-ridden depressive spells. One of my great childhood memories is of sneaking out of bed when my dad would come home late at night, watching Conan O'Brien ward off lizards a guest animal trainer had brought on or flirt with Heather Locklear. Later, my sister became an MTV obsessive, watching TRL during its Carson Daly heyday. As we entered adolescence, she and I became obsessive fans of Friends (though things got a bit silly after Chandler and Monica married) and Mad About You (no comment). As she entered high school and adapted to something called a "social life", I attached myself to BattleBots, literally the perfect show for twelve-year-old boys. BattleBots, some may recall, came on just before South Park. As my mother's mental issues became more severe, her attention to my television habits became more distant, so I was able to sneak in a few episodes after my nightly robot-fightin' time. This grew into watching what South Park led into, The Daily Show, which has been my favorite cable show ever since (in the words of David Rakoff, "I would drink Jon Stewart's bath water").

So where's television now? I am certainly not the only young person eschewing a cable box for a high-speed modem. Television still exists in the form of streaming services, such as Netflix, Hulu, and less-than-legal options, but sitting down in front of the television for hours at a time is an old habit that is dying out. Indeed, those who still do watch television usually do so with a laptop or some other device wiring them to the Internet; the television is background noise while the Internet requires your attention.

The Internet used to be plagued by PTA horror stories of online predators, hackers out to steal your identity, and more recently, cyberbullying. Compared to television, though, the Internet has largely outlived these negative connotations. The viewing habits of the Delmar family are quaint compared to an era of dinner table YouTube sessions. So why has the Internet more solidly secured itself as a safe, recreational medium?

Perhaps it's the practical use of the Internet. It's less comparable to a TV than it is to a phone, putting us in instant contact with friends, family, and colleagues. We can now pay bills, get an insurance quote, research local mechanics, and wish Grandma a happy 80th birthday all from the same browser window. Whereas TV is a passive activity, the Internet is also a tool. The more legitimate uses have begun to outweigh the frivolous.

With each new technology comes fears of its overuse. This is merely a natural reaction by a society to major change of any kind. But when new technologies prove themselves more than something to stare at, these fears are washed away in the face of ease and convenience.

Is TV doomed? I'm not one for predictions. If sports and news let the Internet overrun television's last stronghold, live programming, then what purpose will television serve? TiVos tether you to the couch, and game consoles could turn television sets into computer monitors. Much like newspapers, if television wants to survive, it will need to work at providing its products online in a comfortable, easy manner. Hulu certainly has mastered this game, obtaining streaming rights to hit shows from three of the four major networks. Hulu's ad system is flawless, even allowing you to switch from a commercial irrelevant to your lifestyle to one more fitting. Even their premium service, Hulu+, almost seems worth it (almost). Hulu is perhaps the best example of an industry responding to technological change (other than the music labels' success at converting their industry from CD singles to Now That's What I Call Music). Mergers of television and the Internet which have come from the opposite direction have had mixed results; Google TV is largely a flop while Apple TV may be due for a post-Jobs reboot.

Thirty years after the debut of MTV, the world still stands. The parental hysteria of the 90's, laughed away by Nielsen families like the Delmars, was mostly a first-world problem of a country at its most first-worldly. While the Internet does pose some actual dangers to children and parents alike, one can take solace in the story of television's rise from an appliance to a dangerous mind-melting devil, then back to an appliance.

Monday, October 17, 2011

1979 Wall Street Occupation


A certainly more playful sit-in in 1979:
This is a short clip from the film "Early Warnings" that details the sit-in that happened on Wall Street on the 50th Anniversary of the 1929 Stock Market Crash. The protesters were demanding an end to financial support for the nuclear industry and the action was part of the larger occupations at the Seabrook Nuclear Power Plant. The costumed figures on stilts are from the Bread and Puppet Theatre. The film is from Green Mountain Post Films.
The occupations at the nuclear power plant often numbered up to 4,000 people. The protests began in April of 1979, one month after the infamous Three Mile Island accident (Editor's Note: I see Three Mile Island in person literally every day).

As Occupy Wall Street quickly comes to be considered the movement of our generation, it should be noted that sit-ins are a rather old idea. It's hard to deny their effect on the public debate, which should be the goal of any good protest. I just wonder what happens to the movement when, inevitably, people begin to leave Zuccotti Park.

This, after all, is the central problem OWS faces. It's a movement that has become about politics and, therefore, political solutions. But the troubles our generation face are not necessarily political; ours is an existential crisis.

If you haven't read Noreen Malone's terrific cover story in New York, "The Kids Are Actually Sort of Alright", I strongly suggest you take the time to do so. An excerpt:
And so we find ourselves living among the scattered ashes and spilled red wine and broken glass from a party we watched in our pajamas, peering down the stairs at the grown-ups. This is not a morning after we are prepared for, to judge by the composite sketch sociologists have drawn of us. (Generation-naming is an inexact science, but generally we’re talking here about the first half of the Millennials, the terrible New Agey label we were saddled with in the eighties.) Clare has us pegged pretty well: We are self-centered and convinced of our specialness and unaccustomed to being denied. “I am sad, jaded, disillusioned, frustrated, and worried,” said one girl I talked to who feels “stuck” in a finance job she took as a stepping-stone to more-fulfilling work she now cannot find. Ours isn’t a generation that will give you just one adjective to describe our hurt.
Our generation is the first generation to expect to be worse off financially than our parents (those of us who grew up poor will speak for ourselves, thank you very much). If the Lost Generation of WWI was dominated by alienation, this "Lost Generation" is dominated by pessimism. That narrative becomes harder to believe, though, when you see the optimism in OWS. I suppose it's possible the protesters are merely the unemployed amongst us with nothing better to do for a month, but NPR's Planet Money blog points out that the nature of a live-in requires cooperation and equality:
We went to the big nightly meeting, which lasts for hours. Everybody has something to say. Along the lines of: Should we buy some sleeping bags? Why does that guy get to run the meeting? What if we just buy fabric and make our own sleeping bags? This kind of back and forth, people told us, is the whole point of Occupy Wall Street. It's not a movement; it's a venue. Standing around, talking about what everybody wants — this is a model of how the protesters want society to be.
Now obviously society as a whole cannot run on mutual participation and the supplies of the charitable for very long. But to believe it can, to actually put it into practice and have it work, represents an astonishingly sturdy belief in the inherent good in people. Perhaps one could say this just proves the naivete of a generation that grew up with Participation Awards and gold stars, but this kind of commune also requires equal work. There will always be those who take advantage of the system, but that's true of the societal system we have now. I'm not saying the thin structure of decision-making developed by OWS is a practical model for governments and economies, but it's hard to deny its inspiring nature.

Thursday, October 13, 2011

Books

When journalists begin to wax poetic about the death of newspapers, I'm usually amongst those who shrug. Newspapers are clunky and rarely as informative as a good RSS feed (or even the Drudge Report). TV stations have successfully moved local news to the web. Local newspapers, however, have faded into the background. They want us to pay to read about news that happened yesterday? How cute.

Online news, even the dreaded aggregators like Matt Drudge and the algorithms of Google, has surpassed even the most widespread and respected of newspapers. Papers have been replaced by an odd collection of page-based algorithms, hand-edited websites, and social media. Many national papers, such as The New York Times or The Wall Street Journal, have successfully become staples of any online diet, but many local newspapers are feeling the pinch. The major newspaper in the Harrisburg area, The Patriot-News, actually managed to become two separate entities without imploding its own customer base, dividing itself between its physical newspaper and Pennlive, a local news aggregator. When floods from Tropical Storm Lee hit the area a month ago, and the surging Susquehanna River was predicted to be mere feet from my front door, I certainly wasn't waiting for a messenger bike to drop off copies of the newspaper. I gave Pennlive a permanent tab on my browser, shooting over every few minutes to see if citywide evacuations were ordered for my neighborhood and whether roads were in a condition to let me go to work. I temporarily opened my WiFi to my neighbors, some of whom lacked cable Internet themselves, so they could likewise stay abreast of weather reports.

The purpose of newspapers, to provide local, relevant news to the masses at a bargain, makes them especially ripe victims of obsolescence. The Internet can do local, it can do relevant, and news sites are largely free. Most important of all, the Internet is fast. Way faster than newspapers could ever hope to be.

Books, however, are a different story. Books are perhaps the only medium other than the Internet to fulfill nearly every informational need we may have. When you read a book, you likely aren't looking for late-breaking news. The purpose of books is to be intensive, to contain information in a permanent, indexable manner. They remain the number-one source of written fiction, and, although most of what you do on the Internet is reading, a "reader" is indubitably a person who reads books. Some websites, like Project Syndicate and Longform, are attempting to bring the art of feature writing to the online masses, but this is perhaps a losing battle. The Internet is too full of distractions to really grab attention with five pages' worth of words. Also, the medium (i.e., the computer screen) is physically hard to focus on for the time necessary to read anything of substantial length (the Kindle screen achieves the paper look the same way a digital watch does: black pixels on a gray screen with no backlight). Ironically, the screens of most eReaders have a similar contrast ratio to a newspaper.

While I started this off by making fun of latter-age journalists for being sentimental about newspapers, books deserve such sentimentality. For one, books go in order; there's no flapping around pages the size of a large sweater to get where you need to be like newspapers require. Second, books hold a presence with their physical attributes. Whatever the old adage says, even the most elitist of the library crowd makes judgments based on the width of a book's spine. When I read a book, I find myself thumbing the pages I've gone through with my left hand, marveling at the level of emotion and narrative crammed onto such a primitive stack of tree pulp. I once read an introduction to Anna Karenina which spoke to the power a lengthy book can have over the narrative of your life. A book that takes you weeks or even months to read can shade the memory of that period in your personal biography.

As previously mentioned, books are the main source of written fiction. Stories have a definitive beginning and end, in the same way a book has a front and back. But the Internet? The Internet is almost nonexistent as a physical entity. It exists to most people as a cloudy concept that you'll never hold in your hand. The power of holding such narratives as The Road or Freedom or The Lord of the Rings in a space smaller than a breadbox is phenomenal to feel. Books are subject to none of the problems of modern technology (my bookshelf never has an outage) and enjoy most of its benefits (or at least the benefits that would be useful to books). And it is for this reason that, even though I consider myself a forward-thinking Internet addict, I may never, ever, ever own a Kindle. Kindles take something I already love and enjoy and reduce it to the same frivolity most people give a blog post (oops), not to mention make it more expensive. And sure, the Kindle can hold 10,000 books in a unit, but books will never be as adaptable to this format as albums were to the iPod. Songs can be naturally shuffled the same way we hear them on the radio (the iPod only changed music in that it made every song a single). But could you imagine a random button on the Kindle? Books are the opposite of superfluous and trivial. They demand extended attention, a dwindling commodity. There can be no more doubt that the Internet is fundamentally altering the way we think. And I worry newspapers were the first victim, with books on deck.

Tuesday, October 11, 2011

Concept Lesson: Memes

Before you think this is just a monologue about lolcats and mudkipz, you should know "meme" is a rather old term. Richard Dawkins, in his highly influential 1976 book The Selfish Gene, coined the term to describe cultural ideas that live and die by a critical process of selection (this itself is attached to a much larger idea of collective consciousness, which is roughly 150 years old). In short: ideas must evolve to stay relevant. However, the study of memes should be limited to the ideas, not objects or imagery. As the author James Gleick wrote, "The hula hoop is not a meme....the hula hoop is a meme vehicle." The plastic circle is not what adapted for survival when the fad took over pop culture in the 1950's. The idea of hula-hooping, of swinging your hips to have the hoop swing around you, is the meme. For instance, as hula hoops fall out of favor with children as toys, the act of hula-hooping is being revived as a fitness tool. In the post-Jazzercise, post-Tae Bo world of Zumba, hula-hooping found its new habitat much like a species that learns to burrow underground to avoid predators.

Fads are not the only memes, however. Music is full of memes: windmill-strumming a guitar like Pete Townshend was a meme that lived on into the hair-metal heyday of the 1980's but died out when a more subdued style of guitar playing reigned supreme. Indeed, nearly any genre of music could be described with memetics.

The way 99% of Western art depicts Jesus is a meme. In 2002, Popular Mechanics revealed what Jesus Christ of Nazareth probably looked like, and it was far from the sharp, long-haired, white look most of us are familiar with. If we are to believe the canonical Gospels (which contain no physical description of Jesus) are what most early Christians based their image of Jesus on, then the Jesus we see in paintings and on crucifixes is a complete incarnation of memetics, going through thousands of years of critical selection to come to the most agreeable version, i.e. the Jesus most suited for the environment of today. Christianity itself is, like all religions, a meme. As societies have grown more tolerant and socially liberal, religions tend to do likewise or face becoming irrelevant. That is the magic of ideas being restricted by the same rules as organisms.

There's an old adage amongst biochemists: a hen is just an egg's way of making more eggs. This is a simplification of the idea that genes exist independent of their owner. Your genes know about as much about your day-to-day life as you know about the day-to-day life of the Milky Way galaxy. As Dawkins put it, "no matter how much knowledge and wisdom you acquire during your life, not one jot will be passed on to your children by genetic means. Each new generation starts from scratch." You are not the primary concern of your genes; replicating themselves is their primary mission. Evolutionary psychologists will tell you everything you do, whether consciously or not, is an effort to not only reproduce but sustain a habitat that will also allow your children to reproduce. This means creating and sustaining a way of life that will allow your genes to be passed along as far down the evolutionary chain as possible. While this makes free will seem more like a fluke than an attribute, it means you serve your genes for their own self-replicating purpose.

Memes are ideas that follow these same guidelines. In The Information, Gleick gives the excellent example of "jumping the shark." Representing a definitive notion ("the point in the evolution of a television show when it begins a decline in quality that is beyond recovery"), the phrase evolved to describe any serial production (novels, films, comics) and even to describe similar cultural phenomena ("jumping the couch", "nuking the fridge"). It encapsulates everything a meme is: a self-replicating idea that evolves to further its own staying power.
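
To make that selection process concrete, here is a toy sketch in Python. It is entirely my own illustration, not anything from Dawkins or Gleick: phrases replicate imperfectly, and a crude "catchiness" score stands in for the cultural habitat that decides which variants survive. The word list, mutation rate, and scoring are all invented for the example.

```python
import random

# Toy model of memetic selection: ideas replicate with occasional variation,
# and a made-up "catchiness" score plays the role of the cultural habitat
# that decides which copies survive into the next generation.

CATCHY_WORDS = ["shark", "cat", "dance", "fail"]

def catchiness(phrase):
    """Fitness: how well a phrase suits its (hypothetical) cultural habitat."""
    return sum(word in CATCHY_WORDS for word in phrase.split())

def mutate(phrase):
    """Replication is imperfect: occasionally one word gets swapped out."""
    words = phrase.split()
    if random.random() < 0.3:
        words[random.randrange(len(words))] = random.choice(CATCHY_WORDS + ["shoe", "report"])
    return " ".join(words)

def next_generation(population, size=20):
    """Variants reproduce in proportion to fitness; the rest die out."""
    weights = [catchiness(p) + 1 for p in population]  # +1 so no variant is impossible
    parents = random.choices(population, weights=weights, k=size)
    return [mutate(p) for p in parents]

population = ["jumping the shoe", "jumping the shark", "filing the report"]
for _ in range(30):
    population = next_generation(population)

# After a few dozen generations the catchier variants dominate the pool,
# the way "jumping the shark" outlived its less quotable siblings.
print(max(set(population), key=population.count))
```

Swap in a different scoring function and a different phrase wins, which is the point: a meme is judged by its habitat, not by any intrinsic merit.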

However, does this mean communication is just a meme's way of making more memes? Communication, after all, is defined by information theory as the transmitting of any information between two sources. Memes, transmitted via word of mouth for most of human history but now primarily through telecommunications, are the language of the internet. Let's look at an example:

This is advice dog.

Advice dog, like much other internet idiocy, was popularized by 4chan's /b/, a land of such depravity and ugliness it impresses only the most ardent and disobedient of 13-year-olds. The premise is simple: dog stares out from a rainbow background, uses dichotomy of top and bottom text to impart humorous wisdom. The idea evolved from a photo a user named "TEM" posted to themushroomkingdom.com, a Super Mario fansite. The dog, named Boba Fett, was then used on /b/ for much lulz or whatever /b/ calls joy in a post-FBI raid era.

The dog and rainbow background are not the subject of study. The idea of posting top-and-bottom text to an image (in a way that was actually a more evolved form of lolcats) is now quite varied, ranging from Douchebag Steve to Socially Awkward Penguin to Rasta Professor. Like any good meme, it has evolved and spread to strengthen its own life.

What the internet does for memetics is accurately track the spread and evolution of memes in the way the fossil record tracks the spread and evolution of species. It equalizes the spread of ideas (formerly the province of marketers, academics, and journalists) to the extent that a teenage joke can become a cultural phenomenon. However, this means there is now a distinction between a meme which "naturally" (through coincidence, in-jokes, and like-mindedness) comes to popularity versus a meme which is "forced". A "forced meme" is one that cannot naturally exist in the habitat it is exposed to, but persists due to the concerted efforts of those seeking popularity or ratings. Planking is largely a forced meme. It serves no purpose and lacks any humorous content, so why has it persisted? It mostly lives on Twitter and morning news shows as a distraction and "Oh, those kids and their crazy internet." However forced it may be, it still fits the role and definition of a meme. The term "meme" speaks to no popularity level or degree of authenticity, merely to the spread of an idea. After all, what's the difference if some kid on 4chan or Buzzfeed or The Today Show pushes an idea as long as it is able to self-replicate?

The distinction between a "forced meme" and a "natural meme" represents a division between those who took to the internet (and internet culture) quickly and those who see it as a distraction. As Adrian Chen of Gawker wrote:
"First, the out-of-touch CNNs and Today Shows of the world can pick up an "internet craze" to make it seem like they're hip to what all the kids are doing on their Facebook machines. So it was with planking, which started as a little-known Facebook page before it was seized on and promoted by Australian radio stations looking for web cred. The Today Show was shameless about its social media whoring, posting a picture of Hoda and Kathy Lee's horsemaning to Tumblr with the caption, "BuzzFeed Bait.""
 So aside from the obvious generation gap, forced memes have revealed something else about the internet. If the past ten years of internet history will be remembered for one thing, it will be the monetizing of information. Facebook and Google sell certain aspects of your internet activity to micromarketing companies. Memes are likewise information that can be collected and sold. As I said before, the internet tracks the flow of memes, meaning tracking how many people are familiar with a meme is a cloudy business, but one that produces large numbers. Barack Obama, during his 2008 campaign, received such devoted press because his fans were likewise devoted and would buy any magazine with his face on it (Note: I voted for Obama in 2008 and intend to do the same in 2012). Old-world entertainment like The Today Show and Late Night With Jimmy Fallon are operating on the same level when it comes to internet memes. They see a massive following, so they hope to cash in.

In this way, a "fad" or "internet craze" is merely a monetized meme. Meme aggregators, like I Can Has Cheezburger, receive millions of unique page views per month by collecting the thin film off the top of the internet broth. But who is making the things they aggregate? Usually bored kids or office workers, and it is in their minds that we find the "natural meme." When an idea is filtered by the selection process of a TV show or even a website, it can tend to be lopsided and die a quick death. In a Wired interview, ICHC founder Ben Huh says he watched about forty sites devoted to bad memes die (it's when people like this make $4 million that we really should start using the word "bubble").

Planking and horsemaning will either follow suit or be remembered more for their contrivance than their humor or worthiness. But natural memes, those which rise out of shared interests, humor, and ways of thinking, tend to extend their staying power well past those forced for marketing reasons. It's actually quite similar to species that are forced into a habitat that is not their own. As any Boy Scout can tell you, animals should only be released into their natural habitat or they will have untold effects on the environment. Memes, once again, follow the same rules as organisms: unless bred from the ingenuity of natural communication, they exist only as eyesores on the social environment.

Saturday, October 8, 2011

The Most Important Thing Steve Jobs Ever Said

Steve Jobs in 1985:

A hundred years ago, if somebody had asked Alexander Graham Bell, “What are you going to be able to do with a telephone?” he wouldn’t have been able to tell him the ways the telephone would affect the world. He didn’t know that people would use the telephone to call up and find out what movies were playing that night or to order some groceries or call a relative on the other side of the globe. But remember that first the public telegraph was  inaugurated, in 1844. It was an amazing breakthrough in communications. You could actually send messages from New York to San Francisco in an afternoon. People talked about putting a telegraph on every desk in America to improve productivity. But it wouldn’t have worked. It required that people learn this whole sequence of strange incantations, Morse code, dots and dashes, to use the telegraph. It took about 40 hours to learn. The majority of people would never learn how to use it. So, fortunately, in the 1870s, Bell filed the patents for the telephone. It performed basically the same function as the telegraph, but people already knew how to use it. Also, the neatest thing about it was that besides allowing you to communicate with just words, it allowed you to sing.
Douglas Adams, famed author of The Hitchhiker's Guide to the Galaxy, was an avid fan of the early Macintosh computers. In an essay he penned for MacUser magazine in 1989, nine years before even the iMac, Adams characterized the Apple model as "there is no problem so complicated that you can't find a very simple answer to it if you look at it the right way." That sentence is an accurate way to describe what those who were trying to break into computing in the 1980's were up against. Computers, even after the Mac II, were clunky to use, requiring a fairly advanced knowledge of scripts and programming to do even simple things, like running a word processor. Computers were, essentially, a puzzle. Much like Morse code, the learning curve of the early personal computer kept it from outgrowing the office and becoming what Jobs envisioned it would be (namely, a home appliance).

The above quote from Jobs, taken from a Playboy interview in February 1985, was in response to a question about why, exactly, American families should invest in a $3,000 television that would, for most people, perform the functions of a Speak & Spell. The quote shows that, even then, before the iMac, before the iPhone, Steve Jobs' aim was the same as that of many great innovators: bringing their most advanced and significant products to the masses. Jobs was prescient about what was to come, something boldly titled "the Internet", and he knew that 99% of the people for whom this could be useful were not going to bother to learn the complex mechanisms it takes to make it work. It doesn't take much intuition to realize how your air conditioner works, or even how a land line phone works, but a computer is an outstandingly complex device, and they grow more complex every year. The Internet, which has grown far from anything like a telephone into an amorphous, multifunctional universe all its own, is even more complex than the computer I'm using to talk to it.

What Jobs knew wasn't so much what people wanted, but what they didn't. The iMac's three-step setup, in an era when most computers came with small novellas about their inner workings, is a famous example. When most digital music players resembled TI-83s, Jobs released the iPod, a music player with four buttons. When most smart phones had plastic QWERTY keyboards and a hideous design (both inside and out), Jobs gave us the single-button iPhone, with its wide, glossy touchscreen and smooth-as-silk OS. Like Bell's relationship with the aged telegraph, Jobs allowed himself to rise above the competition, survey their losses, and build off of their mistakes. His adversaries were his own R&D department. This model of domination and elitism led him to be the most productive futurist in history.

Concept Lesson: Generations

(Note: Due to my last post being a bit scattered, this is the first in a series of posts in which I'll attempt to define various terms and concepts crucial to my basic thesis)

In genealogical terms, a generation is the time between the birth of a parent and the birth of a child; my parents were in their early thirties when I was born, so the generational distance between us is roughly thirty years. A cultural generation is a group of people born in the same time period and subject to the same cultural movements, fads, and historiography. It's the latter we'll concern ourselves with.

The study of cultural generations is a bit sloppy. Definitions have varied through time and most of it relates to sociology, a field roughly 150 years old (note: few good fields of study are older than 200 years). Auguste Comte (1798-1857) was the first to make an effort to understand the impact being a member of a generation has on an individual, seeing it as similar to citizenship, nationality, or race. An influence on Karl Marx and other conflict theorists of the 1800's, Comte was an odd combination of Enlightenment-era equality and 17th-century Utopian philosophy. He saw humanity at the center of a three-stage developmental process. Before the Enlightenment, man existed in the first stage, the Theological. In this stage, humanity is subject to "god." Not god in a literal sense, but more subject to the idea of god and religion, believing power and morality come from deities and the churches alone. The second stage, coming after the Enlightenment (and especially the Revolution Comte was born into in his native France), is known as the Metaphysical. During the metaphysical stage, humanity is coming to the realization that the individual is the most important aspect of society; man is capable of ruling over himself. Comte believed this process was then (1830's-40's) underway in a political sense with the widespread secularization of Europe and the revolutionary wave of 1848 (this wave is probably the most underrated period of history, leading as it did to the end of serfdom and the rise of conflict theory, which would in turn lead to World War I, the Russian Revolution, and all those entail). The final stage, known as Positivism, is the basing of society on all areas of science and the scientific method as a whole. Decisions of states and individuals would be based on rational thought and proof-based belief systems (one can assume this stage is yet to come). Comte believed that such a focus on science would inevitably lead to the scientific method being targeted towards human interaction, now known as sociology.

The hidden point of Comte's Three Stage Theory is how these waves of change take place. While every student can tell you why we study history ("doomed to repeat" and so forth), Comte believed this was central to our understanding of humanity; each generation must study its predecessors to improve upon their mistakes and create a more positive world. Now, Comte's Three Stage Theory has been removed from most sociological texts (it's rife with logical circles and skips over states that went backwards through these phases, such as Rome), but its lasting legacy is the belief in the power of generations.

Comte's writing, along with youth-based political movements in Italy, Germany, and Ireland, encouraged the study of generations as groups and agents of change. Note that in the American Revolution and the French Revolution, most of the leaders were well into middle age, and the young (twenty-somethings) rarely played roles larger than those of infantryman or rioter and were rarely considered a "group". Not so much in the aforementioned mid-19th-century movements or even the Arab Spring of today. It is no mistake sociology came about around the same time the idea of evolution became popularized. If man "progressed" in a biological sense, it made sense his institutions and societies would be subject to the same rules. This became known as sociocultural evolution. Generations play an immensely large role in the sense of both terms, though cultural change is often a bit faster than biological change, occurring in a generation or two.

In 1857, the year of Auguste Comte's death, philosopher and early sociologist Herbert Spencer published an essay entitled "Progress: Its Law and Cause". Spencer, who coined the term "survival of the fittest", believed all things in the world (biological, physical, sociological) progressed from homogeneous varieties into heterogeneous varieties. Certainly makes sense. New elements are formed through the burning of hydrogen (atomic number one) in the centers of stars, leading to a variety of elements (117 thus far). Most biologists would agree that all complex life comes from single-celled ancestors. The idea of "complexity" in social structures is a bit harder to define, but not entirely out of the realm of layman understanding. In 1992, Francis Fukuyama published "The End of History", using the fall of the Soviet bloc as evidence that liberal, American-style democracy was the last evolutionary form of government. Here (according to Fukuyama), it is not because the descendant is more "complex", but merely the most "fit" to survive. In this way, social and government institutions are similar to "memes", ideas which are subject to the same evolutionary rules of adaptation as bacteria, bears, and blue whales. One can see why philosopher Daniel Dennett called natural selection "the single best idea anyone ever had."

But in order for evolution to work, new generations (of people or germs or ideas) must adapt to surroundings and institute change. While this is done biologically through DNA and mutations, humanity must use its own playbook. History is the DNA and revolutions, cultural or political, are the mutations which can lead to beneficial (or hazardous) adaptations.

From Spencer's ideas and the widely publicized theories of Charles Darwin came a more focused look at what it means to be a member of a generation. As the 19th century marched toward modernity, individuals became less concerned with identifying themselves via clan, family, or nation. Young men began to feel less attached to their fathers as market economies provided more opportunities. Young women found far more avenues out of the trap of domesticity, be it through collegiate education or hard labor, putting more distance between them and the Old World method of identification via the family. Widespread free public education would also lead to young people identifying with those of the same age. By the turn of the century, print media, radio, and telegraphs had encouraged those of a similar age (and therefore sharing a similar social stratification) to identify with one another.

(Note: The above paragraph is a summary of a very complex and storied school of thought branching from collective consciousness theorists. For more, check out the work of Emile Durkheim).

However, it wouldn't be until World War I that massive cultural events would begin to shape the measurement of a generation. The "Generation of 1914", known in the States by the name given to them by Gertrude Stein, "The Lost Generation", was the first generation of youth to be subject to much empirical and cultural study. Fueled partly by the advent of mass media but mostly by the widespread affliction "the Great War" caused, World War I was seen as an event that impacted the generation coming of age during the war, and not just those who fought or died. The works of Hemingway, Fitzgerald, and T.S. Eliot created the shared culture of this Lost Generation, conveying, in varying degrees, the sorrow of loss, the disappointment in leadership, and the shared imagery and schema of living in wartime. During the buildup to war, French writer Henri Massis would describe his peers as "a sacrificed generation." Robert Wohl, in his seminal 1979 work, The Generation of 1914 (in which he coined the term "generationalist" for those who study history through generations), states "historical generations are not born; they are made." T.S. Eliot had anticipated Wohl's sentiment immediately after the war: "History has many cunning passages/ contrived corridors/ and issues, deceives with whispering ambitions." The overwhelming theme of the Lost Generation, one that would be echoed by nearly all generations thereafter, was the sudden force with which history seemed to be spreading its power over a generation, leading to an enforced notion of alienation.

From there forward, the history of the 20th century can easily be measured in cultural generations: The Jazz Age, The Greatest Generation, Baby Boomers, Generation X, and Generation Y. But does each of these generations fit the Comte model of a cleansing, revolutionary generation? Of course not. In fact, the world could best be described as an amalgam of generational ideals, with the details and aspects most fit for survival sticking it out to the present.

This is the main point of generational studies (as I see it). While culture is a mirror to society, society is a mirror to the events which shape history, such as economic woes, wars, plagues, colonialism, and other widespread struggles. These events are the catalyst for a generation's ideals, which in turn form that generation's culture. This is why culture, and pop culture specifically, is so ripe for study when attempting to understand a generation, both its motives and circumstances.