Monday, December 12, 2011

Radio

When I first selected the postmodern miracle that is "Shuffle All" on my first (and last) iPod, I had the same reaction many people had: It's like my own personal radio station! Lewis Black described this feeling as akin to that of "the man who first created fire," contrasting it with the complex mechanism required to switch between tracks on a record player. In 2002, the year the iPod and digital music as a whole really took off, SPIN Magazine dubbed "you" the DJ of the Year, four years before TIME magazine made that sort of thing cool.

But it is the connection to radio that I find most interesting. Music radio plays an odd part in most people's pop culture repertoire. It continues to be the main source for learning what music is popular (having survived the relatively brief span of time MTV dominated that role), and yet most people seem to encounter it by chance, in the car or at the gym. Unlike broadcast television (even if TV is dwindling in that capacity), radio is rarely something people make a conscious decision to turn on, other than to have something in the background. According to a 2009 study by the "Council For Research Excellence" (which, despite its innocuous name, is totally not a CIA front company), broadcast radio accounts for 79% of all audio media we encounter throughout our day (that includes both terrestrial and satellite radio). Here's the breakdown:
Exposure to audio listening falls into four tiers in terms of level of usage among listeners: (1) broadcast & satellite radio (79.1% daily reach; 122 minutes daily use among users); (2) CDs and tapes (37.1% daily reach; 72 minutes); (3) portable audio [iPods/MP3 players] (11.6% daily reach; 69 minutes), digital audio stored on a computer such as music files downloaded or transferred to and played on a computer (10.4% daily reach; 65 minutes average use), and digital audio streamed on a computer (9.3% daily reach; 67 minutes); and (4) audio on mobile phones (<2% daily reach; 9 minutes).
The number for mobile phones is the only one I find suspicious, as everyone knows any phone that can play music has a max battery life of 8 minutes.

So nearly 10 years after digital media players became commonplace, and several years after USB ports became standard equipment in most vehicles, broadcast radio still dominates the listening spectrum. If you are not astonished by this, consider the lifespan of radio; it is second only to the land-line telephone as the oldest telecommunications standard still in use. In fact, the only thing as shocking as radio's place on top of this study is the silver medal going to CDs--and even tapes--which have been around for mere decades compared to radio's century.

The study notes that only 10% of people listening to the radio were doing nothing else but listening to the radio--most were busy with work or some other activity (44% of radio listening is done while commuting). This is important when discussing the role of radio. I've recently become enamored with the website 8tracks.com, which allows users to upload a playlist and lets others, even nonmembers, listen in. Of the top 5 tags for user-created playlists, 3 describe an activity the playlist is made for rather than the music itself: "sleep", "sex", and "study", in that order. MP3 players, likewise, are usually found strapped to the arms of early-morning joggers or in the cup-holders of early-morning commuters.

Broadcast radio is uniquely fit to play the role of background music: it's constant, free, and everywhere. In most radio markets, every station runs 24 hours. And despite the complete and utter scam that is HD radio, radio remains free and radio units themselves remain extremely cheap. Because of these two factors, radio is unavoidable. It's the most cost-effective and reliable method of tuning out the real world.

As it turns out, being the soundtrack to our own mundane thoughts and habits is good business. NPR, whose continued existence in the face of the 24-hour infotainment onslaught is itself astonishing, has shown steady growth over the same decade in which people turned more and more to cable TV and online sources for news. According to a study published just this week by Arbitron, an international media analysis firm, weekly radio listenership increased by 1.4 million over the course of 2011. Again, we're talking about a century-old technology with absolutely no screen whatsoever growing against the same odds and technological waves currently crushing print media. Not bad for a box the size of a textbook shoved into my dashboard.

In an earlier post, I discussed the dying nature of television. While TV is quite passive, it does require the use of the eyes, which can be distracting when you're looking for something to fade in and out of during work, working out, or driving. Unlike TV, radio transitions fairly naturally to portability. And as any music nerd can tell you, even a fully-loaded iPod can become fairly predictable. Radio can bring you traffic and weather, brand new music of nearly any genre, the oddball personalities of disc jockeys, and the sense that a million or more people could be listening to the same thing at the same moment you are (I believe sociologists call this "community"). So while TV and magazines may die out to the Internet's persistent evolution, radio continues to be your oldest co-worker, turning down the chance to quit and still outpacing the new hires.

Saturday, December 10, 2011

Citizens

The story of social progress in the United States tends to be the upward slope of who is defined as a "citizen." First, of course, it was white male landowners, followed by merely white males. The Civil War ended in 1865, and the 15th Amendment, ratified in 1870, enfranchised all men, regardless of race or place of origin. Of course, enforcement of the 15th Amendment was paltry after less than a decade, but during that decade black men were elected to offices both national and local (Barack Obama was only the fifth black Senator, but one of three black Senators from Illinois). What followed has been called the racial "Nadir," a time when race relations were cold and segregation, de facto or otherwise, became the norm. Literacy tests and poll taxes were specifically constructed to keep blacks and immigrants from voting (not to mention some extralegal methods of scaring people away from polls). The turn of the 20th century saw many states allowing women to vote, with a national campaign resulting in the 19th Amendment. So, today, if you are a citizen of the United States and registered, you can vote, free of charge. It's as close to universal suffrage as a modern society has ever been.

Of course, there are an estimated 17 million undocumented immigrants who, presumably, cannot and do not vote (though many obtain a Social Security card and driver's license, giving them the materials to register). Then there are the several million people below the age of 18 who work and pay taxes without representation, followed closely by the 5.3 million Americans who have been convicted of a felony and are therefore banned from voting. And all of this relies on a Schoolhouse Rock image of government and elections. The 2000 presidential election introduced an entire generation to the complex machinations of the electoral college, and to the fact that three presidents in our history (7% of them) have won the electoral college while losing the popular vote. The electoral college is the lasting memorial to the fear of the masses found within the writings and opinions of that holy group, the Founding Fathers. Any AP Government student can tell you senators used to be chosen by state legislatures, not by direct elections, because the Senate is meant to be a collection of wise-folk to foil the raucous rabble of the directly-elected House. We do not elect our presidents by popular vote because we are not meant to, are not trusted to.

Then there's the way votes are obtained. Let's put aside stuffed ballot boxes. Let's put aside such prevalent and illegal practices as caging, voter intimidation, and misleading voters as to the location and time of polls. Even if an election is run in the most legal manner possible, the undue influences of old power models and corporate institutions are unavoidable. Corporations were making donations to political parties and PACs long before Citizens United made it fashionable. The old tactics, known as "soft money," are so called because the receiving party decides how the money is used (usually distributing it amongst individual campaigns, even though it's not supposed to). Funding is not just needed to run a campaign; it is, in effect, the entire campaign. A common theme in primary campaigns is a candidate who may be polling well but cannot get the money to continue. Herman Cain ran into this problem. He started with abysmally low funding and polling to match. As his polls went up, however, his funding did not. This is due to any number of reasons, but it essentially comes down to that mysterious quality we call "electability." Investing in a campaign is not like investing in a company; you can't collect and withdraw at your own whim, regardless of the company's long-term sustainability. A candidate needs to win in order for you to see returns in the form of favorable policy.

Now, as Mitt Romney tells me, corporations are people. We now have surpassed the goals of the Enlightenment and entered a new phase of granting civil rights to institutions. A corporation cannot go to the local high school or civic center and pull the lever for a candidate (yet). But why would it want to? We, as citizens, vote because it is the most direct way we can influence politicians. Most politically-active corporations have far deeper pockets than any individual and therefore can have far more influence than a single vote provides. By acting as the fuel for a campaign's spin cycle, a corporation can effectively become the campaign, as Stephen Colbert has been beautifully illustrating with his very own SuperPAC.

Citizens United did not grant corporations the enfranchisement people enjoy; it actually decreased the value and power of a single vote by forcing people to compete with wallets far fatter than any of us can hope to obtain. I started this post by talking about the increasing equality of enfranchisement through American history. Voting began as a tool for property-owning men to protect their property. But, because laws affect everyone and not just property owners, the right to vote spread, through 200 years of policy and evolution, to all American adults. Now we see that progress being pulled backwards. We could learn quite a bit from the European serfs of the Middle Ages. After all, who do you turn to when the bank owns your house, your car, your education, and now your government? We are beholden to faceless lords who face no regulations, no trials, and no elections. With such status, we can hardly be called "citizens."

Thursday, November 10, 2011

Heroes



Penn State is an odd organization, run much like a government bureaucracy: its inner workings are so mundane and boring that they only necessitate analysis when something bad happens (and something bad rarely happens).

Over the course of the past 48 hours, through news reports, radio interviews, a newspaper, and some grand jury testimony, I got to know Jerry Sandusky. See, I'm not a fan of college football, and while a student at Penn State is expected to live and breathe and eat football, I found college football's rules absurd and inconvenient and its playoff system incomprehensible, and I couldn't get over the simple fact that there are too many teams and too many players (as compared to the NFL, which makes complete sense). So I had no clue who Jerry Sandusky was until about Saturday morning. Turns out he's the former defensive coordinator for Penn State, and also the proprietor of a charity called "The Second Mile." Turns out he also molested up to twenty young boys between 1994 and 2009, escaping consequences due to the legal failure of Penn State's Athletic Director Tim Curley and Vice President Gary Schultz and the moral failure of head coach Joe Paterno and President Graham Spanier.

Let me explain Penn State a bit more. The response to these allegations has not been the angry retort of a pissed-off populace against a hated leadership. We like Joe Paterno. We like Graham Spanier. It's a little difficult to contort yourself into hatred of them, even though ignoring a possible sexual abuse claim is absolutely reprehensible. However, as thousands of Happy Valley residents and a few overturned vehicles can attest, we need to learn that even our heroes can do wrong, and that they need to be held accountable when they do.

There's been quite a bit of talk about the inherent "goodness" of Joe Paterno. Says one local editorialist, "All of these men that were involved, excluding Sandusky, are undoubtedly good men." Really? They may have ignored a reasonable and credible claim of child rape in the interest of protecting a goddamn football team, but aside from that, I'm sure they are all great men who love their wives, Jesus, and America. In fact, one could say that through the annual Four Diamonds fundraiser THON, Penn State is one of the most philanthropic universities in the country. However, a person's morality is only relevant when acted on consistently. Sure, Joe Paterno and Graham Spanier had the potential to be good men in this situation, but they failed utterly.

Let me relate a story. At my high school, an English teacher was fired for carrying on a lurid and illegal love affair with a male student. The teacher was well-liked by students, relatively popular amongst teachers, and a fairly well-known personality even for those who did not have her in class. While disgust for what she did was fairly widespread, it was difficult for both faculty and students to condemn her outright because of their allegiance to her as a mentor and as a friend. The relative badness of her actions did not match up with our belief that she was, inherently, a good person, creating a degree of cognitive dissonance.

This emotion, fellow Nittany Lions, is called "disappointment". Disappointment in a school which prides itself on the moral high ground. Disappointment in a leadership we trusted turning out to be the kind of self-interested organization not even the worst cynics could have predicted. Disappointment in a wholesome folk hero making a large and consequential mistake at the expense of the well-being of 9 children and counting.

Let us not, however, lose sight of the actual villain. Joe Paterno is a nationally-renowned celebrity and the face of Penn State, so yes, he has unfairly become the face of this controversy. But as the allegations against Sandusky become harder and harder to deny, remember that the selfish evil he unleashed on the childhoods of these people, many of them grown adults now, over the course of 15 years is of the worst kind, and no human with a conscience will blink if he never sees daylight again. That Sandusky is a bad person is inarguable and self-evident. But, as a famous quote (often misattributed to Edmund Burke but actually from a Russian film narration) goes, "All that is necessary for evil to triumph is for good men to do nothing."

Sunday, November 6, 2011

Materialism

No, not that materialism. I'll let neuroscientist David Eagleman explain:
"The materialist viewpoint states that we are, fundamentally, made only of physical materials. In this view, the brain is a system whose operation is governed by the laws of chemistry and physics-- with the end result that all of your thoughts, emotions, and decisions are produced by natural reactions following local laws to powest potential energy. We are our brain and its chemicals, and any dialing of the knobs of your neural system changes who you are."
The Eagleman book that quote is from, Incognito: The Secret Lives of the Brain, is a journey into the subconscious which serves as an excellent introduction to the fascinating world of neurobiology. When you are an atheist and are identified as such at a gathering, you are often asked very long-winded questions about the creation of life, the cosmos, and the soul by those with a religion to answer these questions for them. Unless you are a scientific genius yourself, it can be overwhelming to face people who feel it is your responsibility to provide the answers to questions it has taken science millennia to even come close to solving. So it helps to have a book like Eagleman's to explain the science in a clear, understandable, and entertaining fashion.

The above quote about the philosophical view of materialism, while never mentioning the soul, is a direct retort to any worldview which attempts to explain human behavior as having an amorphous, indescribable engine behind it. Indeed, all human behavior, from crying out of the womb to drafting a will, is driven by the baseball-sized glob of neurons and biological gelatin behind your forehead and under your scalp. This view of humanity is, from a theist's standpoint, rather unpoetic and unsatisfying. Guess what? I don't care. The correct explanation is not the one that is most settling, or the most comforting. Galileo knew this when he endorsed the Copernican view that the Earth revolved around the sun, not the other way around. The history of science can be described as diminishing the value of human life until we are only slightly more important than the bacteria under our feet. I'm sorry if this is an unsatisfying view, but the universe does not exist for our own satisfaction.

What the materialist worldview presents is the idea that we are merely the functions of physical systems. Throughout Incognito, Eagleman raises examples of how dependent our judgement and actions are upon our brains. This is not to say that the scientific world currently has explanations for every perceptible action we commit; the key word is "currently." Are there some things science may never discover? Certainly (though particle physics seems to have a lot more questions to answer than neuroscience). But this is no reason to chalk up mysteries to fairies, gods, or spirits. Holes that sat open in science for centuries are only now, in the past few decades, being filled. The important thing to remember is not to pretend we know the answers to things we (currently) do not.

The soul is a comforting thought. It attaches meaning and responsibility to our actions and emotions. The organ that makes your heart tick on time, that reminds you how to ride a bicycle, that allows you to type while looking away from the keyboard; that's the brain, most people say. But how I love my partner, how I feel about God, my morals and values and virtues; those are surely the soul. If you agree with this, you're creating a needless agent for aspects of experience neuroscience can already explain with the brain. Occam's Razor 101.

The main opponent of materialism is merely human emotion. It isn't a pleasant thought to believe that all that you love, all that you hate, all you distrust, and all you save are the products of chaotic and fragmented electrical signals. However, it isn't just our emotions that are trapped in the brain. All those physical machinations actually are ourselves. The brain is not a tool some separate "we" uses to drive the body, because we are that tool. Your ability to read this article, your forming opinion of the topics I bring up, and any response you may give are all the products of an unimaginably complicated physical organ. And when your brain stops, so do you.

Again, this isn't the happiest worldview, but there is beauty to see. Consider the lowly brain functions of ants, which express little other than hunger and fear. Or consider the aforementioned bacteria and single-celled organisms. They have no brain to speak of, no manner of perceiving the world other than simple sensors which guide them towards proteins and lipids. Millions and millions of years of evolution have produced this deceptively simple organ inside our heads which stores the full range of human action and reaction. The decision to get a venti or a grande is handled by the same organ which tells you to stay at or leave the scene of an accident, to respond to or ignore an attractive person who confronts you, or to start a nuclear war. All the majesty and anger and awe we see in art, poetry, and music is actually a natural product of a handful of jelly inside your head. Now that's satisfying.

Thursday, October 27, 2011

The Internet Will Not Save You

I spend a lot of time and space on this blog defending the Internet as a crowd-sourced, masterful network which can hold and make available all the information we as consumers, citizens, and people would ever want to know. But if I've given the impression that the Internet is solely a revolutionary land of gumdrops and kitten videos, you'll have to forgive me. As I've said before, the Internet is merely a tool to make your life easier. Sure, it allows us to see the evolution of ideas in real time, and its spread to autocratic dictatorships has proven information to be the most important weapon a people can have. But outside of philosophers and revolutionaries, it is merely a merger between the people who brought you Three's Company and your phone company. The Internet is not your existential savior. The Internet is not your friend. It will not provide meaning to your dull, repetitive life.

In Fahrenheit 451, Ray Bradbury asks us to imagine life without meaningful literature. Characters in the novel are lulled into a catatonic state of interaction when they turn on their "walls" to join their "families," a thin approximation of televisions and sitcoms. The Internet is fulfilling this role with a unique twist Bradbury could not have seen coming. Rather than letting our mental states wallow in the Brownian motion of bad fiction, the Internet has been turned inwards. Instead of analyzing the social circles of JR and Sue Ellen, we spend hours a day poring over our own friendships and connections. Take this graph from Nielsen, those who watch the watchers:

By nearly every measurement (the exception is bandwidth usage, which Netflix dominated pre-Qwikster), social networking is the number one use of the Internet. Social networking can mean a lot of things; these studies merely track what websites users spend their time on, not what they do on those sites. So according to Nielsen, there's little difference between flirting with a classmate and planning Occupy Maine. But what does it say about the Internet, the world's largest free library, that most of its users are too busy ogling their ex's photo albums to take advantage of the historically significant liberation of information?

Arguments about the eventual downside of artificial intelligence seem to focus on computers gaining independence from mankind. But even the highest forms of artificial intelligence need input from a human and output to another human to be of any real use. All computers are, essentially, telegraphs, receiving the message of one user and transmitting it to another. So it makes perfect sense that the Internet, the closest we have come to a "world brain," would be spent on the same banalities we use our phones for (and the merger of those two worlds was only inevitable).

Facebook and, to an even greater extent, Twitter have earned quite the reputation in the Third World for allowing the easy and free spread of information. First recognized in 2009, Twitter's use as an activism tool gained a foothold during the Iranian protests against that country's most-certainly fraudulent elections. The Arab Spring of this year has also seen social networks put to use in organizing protests. Twitter was seen to have played such a large role that co-founder Biz Stone was in the running for this year's Nobel Peace Prize (the real prize will come if he ever finds a way to make money off of Twitter).

That said, the only revolutions most Facebook users are igniting are fake campaigns to end child abuse. This fairly recent phenomenon, known as "slacktivism," centers on pointless online efforts to "contribute" to social or political campaigns. No, changing your Twitter profile to a green background did not help Iranian protesters. However, I'm fairly certain most participants are aware of this. Signing an online petition (which, some people need to be told, is completely and utterly useless) is less a solid statement of activism than it is a sign of solidarity, like, say, wearing a black armband to class to protest the Vietnam War.

Internet action, as I've said before, is not the same as real action, and it would appear most members of my generation know this. However, becoming too reliant on online tactics can make us forget which tactics really work. The Far Left has been trying to force economic justice onto the agenda for years, and while its online activities have been numerous and persistent, the discussion didn't change until people began protesting in a real and noticeable manner. The Internet is great for the spread of information, but if that engagement ends the moment you leave the office chair, then it can all be for nothing.

Wednesday, October 26, 2011

Class

In 1984, my favorite love story, George Orwell's hyper-secure state of permanent warfare and delineated economic class rests at its bottom on the "proles" (short for proletariat), an underclass deemed too stupid to worry about the evil machinations of "the Party." In this dystopia, the slogan "proles and animals are free" dictates the government's attitude towards the poor; they, like animals, lack the decision-making skills to abuse their freedom in any meaningful way and therefore are not a worry of the Party. The novel's hero, Winston Smith, calls the proles the best hope for freedom: "Until they become conscious they will never rebel, and until after they have rebelled they cannot become conscious."

The idea of a clueless but easily influenced working class is not a new one. Like most of 1984, it is based on the ideology of 19th-century conflict theorists like Karl Marx and Max Weber. Decrying the liberalism of the bourgeoisie (the middle class), Marx stated that "the proletariat alone is a really revolutionary class." Fear of the working class can be seen even earlier, amongst American intellectuals in the wake of the 1776 revolution. John Adams and other founders were concerned about "mob rule" in a democracy (hence Senators were chosen by state legislatures, not elected, until the passage of the 17th Amendment), leading to property requirements for voting. The reason for this is largely the view of the poor as uneducated and easily influenced. If the poor can be guided towards a worldview that congratulates their hard work and romanticizes their poverty, they can lead movements of massive social change, for better or for worse.

The last 40 years have done quite a bit to change this factor in the Western world. College, after the 1950's, was no longer the elite club of intellectuals and rich offspring. In the post-Belushi era, college is a rite of passage in the United States. And as long as jobs and credit were flowing freely, this system could sustain itself.

However, as banks deny more and more loans and the job market remains stagnant, the "Lost Generation" of today is far from undereducated. They are, in fact, anchored by the debt they were told to take on in exchange for an education. The tools they were promised would let them win in this system have failed to prove effective, but those same tools allow us to understand why we're poor, why our choices are so limited. The proles are no longer rural simpletons; they are a people raised on middle-class dreams, educated by the best universities they were willing to go into debt for, and thrown into a reality which has no need or room for the education they were given.

The phrase "we are the 99%" reveals two things about Occupy Wall Street as a class movement. First is its effort to identify not just as a majority a la Nixon's "Silent Majority", but as a movement which means to represent nearly everyone. It is an "us-versus-them" game, and unless you own a bank (or two), they are on your side. Second, OWS is chiefly a movement about class consciousness. The power corporations have over the middle and lower classes is astonishingly staunch and far older than Citizens United or the 2008 collapse. By breaking away from these outdated models of upper, middle, and lower class (to say nothing of lower-middle and upper-middle class), OWS is seeking to change the way we view class structures. It says to the middle class, "you are merely luckier than I am. We face the same forces in a world designed to benefit the ultra-rich."

In the context of how we define class and the American Dream, this is an astonishing message to build a movement around. Protest movements tend to need individual targets, like Mubarak or Obama or Lyndon Johnson. But to focus less on individuals and more on the very way we diagnose our economy, the very looking glass we peer into, makes the popularity of OWS remarkable. OWS is expressing the worldview of most Americans: that our society is heavily bent towards old money and the well-connected. It is a populist message, and public perception does not equal reality, but a new generation of economists and historians is being raised and educated in this social climate.

It is possible, perhaps even likely, that social commentators are slapping a sticker on OWS too soon. But even if Zuccotti Park is emptied tomorrow, the measure of OWS' success will be the public debate. Sadly, the public debate is largely controlled by the attention-deficit media. Admittedly, it's hard to focus on the drum circles and the smell of hemp when dictators are being thrown into the frozen food section. But the effect OWS has had, putting income inequality and financial justice on front pages around the world, was precisely its aim. The protesters do not have specific legislative hopes, nor do they demand the resignation of any official (at least not collectively). This was a movement by the proles to educate the proles. OWS recognizes the limited effect it will have in the halls of Congress or on the trading floor. The real victory is educating the rest of us, the 99%, about our place.

Friday, October 21, 2011

Television

I just finished reading a 1994 Washington Post feature entitled "Group Portrait With Television," an anthropological review of a suburban family and their television habits. It's odd to remember that television, a dull stream of American consciousness compared to the netherworld of the Internet, was once the target of PSA meetings and parental organizations. The article neatly details the exact schedules of the family's viewing habits, down to the size of the consoles (19 inches?! Slow down, America) and the rooms in which they watch, seemingly for shock value. How could a family be okay with watching Maury Povich every day? The father of the Delmars, the family which serves as the subject of the portrait, has an interesting take on it:
"I just don't buy it that too much TV is bad for you," says Steve, 37, the chief financial officer of a company that makes automated telephone answering systems, who gets home from work around 7, eats dinner while watching Dan Rather and Connie Chung, settles down in the den by the 19-inch Sony, watches a few hours of sports, goes back to the bedroom, turns on the Hitachi and falls asleep with it on because Bonnie can't fall asleep if it's off. "Nobody wants to admit they watch television -- it's got the connotation: 'the boob tube' -- but all these people, what are they doing? I'm not sure if they have any more intellect. It's not like they're all going to the Smithsonian or anything."
America before OJ. America before Monica. America before Osama.

Indeed, television was subject to more scrutiny by parental and medical organizations than any media source before it. As the Nielsen rating system revealed that Americans watch 7.5 hours of TV a week (the number is closer to 6.5 currently) and specialized channels (CNN, ESPN, Nickelodeon, The History Channel, The Food Network) began to dominate more bandwidth, television was king of the 1990's. Bill Clinton, in his 1992 campaign, made a point of supporting V-chips, a now archaic technology which attempted to let parents block certain kinds of programming. Parents in the article are horrified when a child health specialist holds a seminar at the local elementary school to espouse the dangers of too much television:
She turns on the TV and shows a videotape, in which the announcer says that "in a typical television season, the average adolescent views 14,000 instances of sexual contact or sexual innuendo in TV ads and programs."

She turns on an opaque projector and shows a chart that says: "Most children will see 8,000 murders and 10,000 acts of violence before they finish elementary school."

"They won't do any other thing, other than eat or sleep, that many times," she says. "That's what we're teaching them. It's okay to kill 8,000 people. It's okay to hurt or maim 10,000 people. It's okay. TV does it, so it's okay."
While it's true that we have the vantage of hindsight, these assumptions about TV were misguided even then. In a post-Columbine world, we now know that media does not turn otherwise healthy children into killers, be it music, TV, or video games. But it was wrong to assume it was harmful to begin with. Charles Manson was inspired by The White Album, which isn't even the most violent Beatles album. Killers are inspired by their own psychosis, not what they saw on the evening news growing up. Plus, children are surprisingly capable of separating reality and fiction, as well as separating good and bad within fictional narratives. Not to mention that the facts and figures given here are pathetically inconsequential when compared to the trove of data the Internet provides to children. The fact that somewhere in the world a child's first cultural memory is watching the bloodied body of Muammar Gaddafi be paraded around on the hood of a truck on YouTube is not, admittedly, comforting. But the medium is certainly not to blame. You might as well blame weather patterns for thunder scaring your children.

In the interest of full disclosure, my childhood was painted with the cold glow of a 13-inch Sanyo which followed me from house to house. My mother was skeptical of television, barring me from watching The Simpsons and, later on, South Park, but she never shirked from letting it hold my attention through one of her late-night shifts as an ER nurse and, later, her bed-ridden depressive spells. One of my great childhood memories is of sneaking out of bed when my dad would come home late at night, watching Conan O'Brien ward off lizards a guest animal trainer had brought on or flirt with Heather Locklear. Later, my sister became an MTV obsessive, watching TRL during its Carson Daly heyday. As we entered adolescence, she and I became obsessive fans of Friends (though things got a bit silly after Chandler and Monica married) and Mad About You (no comment). As she entered high school and adapted to something called a "social life," I attached myself to BattleBots, literally the perfect show for twelve-year-old boys. BattleBots, some may recall, came on just before South Park. As my mother's mental issues became more severe, her attention to my television habits became more distant, so I was able to sneak in a few episodes after my nightly robot-fightin' time. This grew into watching what South Park led into, The Daily Show, which has been my favorite cable show ever since (in the words of David Rakoff, "I would drink Jon Stewart's bath water").

So where's television now? I am certainly not the only young person eschewing a cable box for a high-speed modem. Television still exists in the form of streaming services, such as Netflix, Hulu, and less-than-legal options, but sitting down in front of the television for hours at a time is an old habit that is finally dying. Indeed, those who still do watch television usually do so with a laptop or some other device wiring them to the Internet; the television is background noise while the Internet requires your attention.

The Internet used to be plagued by PTA horror stories of online predators, hackers out to steal your identity, and, more recently, cyberbullying. Compared to television, though, the Internet has largely outlived these negative connotations. The viewing habits of the Delmar family are quaint compared to an era of dinner-table YouTube sessions. So why has the Internet more solidly secured itself as a safe, recreational medium?

Perhaps it's the practical use of the Internet. It's less comparable to a TV than to a phone, putting us in instant contact with friends, family, and colleagues. We can now pay bills, get an insurance quote, research local mechanics, and wish Grandma a happy 80th birthday all from the same browser window. Whereas TV is a purely passive activity, the Internet is also a tool. The more legitimate uses have begun to outweigh the frivolous.

With each new technology comes fears of its overuse. This is merely a natural reaction by a society to major change of any kind. But when new technologies prove themselves more than something to stare at, these fears are washed away in the face of ease and convenience.

Is TV doomed? I'm not one for predictions. If sports and news, television's last stronghold of live programming, fully migrate to the Internet, then what purpose will television serve? TiVos trap you on the couch and game consoles could turn television sets into computer monitors. Much like newspapers, if television wants to survive, it will need to work at providing its products online in a comfortable, easy manner. Hulu certainly has mastered this game, obtaining streaming rights to hit shows from three of the four major networks. Hulu's ad system is flawless, even allowing you to switch from a commercial irrelevant to your lifestyle to one more fitting. Even their premium service, Hulu+, almost seems worth it (almost). Hulu is perhaps the best example of an industry responding to technological change (other than the music labels' success at converting their industry from CD singles to Now That's What I Call Music). Mergers of television and the Internet which have come from the opposite direction have had mixed results; Google TV is largely a flop while Apple TV may be due for a post-Jobs reboot.

Nearly thirty years after the debut of MTV, the world still stands. The parental hysteria of the 90's, laughed away by Nielsen families like the Delmars, was mostly a first-world problem of a country at its most first-worldly. While the Internet does pose some actual dangers to children and parents alike, one can take solace in the story of television's rise from an appliance to a dangerous mind-melting devil and back to an appliance.

Monday, October 17, 2011

1979 Wall Street Occupation


A certainly more playful sit-in in 1979:
This is a short clip from the film "Early Warnings" that details the sit-in that happened on Wall Street on the 50th Anniversary of the 1929 Stock Market Crash. The protesters were demanding an end to financial support for the nuclear industry and the action was part of the larger occupations at the Seabrook Nuclear Power Plant. The costumed figures on stilts are from the Bread and Puppet Theatre. The film is from Green Mountain Post Films.
The occupations at the nuclear power plant often numbered up to 4,000 people. The protests began in April of 1979, one month after the infamous Three Mile Island accident (Editor's Note: I see Three Mile Island in person literally every day).

As Occupy Wall Street quickly becomes considered the movement of our generation, it should be noted that sit-ins are a rather old idea. It's hard to deny their effect on the public debate, which should be the goal of any good protest. I just wonder what happens to the movement when, inevitably, people begin to leave Zuccotti Park.

This, after all, is the central problem OWS faces. It's a movement that has become about politics and, therefore, political solutions. But the troubles our generation faces are not necessarily political; ours is an existential crisis.

If you haven't read Noreen Malone's terrific cover story in New York, "The Kids Are Actually Sort of Alright", I strongly suggest you take the time to do so. An excerpt:
And so we find ourselves living among the scattered ashes and spilled red wine and broken glass from a party we watched in our pajamas, peering down the stairs at the grown-ups. This is not a morning after we are prepared for, to judge by the composite sketch sociologists have drawn of us. (Generation-naming is an inexact science, but generally we’re talking here about the first half of the Millennials, the terrible New Agey label we were saddled with in the eighties.) Clare has us pegged pretty well: We are self-centered and convinced of our specialness and unaccustomed to being denied. “I am sad, jaded, disillusioned, frustrated, and worried,” said one girl I talked to who feels “stuck” in a finance job she took as a stepping-stone to more-fulfilling work she now cannot find. Ours isn’t a generation that will give you just one adjective to describe our hurt.
Our generation is the first to expect to be worse off financially than our parents (those of us who grew up poor will speak for ourselves, thank you very much). If the Lost Generation of WWI was dominated by alienation, this "Lost Generation" is dominated by pessimism. That narrative becomes harder to believe, though, when you see the optimism in OWS. I suppose it's possible the protesters are merely the unemployed amongst us with nothing better to do for a month, but NPR's Planet Money blog points out that the nature of a live-in requires cooperation and equality:
We went to the big nightly meeting, which lasts for hours. Everybody has something to say. Along the lines of: Should we buy some sleeping bags? Why does that guy get to run the meeting? What if we just buy fabric and make our own sleeping bags? This kind of back and forth, people told us, is the whole point of Occupy Wall Street. It's not a movement; it's a venue. Standing around, talking about what everybody wants — this is a model of how the protesters want society to be.
Now obviously society as a whole cannot run on mutual participation and the supplies of the charitable for very long. But to believe it can, to actually put it into practice and have it work, represents an astonishingly sturdy belief in the inherent good in people. Perhaps one could say this just proves the naivete of a generation that grew up with participation awards and gold stars, but this kind of commune also requires equal work. There will always be those who take advantage of the system, but that's true of the societal system we have now. I'm not saying the thin structure of decision-making developed by OWS is a practical model for governments and economies, but it's hard to deny its inspiring nature.

Thursday, October 13, 2011

Books

When journalists begin to wax poetic about the death of newspapers, I'm usually amongst those who shrug. Newspapers are clunky and rarely as informative as a good RSS feed (or even the Drudge Report). TV stations have successfully moved local news to the web. Local newspapers, however, have faded into the background. They want us to pay to read about news that happened yesterday? How cute.

Online news, even the dreaded aggregators like Matt Drudge and the algorithms of Google, has surpassed even the most widespread and respected of newspapers. Papers have been replaced by an odd collection of page-based algorithms, hand-edited websites, and social media. Many national papers, such as The New York Times or The Wall Street Journal, have successfully become staples of any online diet, but many local newspapers are feeling the pinch. The major newspaper in the Harrisburg area, The Patriot-News, actually managed to become two separate entities without imploding its own customer base, dividing itself between its physical newspaper and Pennlive, a local news aggregator. When floods from Tropical Storm Lee hit the area a month ago, and the surging Susquehanna River was predicted to be mere feet from my front door, I certainly wasn't waiting for a messenger bike to drop off copies of the newspaper. I gave Pennlive a permanent tab on my browser, shooting over every few minutes to see if citywide evacuations were ordered for my neighborhood and whether roads were in a condition to let me go to work. I temporarily opened my WiFi to my neighbors, some of whom lacked cable Internet themselves, so they could likewise stay abreast of weather reports.

The purpose of newspapers, to provide local, relevant news to the masses at a bargain, makes them especially ripe for obsolescence. The Internet can do local, it can do relevant, and news sites are largely free. Most important of all, the Internet is fast. Way faster than newspapers could ever hope to be.

Books, however, are a different story. Books are perhaps the only medium other than the Internet to fulfill nearly every informational need we may have. When you read a book, you likely aren't looking for late-breaking news. The purpose of books is to be intensive, to contain information in a permanent, citable manner. They remain the number-one source of written fiction, and, although most of what you do on the Internet is reading, a "reader" is indubitably a person who reads books. Some websites, like Project Syndicate and Longform, are attempting to bring the art of feature writing to the online masses, but this is perhaps a losing battle. The Internet is too full of distractions to really grab attention with five pages' worth of words. Also, the medium (i.e. the computer screen) is physically hard to focus on for the time necessary to read anything of substantial length (the Kindle screen achieves the paper look the same way a digital watch does: black pixels on a gray screen with no backlight). Ironically, the screens of most eReaders have a contrast ratio similar to a newspaper's.

While I started this post off by making fun of old-guard journalists for being sentimental about newspapers, books deserve such sentimentality. For one, books go in order; there's no flapping around pages the size of a large sweater to get where you need to be, the way newspapers require. Second, books hold a presence with their physical attributes. Whatever the old adage says, even the most elitist of the library crowd makes judgements based on the width of a book's spine. When I read a book, I find myself thumbing the pages I've gone through with my left hand, marvelling at the level of emotion and narrative crammed onto such a primitive stack of tree pulp. I once read an introduction to Anna Karenina which spoke to the power a lengthy book can have over the narrative of your life. A book that takes you weeks or even months to read can shade the memory of that period in your personal biography.

As previously mentioned, books are the main source of written fiction. Stories have a definitive beginning and end, in the same way a book has a front and back. But the Internet? The Internet is almost nonexistent as a physical entity. It exists to most people as a cloudy concept that you'll never hold in your hand. The power of holding such narratives as The Road or Freedom or Lord Of The Rings in a space smaller than a breadbox is phenomenal to feel. Books are subject to none of the problems of modern technology (my bookshelf never has an outage) and most of the benefits (or at least the benefits that would be useful to books). And it is for this reason that, despite how I consider myself a forward-thinking Internet addict, I may never, ever, ever own a Kindle. They take something I already love and enjoy and reduce it to the same frivolity most people give a blog post (oops), not to mention make it more expensive. And sure, the Kindle can hold 10,000 books in a unit, but books will never be as adaptable to this format as albums were to the iPod. Songs can be naturally shuffled the same way we hear them on the radio (the iPod only changed music in that it made every song a single). But could you imagine a random button on the Kindle? Books are the opposite of superfluous and trivial. They demand extended attention, a dwindling quantity. There can be no more doubt that the Internet is fundamentally altering the way we think. And I worry newspapers were the first victim, with books on deck.

Tuesday, October 11, 2011

Concept Lesson: Memes

Before you think this is just a monologue about lolcats and mudkipz, you should know "meme" is a rather old term. Richard Dawkins, in his highly influential 1976 book The Selfish Gene, coined the term to describe cultural ideas that live and die by a critical process of selection (this itself is attached to a much larger idea of collective consciousness, which is roughly 150 years old). In short: ideas must evolve to stay relevant. However, the study of memes should be limited to the ideas, not objects or imagery. As the author James Gleick wrote, "The hula hoop is not a meme... the hula hoop is a meme vehicle." The plastic circle is not what adapted for survival when the fad took over pop culture in the 1950's. The idea of hula-hooping, of swinging your hips to keep the hoop spinning around you, is the meme. For instance, as hula hoops fall out of favor with children as toys, the act of hula-hooping is being revived as a fitness tool. In the post-Jazzercise, post-Tae Bo world of Zumba, hula-hooping found its new habitat much like a species that learns to burrow underground to avoid predators.

Fads are not the only memes, however. Music is full of them: windmill-strumming a guitar like Pete Townshend was a meme that lived on into the hair-metal heyday of the 1980's but died out when a more subdued style of guitar playing reigned supreme. Indeed, nearly any genre of music could be described with memetics.

The way 99% of Western art depicts Jesus is a meme. In 2002, Popular Mechanics revealed what Jesus Christ of Nazareth probably looked like, and it was far from the sharp, long-haired, white look most of us are familiar with. If we are to believe the canonical Gospels (which contain no physical description of Jesus) are what most early Christians used to base Jesus on, then the Jesus we see in paintings and on crucifixes is a complete incarnation of memetics, going through thousands of years of critical selection to come to the most agreeable version, i.e. the Jesus most suited for the environment of today. Christianity itself is, like all religions, a meme. As societies have grown more tolerant and socially-liberal, religions tend to do likewise or face becoming irrelevant. That is the magic of ideas being restricted by the same rules as organisms.

There's an old adage amongst biochemists: a hen is just an egg's way of making more eggs. This is a simplification of the idea that genes exist independently of their owner. Your genes know about as much about your day-to-day life as you know about the day-to-day life of the Milky Way galaxy. As Dawkins put it, "no matter how much knowledge and wisdom you acquire during your life, not one jot will be passed on to your children by genetic means. Each new generation starts from scratch." You are not the primary concern of your genes; replicating themselves is their primary mission. Evolutionary psychologists will tell you everything you do, whether consciously or not, is an effort not only to reproduce but to sustain a habitat that will also allow your children to reproduce. This means creating and sustaining a way of life that will allow your genes to be passed along as far down the evolutionary chain as possible. While this makes free will seem more like a fluke than an attribute, it means you serve your genes for their own self-replicating purpose.

Memes are ideas that follow these same guidelines. In The Information, Gleick gives the excellent example of "jumping the shark." Representing a definitive notion ("the point in the evolution of a television show when it begins a decline in quality that is beyond recovery"), the phrase evolved to describe any serial production (novels, films, comics) and even to describe similar cultural phenomena ("jumping the couch," "nuking the fridge"). It encapsulates everything a meme is: a self-replicating idea that evolves to further its own staying power.

However, does this mean communication is just a meme's way of making more memes? Communication, after all, is defined by information theory as the transmission of information between two sources. Memes, spread by word of mouth for most of human history but now primarily through telecommunications, are the language of the internet. Let's look at an example:

This is advice dog.

Advice Dog, like much other internet idiocy, was popularized by 4chan's /b/, a land of such depravity and ugliness it impresses only the most ardent and disobedient of 13-year-olds. The premise is simple: a dog stares out from a rainbow background and uses the dichotomy of top and bottom text to impart humorous wisdom. The idea evolved from a photo a user named "TEM" posted to themushroomkingdom.com, a Super Mario fansite. The dog, named Boba Fett, was then used on /b/ for much lulz, or whatever /b/ calls joy in a post-FBI raid era.

The dog and rainbow background are not the subject of study. The idea of posting top-and-bottom text to an image (in a way that was actually a more evolved form of lolcats) is now quite varied, ranging from Douchebag Steve to Socially Awkward Penguin to Rasta Professor. Like any good meme, it has evolved and spread to strengthen its own life.

What the internet does for memetics is accurately track the spread and evolution of memes, the way the fossil record tracks the spread and evolution of species. It equalizes the spread of ideas (formerly the province of marketers, academics, and journalists) to the extent that a teenage joke can become a cultural phenomenon. However, this means there is now a distinction between a meme which "naturally" (through coincidence, in-jokes, and like-mindedness) comes to popularity and a meme which is "forced." A "forced meme" is one that cannot naturally exist in the habitat it is exposed to, but persists due to the concerted efforts of those seeking popularity or ratings. Planking is largely a forced meme. It serves no purpose and lacks any humorous content, so why has it persisted? It mostly lives on Twitter and morning news shows as a distraction and an "Oh, those kids and their crazy internet." However forced it may be, it still fits the role and definition of a meme. The term "meme" speaks to no popularity level or degree of authenticity, merely to the spread of an idea. After all, what's the difference if some kid on 4chan or Buzzfeed or The Today Show pushes an idea, as long as it is able to self-replicate?

The distinction between a "forced meme" and a "natural meme" represents a division between those who took to the internet (and internet culture) quickly and those who see it as a distraction. As Adrian Chen of Gawker wrote:
"First, the out-of-touch CNNs and Today Shows of the world can pick up an "internet craze" to make it seem like they're hip to what all the kids are doing on their Facebook machines. So it was with planking, which started as a little-known Facebook page before it was seized on and promoted by Australian radio stations looking for web cred. The Today Show was shameless about its social media whoring, posting a picture of Hoda and Kathy Lee's horsemaning to Tumblr with the caption, "BuzzFeed Bait.""
So aside from the obvious generation gap, forced memes have revealed something else about the internet. If the past ten years of internet history will be remembered for one thing, it will be the monetizing of information. Facebook and Google sell certain aspects of your internet activity to micromarketing companies. Memes are likewise information that can be collected and sold. As I said before, the internet tracks the flow of memes, so gauging how many people are familiar with a meme is a cloudy business, but one that produces large numbers. Barack Obama, during his 2008 campaign, received such devoted press because his fans were likewise devoted and would buy any magazine with his face on it (Note: I voted for Obama in 2008 and intend to do the same in 2012). Old-world entertainers like The Today Show and Late Night With Jimmy Fallon are operating on the same level when it comes to internet memes. They see a massive following, so they hope to cash in.

In this way, a "fad" or "internet craze" is merely a monetized meme. Meme aggregators, like I Can Has Cheezburger, receive millions of unique page views per month by collecting the thin film off the top of the internet broth. But who is making the things they aggregate? Usually bored kids or office workers, and it is in their minds that we find the "natural meme." When an idea is filtered by the selection process of a TV show or even a website, it tends to be lopsided and die a quick death. In the Wired interview with ICHC founder Ben Huh, Huh says he watched about forty sites devoted to bad memes die (it's when people like this make $4 million that we really should start using the word "bubble").

Planking and horsemaning will either follow suit or be remembered more for their contrivance than for their humor or worthiness. But natural memes, those which rise out of shared interests, humor, and ways of thinking, tend to extend their staying power well past those forced for marketing reasons. It's actually quite similar to what happens when a species is forced into a habitat that is not its own. As any Boy Scout can tell you, animals should only be released into their natural habitat or they will have untold effects on the environment. Memes, once again, follow the same rules as organisms: unless bred from the ingenuity of natural communication, they exist only as eyesores on the social environment.

Saturday, October 8, 2011

The Most Important Thing Steve Jobs Ever Said

Steve Jobs in 1985:

A hundred years ago, if somebody had asked Alexander Graham Bell, “What are you going to be able to do with a telephone?” he wouldn’t have been able to tell him the ways the telephone would affect the world. He didn’t know that people would use the telephone to call up and find out what movies were playing that night or to order some groceries or call a relative on the other side of the globe. But remember that first the public telegraph was inaugurated, in 1844. It was an amazing breakthrough in communications. You could actually send messages from New York to San Francisco in an afternoon. People talked about putting a telegraph on every desk in America to improve productivity. But it wouldn’t have worked. It required that people learn this whole sequence of strange incantations, Morse code, dots and dashes, to use the telegraph. It took about 40 hours to learn. The majority of people would never learn how to use it. So, fortunately, in the 1870s, Bell filed the patents for the telephone. It performed basically the same function as the telegraph, but people already knew how to use it. Also, the neatest thing about it was that besides allowing you to communicate with just words, it allowed you to sing.
Douglas Adams, famed author of The Hitchhiker's Guide to the Galaxy, was an avid fan of the early Macintosh computers. In an essay he penned for MacUser magazine in 1989, 8 years before even the iMac, Adams characterized the Apple model as "there is no problem so complicated that you can't find a very simple answer to it if you look at it the right way." That sentence is an accurate way to describe the situation facing those who were trying to break into computing in the 1980s. Computers, even after the Mac II, were clunky to use, requiring a fairly advanced knowledge of scripts and programming to do even simple things, like running a word processor. Computers were, essentially, a puzzle. Much like Morse code, the learning curve of the early personal computer kept it from outgrowing the office and becoming what Jobs envisioned it would be (namely, a home appliance).

The above quote from Jobs, taken from a Playboy interview in February 1985, was in response to a question about why, exactly, American families should invest in a $3,000 television that would, for most people, perform the functions of a Speak & Spell. The quote shows that, even then, before the iMac, before the iPhone, Steve Jobs' aim was the same as that of many great innovators: bringing their most advanced and significant products to the masses. Jobs was prescient about what was to come, something boldly titled "the Internet", and he knew that 99% of the people it could be useful for were not going to bother to learn the complex mechanisms it takes to make it work. It doesn't take much intuition to realize how your air conditioner works, or even how a land-line phone works, but a computer is an outstandingly complex device, and they grow more complex every year. The Internet, which has grown far from anything like a telephone into an amorphous, multifunctional universe all its own, is even more complex than the computer I'm using to talk to it.

What Jobs knew wasn't so much what people wanted, but what they didn't. The iMac's three-step setup, in an era when most computers came with small novellas about their inner workings, is a famous example. When most digital music players resembled TI-83s, Jobs released the iPod, a music player with four buttons. When most smart phones had plastic QWERTYs and a hideous design (both inside and out), Jobs gave us the single-button iPhone, with its wide glossy touchscreen and smooth-as-silk OS. Like Bell with the aged telegraph, Jobs allowed himself to rise above the competition, survey their losses, and build off of their mistakes. His adversaries were his own R&D department. This model of domination and elitism led him to be the most productive futurist in history.

Concept Lesson: Generations

(Note: Due to my last post being a bit scattered, this is the first in a series of posts in which I'll attempt to define various terms and concepts crucial to my basic thesis)

In genealogical terms, a generation is the time between the birth of a parent and the birth of a child; my parents were in their early thirties when I was born, so the generational distance between us is roughly thirty years. A cultural generation is a group of people born in the same time period and subject to the same cultural movements, fads, and historiography. It's the latter we'll concern ourselves with.

The study of cultural generations is a bit sloppy. Definitions have varied through time and most of it relates to sociology, a field roughly 150 years old (note: few good fields of study are older than 200 years). Auguste Comte (1798-1857) was the first to make an effort to understand the impact being a member of a generation has on an individual, seeing it as similar to citizenship, nationality, or race. An influence on Karl Marx and other conflict theorists of the 1800s, Comte was an odd combination of Enlightenment-era equality and 17th-century Utopian philosophy. He saw humanity in the middle of a three-stage developmental process. Before the Enlightenment, man existed in the first stage, the Theological. In this stage, humanity is subject to "god": not god in a literal sense, but the idea of god and religion, the belief that power and morality come from deities and the churches alone. The second stage, coming after the Enlightenment (and especially the Revolution Comte was born into in his native France), is known as the Metaphysical. During the metaphysical stage, humanity comes to the realization that the individual is the most important aspect of society; man is capable of ruling over himself. Comte believed this process was then (in the 1830s-40s) underway in a political sense with the widespread secularization of Europe and the revolutionary wave of 1848 (this wave is probably the most underrated period of history, leading as it did to the end of serfdom and the rise of conflict theory, which would in turn lead to World War I, the Russian Revolution, and all those entail). The final stage, known as Positivism, is the basing of society on all areas of science and the scientific method as a whole. Decisions of states and individuals would be based on rational thought and proof-based belief systems (one can assume this stage is yet to come). Comte believed that such a focus on science would inevitably lead to the scientific method being applied to human interaction itself, a study now known as sociology.

The hidden point of Comte's Three Stage Theory is how these waves of change take place. While every student can tell you why we study history ("doomed to repeat" and so forth), Comte believed this was central to our understanding of humanity; each generation must study its predecessors to improve upon their mistakes and create a more positive world. Now, Comte's Three Stage Theory has been removed from most sociological texts (it's rife with logical circles and skips over states that went backwards through these phases, such as Rome), but its lasting legacy is the belief in the power of generations.

Comte's writing, along with youth-based political movements in Italy, Germany, and Ireland, encouraged the study of generations as groups and agents of change. Note that in the American Revolution and the French Revolution, most of the leaders were well into middle age, and the young (twenty-somethings) rarely played roles larger than those of infantryman or rioter and were rarely considered a "group". Not so much in the aforementioned mid-19th-century movements, or even the Arab Spring of today. It is no mistake that sociology came about around the same time the idea of evolution became popularized. If man "progressed" in a biological sense, it made sense his institutions and societies would be subject to the same rules. This became known as sociocultural evolution. Generations play an immensely large role in both senses of evolution, though cultural change is often a good deal faster than biological change, occurring within a generation or two.

In 1857, the year Auguste Comte died, philosopher and early sociologist Herbert Spencer published an essay entitled "Progress: Its Law and Cause". Spencer, who coined the term "survival of the fittest", believed all things in the world (biological, physical, sociological) progressed from homogeneous varieties into heterogeneous varieties. Certainly makes sense. New elements are formed through the burning of hydrogen (atomic number one) in the centers of stars, leading to a variety of elements (117 thus far). Most biologists would agree that all complex life comes from single-celled ancestors. The idea of "complexity" in social structures is a bit harder to define, but not entirely out of the realm of a layman's understanding. In 1992, Francis Fukuyama published "The End of History", using the fall of the Soviet bloc as evidence that liberal, American-style democracy was the last evolutionary form of government. Here (according to Fukuyama), it is not that the descendant is more "complex", merely that it is the most "fit" to survive. In this way, social and government institutions are similar to "memes", ideas which are subject to the same evolutionary rules of adaptation as bacteria, bears, and blue whales. One can see why philosopher Daniel Dennett called natural selection "the single best idea anyone ever had."

But in order for evolution to work, new generations (of people or germs or ideas) must adapt to surroundings and institute change. While this is done biologically through DNA and mutations, humanity must use its own playbook. History is the DNA and revolutions, cultural or political, are the mutations which can lead to beneficial (or hazardous) adaptations.

From Spencer's ideas and the widely-publicized theories of Charles Darwin came a more focused look at what it means to be a member of a generation. As the 19th century marched toward modernity, individuals became less concerned with identifying themselves via clan, family, or nation. Young men began to feel less attached to their fathers as market economies provided more opportunities. Young women found far more avenues out of the trap of domesticity, be it through collegiate education or hard labor, putting more distance between them and the Old World method of identification via the family. Widespread free public education would also lead to young people identifying with those of the same age. By the turn of the century, print media, radio, and telegraphs had encouraged those of a similar age (and therefore sharing a similar place in the social strata) to identify with one another.

(Note: The above paragraph is a summary of a very complex and storied school of thought branching from collective consciousness theorists. For more, check out the work of Emile Durkheim).

However, it wouldn't be until World War I that massive cultural events would begin to shape the measurement of a generation. The "Generation of 1914", known in the States by the name given to them by Gertrude Stein, "The Lost Generation", was the first generation of youth to be subject to much empirical and cultural study. Fueled partly by the advent of mass media but mostly by the widespread affliction "the Great War" caused, World War I was seen as an event that impacted the entire generation coming of age during the war, not just those who fought or died. The works of Hemingway, Fitzgerald, and T.S. Eliot created the shared culture of this Lost Generation, conveying, in varying degrees, the sorrow of loss, the disappointment in leadership, and the shared imagery and schema of living in wartime. During the buildup to war, French writer Henri Massis described his peers as "a sacrificed generation." Robert Wohl, in his seminal 1974 work, The Generation of 1914 (in which he coined the term "generationalist" for those who study history through generations), states that "historical generations are not born; they are made." T.S. Eliot had anticipated Wohl's sentiment immediately after the war: "History has many cunning passages / contrived corridors / and issues, deceives with whispering ambitions." The overwhelming theme of the Lost Generation, one that would be echoed by nearly all generations thereafter, was the sudden force with which history seemed to spread its power over a generation, leading to an enforced sense of alienation.

From there forward, the history of the 20th century can easily be measured in cultural generations: The Jazz Age, The Greatest Generation, Baby Boomers, Generation X, and Generation Y. But does each of these generations fit the Comte model of a cleansing, revolutionary generation? Of course not. In fact, the world could best be described as an amalgam of generational ideals, with the details and aspects most fit for survival sticking it out to the present.

This is the main point of generational studies (as I see it). While culture is a mirror to society, society is a mirror to the events which shape history, such as economic woes, wars, plagues, colonialism, and other widespread struggles. These events are the catalyst for a generation's ideals, which in turn shape that generation's culture. This is why culture, and pop culture specifically, is so ripe for study when attempting to understand a generation, both its motives and its circumstances.

Thursday, September 15, 2011

Modern Standards of Time (Ramble Tambel #1)

The telegraph, the first invention to allow instant long-distance communication, created a new idea of "now." When Lincoln received telegrams from Gettysburg, it required a new way of thinking to grasp that General Pickett had begun his march "now", at the exact same time Lincoln was reading the message in Washington. Time and space, funnily enough, were intertwined in the brain: the farther away something was occurring, the longer you had to wait to learn about it. But with the telegraph (and the telephone and radio and television and the internet), time became universal. Any college grad will (hopefully) be able to tell you that time zones came about so the railroad companies could have an agreed-upon schedule. But it was the telegraph that made the very concept of time zones possible. The world was suddenly a lot busier.

This led to a very sudden acceleration of nearly all aspects of culture. With telegraph cables came instant conversation between stock exchanges in New York, London, and Tokyo, all feeding and responding to each other in real time. Boom and bust periods became worldwide phenomena; the failure of banks in Europe meant the falling of stocks in New York. War became a far more intriguing battle of intelligence. While spies were always a part of warfare, a solid telegraph system (even Napoleon's visual telegraphs) could foretell the movement of troops miles (and days) away. Media grew as the news did; newspapers had something new to report every day, merely because the world never sleeps. Whether broker, spy, or journalist, one could send the most important messages to a target whole longitudes away, and this changed culture from coasting to full-on flooring the proverbial pedal.

Fashions, fads, and "memes" coalesced into a completely new concept: popular culture. Suddenly, what was popular in New York could be popular in Chicago and Los Angeles. It's amazing to think what could not have existed without the telegraph. Sports would have moved at a glacial pace, with scores and statistics moving with the Postal Service. Clothing and fashion would have remained local, inspired not by runways and models but by local celebrities and religious customs. Of course, the telegraph was just the first step in telecommunications.

The internet, unique in its democratic distribution and catholic coverage, took local scenes and magnified them into worldwide phenomena. The new school of "memes", far removed from Richard Dawkins' concept of ideas acting like genes or even viruses, made in-jokes known within weeks, days, or hours. Consider the image macro. Macros, such as "lolcats", occupy a unique spectrum of communication, condensing common sentiments on image boards (such as the now-infamous 4chan) into easily posted, humorous jokes. Most people who have seen lolcats now have little idea of their original purpose. Macros have expanded into a culture all their own, with their own language, rules, and iconography. But with a quick rise often comes a sudden fall.

The term "internet craze" now seems terribly outdated, as the internet is no longer its own culture, a niche market. The internet is now a mirror of the culture at large in the same way all popular culture exists. Fame that used to last years now last months, if that, before the spinwheel of the internet replaces it with something else. This is due, in part, to the limit of exposure before something becomes overexposed or "overplayed", a concept that's been around as long as FM radio. "Cool" is often a quality possessed very briefly, and even more brief for those made cool out of "irony", or the concept of the cultural artifact deemed so critically bad, it loops back around to being good. We enjoy these things for the same reason we enjoy parodies.

Rebecca Black is a good and fairly recent example of this. Even now, a mere 6 months after her video "Friday" was seen by everyone with a modem, she seems outdated, overplayed. However, the video succeeded because it was mundane, superficial, and completely readable. The aim of Black, or at least of her producer, the ARK Music Factory, was so obvious, so inauthentically executed, that the video was "doomed to fame", as it were. The language of the internet invites the kind of analysis that made this video an instant hit. The internet requires a constant questioning of motives and authenticity. Even now, unless you know me personally (or perhaps especially if you know me personally), you have already made a judgment on whether this website is verifiable or trustworthy. I'm not exactly reporting the news, or pretending I have some scholarly credit to my name, but you know what to look for. It's a Blogger page, which means he doesn't have the means or know-how to set up his own URL. It's simply designed, meaning he's either a hipster or wishing Tumblr provided more room for text. No ads? Why is he even running this site?

We are a generation of skeptics. Our first real cultural moment, after all, was our President changing his mind about who he had sexual relations with. Indeed, nearly every major question our generation has been asked has had to do with authenticity and motives. Why do they hate us? Why did we invade Iraq? Why is Paris Hilton famous? These are questions which continue to be debated to this day (though I suppose we could replace Paris Hilton with Kim Kardashian). Every cultural craze we have come to face has been centered on the question of authenticity. Reality shows are contrived, everyone knows it, but the mere idea of reality playing out on television brings tens of millions of viewers. But are these people on TV characters or actual people? Shows like The Hills, which hired actors to play reality TV subjects, and The Office, which used the imagery of reality TV but made no bones about its fictional nature, are built on this blend of realism and fantasy. While this is not a new debate for a generation to answer, the internet has propelled this question into our daily lives: How can we distinguish between fact and fiction when we can manufacture either?

The rise of 9/11 conspiracy theories (or even moon landing conspiracy theories, which existed prior to the internet but only took off after a 2001 FOX special) can be considered part of this. It's hard to imagine something so massive and so visible being questioned in an era before Photoshop. Indeed, Photoshop has changed our perception of nearly everything we see online. While photograph manipulation has existed as long as photographs have, Photoshop made even the most spectacular ideas capable of presenting themselves in a realistic context.

Tuesday, August 23, 2011

The Internet and You.

Over at Slate, Annie Murphy Paul has a thought-provoking review of Duke University professor Cathy N. Davidson's new book, Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn. Though the book is primarily a work for those in education, Paul aims her review at Davidson's usual fare of finding extravagant meaning behind everyday technological innovations (often taking, as Paul points out, generous leaps through logical hoops to do so). Davidson's argument centers on the effect technology is having on the brains of those who have grown up with it. She claims that the sheer amount of what we're capable of through the internet, smart phones, and even video games is changing the neurology of "digital natives", those of us who have grown up never knowing a world without computers.

While technology is unarguably changing the way we use our brains (quick: list as many telephone numbers as you can), the response to this change seems to be largely negative. The Atlantic's Nicholas Carr wrote a much-ballyhooed column headlined "Is Google Making Us Stupid?" The idea is that your brain spends less energy remembering things it knows it can find at a keystroke. I mentioned the death of remembering a telephone number; in-car GPS navigators present perhaps a starker example, with humans literally following directions mindlessly, often to places they've been many times before. Here's what one blogger wrote on the subject:
For this invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory. Their trust, produced by external characters which are no part of themselves, will discourage the use of their own memory within them. You have invented an elixir not of memory, but of reminding; and you offer your pupils the appearance of wisdom, not true wisdom.
Wait. That's not a blogger. That's Socrates. In the fourth century BC. Talking not about Google or iPhones, but the highly advanced and elitist technology of writing (it should be noted I found this quote in James Gleick's fantastic book, The Information). Socrates, after all, did not write anything we associate with him (or, as far as we know, anything at all). Neither did Homer. The mysterious, blind poet of Ancient Greece authored and passed on The Iliad and The Odyssey purely through an oral tradition. This reality makes the doomsayers lamenting the loss of our memories to the microchip seem rather quaint. What's that? You wrote your Master's thesis on a legal pad? That's cool, I guess. Try memorizing this bad boy.

While the basic sentiment behind the worries of folks like Carr and Paul is correct (we remember less the more we have instant access to), I don't really see the problem with this. Shall we ditch calculators because they lessen our ability to solve massive math problems when left to our own (ahem) devices? For that matter, we'd better throw out the compass. And maps. And dictionaries. Call Dmitri Mendeleev: we won't be needing that nifty periodic table once all chemists have put in the time to memorize the atomic mass, weight, and number of all 117 elements.

Having a problem with how the internet is training our brains is not just backwards: it's an assault on what we have come to know as knowledge. The opposite argument to the extremes I presented would be: why learn anything at all if Wikipedia can tell me? Well, Wikipedia, like any other resource, is the product of human knowledge. When we write the time for an appointment on a Post-It and put it on our fridge, the Post-It has not come to dominate us. We humans remain the source (even for Jeopardy-champ Watson). And, far more importantly, once we've put the information out there, it becomes instantly available to others, who then gain the knowledge themselves. When we take money out of the ATM, the ATM is not the receiver of the message; it simply tells the people at the bank what money we've taken out. Unless we devolve into Matrix-style slaves, technology will always remain the middleman.