Wednesday, August 22, 2007

reflections

Tomorrow is the last day of the quarter, and it's time for me to reflect a bit about what I learned in this course. Here goes:

• I learned that no matter what, if you put something on the Internet about yourself, make it something you actually want people to know. All people. Potential employers, ex-boyfriends, parents; anybody. If it's there, people will find it.

• Some of the futuristic stuff we were afraid of will never happen. Some of it has already happened. All of those TV shows and movies that portrayed the future (The Jetsons, Back to the Future, etc.) were all right and all wrong. No, I don't have a hoverboard or a robot maid, but I do have the Internet.

• Not everything on the Internet is true, especially not Wikipedia.

• Not everything on the Internet is real, especially not Wikipedia contributors.

• The first people to figure out how to make the Internet profitable were the ones whose businesses didn't burst with the dot-com bubble.

• People either love or hate Andrew Keen. There is no middle ground.

• I can't think of any profession that cannot have some sort of affiliation with the Internet in the coming years. Not one. That's scary, but hopefully good.

So the future is here, and I studied it, and I wrote about it, and I learned I'll have to shape up and get used to it or there won't be a place for me in the communications field. There will still be a demand for writers and editors, just with slightly augmented skill sets. There will still be a demand for reference librarians and researchers. They will not be replaced by Google. I'm glad I know these things now, before it's too late for me.

I might occasionally still post here if I run across a particularly interesting snippet of new-media info and want to note it somewhere. We'll see! Bye!

Monday, August 20, 2007

Notes from all over

A few news stories have piqued my new-media interest lately. All are from the Times except one, which makes me wonder if they really are the web-phobics Keen makes them out to be.

A new method of television delivery?

E-mail forces better greeting cards.

The CD is 25.

"Digg or Die"; academia's "publish-or-perish" mentality meets the Internet.

BoingBoing continues to bash Andrew Keen. This time with Stephen Colbert.

And now, a story: I have been searching for an album by one Miss Alice Smith. Local stores tell me it's impossible to find. eBay doesn't have it for less than $40. None of the illegal downloading sites had it. It's stuck in limbo: a good album by a not-so-well-known artist that didn't sell well when it came out last year and is now unavailable. But they have it on iTunes.

The point of this story is that in finding this album, I utilized several types of new media, and tracking it down would have been extremely difficult had these technological advancements never happened.

A song from the album was featured on an episode of the HBO series "Entourage," which I watch obsessively because it's very funny. (On a side note, apparently this show doesn't appeal to women because the women characters aren't prominent and the male characters can be chauvinistic jerks. I don't care. Funny is funny and people should recognize that.) Anyway, the song is played over the closing credits. I watched the show on digital cable on a Sunday night, when it first aired. Then, I watched it again during lunch on Tuesday utilizing the On Demand feature of my digital cable. I scanned the credits for the name of the song, but didn't find it, so I went to my computer and Googled a snippet of the lyrics I could remember along with "Entourage" and the title of the episode. This led me to Smith's MySpace page, where the song was available for listening, but not downloading. I listened to the four songs of hers on the site and decided I liked them enough to buy the album.

Here's where the New Media portions break down. You see, I wanted this CD right away. So instead of first scouring the web for it, I got into my car and drove to the local Best Buy. They didn't have it. Then I went to Barnes and Noble (to get a book, but I stopped by the music section and they told me it wasn't in their computer). Back at home, eBay had it for $44, with two days left to bid it up higher. Finally, I searched for it in iTunes, and there it was. For $9.99 I could have the whole album instantly on my computer. This was not exactly what I wanted.

You see, I am one of those dorky people who still collect vinyl record albums. I still buy CDs. I still listen to music on a stereo instead of just on my computer. I like having an object to go with the music, something to hold in my hands and liner notes to look through while I listen. Call me old-fashioned, but there's something about the ephemeral nature of downloaded music that doesn't appeal to me. I mean, I know it's still music, but it isn't exactly the same. What if my computer crashes? What if I drop my iPod?

I ended up buying the album from iTunes and burning a CD immediately. It isn't the same though.

I guess what I'm trying to say is that even though new media and the Internet and e-commerce make certain things accessible that weren't before, they don't necessarily make those things, or the experiences associated with them, any better. Going to record stores and bookstores used to be a Saturday afternoon pastime. My friends and I would flip through used albums and CDs for hours, then drive over to the used bookstores and do the same thing with pages. Now, we just type in the things we want and have them shipped, new or used, straight to our houses. But we're missing the chase. We're missing happening upon something interesting by chance. We're missing the community aspect as well, since we don't do this together as a social outing anymore. We aren't squeezing it between breakfast out and a late matinee movie.

If I'd happened upon the Alice Smith CD in a used CD bin at a local store on this hypothetical Saturday afternoon, there would have been this tiny, thrilling moment when I saw it. I'd pick it up and open it, check out the liner notes and the disc for scratches. I'd take it to the counter and pay for it and maybe have a quippy conversation with the clerk. My friends and I would have listened to it in the car on the way to our next stop and talked about it, or about the episode of "Entourage" that featured it. And sure, I can go online and comment about it on Smith's MySpace, or on the "Entourage" fan message boards, but something about that, like buying the album from iTunes, has an ephemeral quality that isn't endearing but cheapening. If anything, this course has taught me to really appreciate the personal interactions the Internet has made unnecessary. Sure, I appreciate being able to find or buy just about anything from my own sofa, and the comfort of knowing that for a price, anything, even an out-of-print CD, can be found with just clicks and keystrokes. But where did spontaneity go? What about happenstance?

Wednesday, August 15, 2007

the Me generation and the Web generation collide

Last year, I read a book called "Generation Me" by Jean M. Twenge. Dr. Twenge's book is about "Generation Me," or, as other people call it, "the Entitlement Generation." This generation consists of people born during the 80s and 90s, many of whom are just now entering college. Basically, we (and the book includes me in this generation, since I was born in 1981) want everything (high-paying flexible jobs, lots of free time, etc.) and don't want to work for it. (This perplexes me personally, and I would hate for a prospective employer to assume I had these traits.) Anyway, Dr. Twenge's book, published just last year, has a conspicuous chunk missing: the Internet. The index does list a few occurrences of the word "Internet," but most of them are in passing.

I'm re-reading this book right now to prepare to teach college freshmen how to write essays. I start tomorrow, and I'm pretty nervous, and this book isn't helping. According to it, my students will be spoiled brats who have never worked hard. The last chapter has sections on how employers and marketers can use this information about the generation, but nothing specifically for educators. There is a section that says parents shouldn't brow-beat their Gen-Me-er into college, and a section that says parents shouldn't obsess over keeping their child's self-esteem high at any cost. Yesterday in my faculty orientation, we had a discussion about FERPA, which says professors are not allowed to discuss grades (or other information) with a student's parents without permission. Apparently, lots of parents get the professor's phone number from the syllabus and call, demanding information about their children's grades or attendance. Even though it would be illegal for the professor to comply, the parents don't quit.

These are the parents obsessed with what their kids are doing on MySpace and Facebook. These are the parents watching the Dateline "To Catch A Predator" specials in fear. These are what college administrators call "helicopter parents," and why others call the cell phone "the world's longest umbilical cord." Where was this parental paranoia ten years ago when I was getting ready for college?

I believe there IS a connection between the Internet and this phenomenon, and it's clear: because of the Internet, these parents see their children taking freedoms before they really deserve them, so the parents, in turn, hold on to the freedoms they should be letting go of.

Thursday, August 9, 2007

conversations at work

The director of another communications department at the college where I work came to my office on Tuesday looking puzzled. He was carrying an older issue of the student newspaper I advise. As I often see people carrying that newspaper and looking puzzled, I wasn't alarmed.

"So," he said, "Did you read this?"

I drew in a deep breath to start my well-rehearsed spiel about how I do not read the paper before it is printed, because that isn't ethical, and ... then I realized he was pointing to a word in a letter to the editor. That word was "Wikipedia." In the letter, a student points out an error in a previous student newspaper article and suggests that the newspaper editors should have looked at Wikipedia.

"Do you think they get the irony here?" he asked me.

"No. I don't," I said, sighing.

He looked at me, puzzled.

"People don't know what they don't care to know. And obviously, people don't care to know that Wikipedia isn't exactly 100% credible."

He mumbled something about teenagers, and I shot him a look. "No, no, adults too."

He rolled his eyes, obviously remembering adults, seemingly information-savvy adults, who had also been swindled by Wikipedia.

I pulled Andrew Keen's book from my bag and held it out to him. In what felt like a spiritual moment in the religion of information, he took it.

"He understands," I said.

My visitor returned yesterday, singing Keen's praises. I said, hold on, look at this, and showed him where, that same day, BoingBoing.com had called Keen a "professional troll and spokesdouche for internet-fearing reactionaries."

We discussed it and realized that, yeah, of course BoingBoing is going to react that way, because BoingBoing's whole position is Internet = everything great about culture. It's almost as if your position defines your opinions on everything. I'm a journalist, a dyed-in-the-wool pen-and-paper journalist, so of course I think the Internet is a bit overrated. But I'm charged with churning out the next generation of journalists because I'm a newspaper adviser (and now also an English professor), so doesn't that position demand I be a steward of technology, no matter how useless it might be?

Bottom line: it's bells-and-whistles vs. meat-and-potatoes. Until they can figure out how to fill those Internet bells with meat, I'm not buying.

Monday, August 6, 2007

the unpopular opinion (paper) DRAFT

This is a rough draft of a position-type editorial article for people who do my job: college media advisers. My goal was to argue the unpopular side of a popular argument: the claim that college media need to produce many different types of media (i.e. web video, audio and story packages) for every story, in both their newsgathering and presentation functions. Please note that not only is this a draft (my opinions and support statements aren't 100% fleshed out), it is not an attack on any of the people mentioned, but a discussion of their ideas and my ideas.

Bryan Murley is a man on a mission. Since a last-minute session at the fall college media conference in 2005, Murley has been the go-to guy for all things new media. His blog, Innovation in College Media, has become required reading for college media advisers everywhere, and his analysis of emerging trends in new-media convergence in college news operations is spot on. Just yesterday he reminded us we need to have some sort of video plan for our reporters and photojournalists right now. Anybody can see that much of consumers' newsgathering happens on the web nowadays, or at least outside the ink-and-paper newspaper format, but does that necessarily mean college newspapers need to become multiplatform multimedia news delivery systems overnight? Or at all?

Being labeled anti-technology in 2007 is definitely an unpopular position. Nobody knows this better than Andrew Keen. His book “The Cult of the Amateur” has put him on the hit list of every pro-Web blogger and commentator on the Web, but his opinions and ideas are at least giving people something to think about as they forge ahead into new media territory. Keen’s theory is simple: more voices do not necessarily equal quality information. He frequently compares the “chatter” on the Internet to T.H. Huxley’s “infinite monkey theorem,” in which infinite monkeys at infinite typewriters will eventually produce a masterpiece. In the pre-Internet age (T.H. Huxley died in 1895), the concept of infinite monkeys on typewriters seemed like a mathematical joke; now, with the Internet, it seems more like a sad metaphor. Keen says the joke has become a reality. We are the monkeys, chattering away at our typewriters, but none of us has created a masterpiece yet. It should be noted here that T.H.’s grandson, Aldous Huxley, wrote the dystopian novel Brave New World in 1932.

Keen’s argument goes on for more than 200 pages, showing us the crimes committed by Web 2.0 against talent, authorship, intellectual property laws, musicianship, advertising revenue and journalists. Journalists, traditionally educated craftsmen, are being muscled out because … well, because of a lot of things. They’re fond of blaming the Internet. Craigslist and Yahoo are stealing their ad revenue, which pays their salaries. Wikipedia and celebrity gossip sites are stealing their readers. Google News is taking their content. The Economist says that over the next few decades, half of all newspapers will go under. Keen caps this idea by examining the most famous newspaper in the United States: The New York Times. According to the journalist Michael Wolff, for the Times to generate its print revenue online, it would either have to reach 400-500 million readers or compromise its editorial content to please potential advertisers. Granted, this is The New York Times, not your average university weekly tabloid. But not even the Times would be able to seamlessly abandon its paper pages for web pages. “At best,” Wolff says, “it might become a specialized Internet player … a low-end, high-volume information producer.” What does that sound like? Oh yeah, Wikipedia.

Even the most thorough and cutting-edge Webophiles will confess to having their doubts about Wikipedia. If you’re looking for the infinite monkey theorem at work, this is where to find it. Its flaw, however, isn’t that it can be edited by anyone from Harvard professors to Harvard janitors, but that it doesn’t take any particular talent to contribute. Journalists are celebrated for their storytelling ability and creativity. How you tell a story is your selling point. That you can tell it better than anybody else is your ticket to a journalism job. Wikipedia fails here because it doesn’t really tell stories, and those it does tell, like its entries on news and current events, are dull, bland, play-by-play lists devoid of any sort of analysis or insight. Why? Because analysis and insight aren’t appropriate for the encyclopedia medium. There’s an old saying from radio’s early days, when most news couldn’t be reported live: just wait a day and call it analysis. Are we sacrificing the analysis to get things earlier? Is this happening in college media? And is it worth it?

Lots of people think so. Journalism professors and college media advisers alike seem to agree that loading their students with as many different skills as possible will make them more viable in the shrinking journalism job market. Ryan Sholin, a new media blogger and graduate student at San Jose State University, says it’s clear: “J-schools can either play a critical role in training the next generation of journalists, or they can fade into irrelevancy.” That’s a tall order. He also says, “Reporters need to do more than write. The new world calls for a new skill set, and you and Mr. Notebook need to make some new friends, like Mr. Microphone and Mr. Point & Shoot.” Taller still. Think of your average journalism-school freshman. Jenny Freshman wrote a few articles for her high school paper and wants to be a journalism student to refine her writing skills. Too bad. Jenny has to be a radio producer, photographer, videographer, anchor, editor, web designer and maybe do some writing in between. The jack-of-all-trades journalist model might be appropriate for a senior student looking at job options, but for the freshman, or the first-time contributor to a college student newspaper, it isn’t a realistic starting place.

The flip-side of the jack-of-all-trades model, which some believe all journalists (or at least journalists who will keep their jobs in the next ten years) should adopt, is that it’s hard to focus and refine a skill when you’re always trying to add more. Plus, it’s hard to develop new skills without strong foundation skills. Let’s go back to Jenny Freshman, who’s trying to write her first lead. In a supportive learning environment, she should get ample time to craft a strong print lead through thorough trial and error. Once she has that skill, it’s a lot easier to teach her to modify that lead for radio or television or the web. It’s harder to teach leads across platforms from the start because there’s no foundation skill (like writing a print lead) to build from. Just as with this lead example, students in early reporting classes should be learning how to tell stories the simplest way, through writing, just as they’ve been writing essays and papers throughout their educational careers. By building on those skills, early print reporting classes can help transition students into journalistic writing. Once they get a solid foundation in journalistic storytelling, they can translate those skills into different media.

Another drawback of the jack-of-all-trades model is the lack of specialization. There’s no reward for being exceptionally good at any journalistic skill if everyone is expected to be merely competent in many areas. For example, when I was in journalism school, I was a good reporter but only a mediocre copy editor. Luckily, I knew people who were good copy editors, and we could help each other. There is something to be said for specialization and roles in team projects. If you worked at a newspaper, the same staff member wouldn’t be taking the photos, writing the stories, designing the paper, cleaning the office and selling the ads. It makes more sense for the best photographer to take the photos, the best designer to lay out the pages and the best salespeople to sell the ads. Plus, it gives people, especially students, a sense of identity to figure out what area they’re best at and want to pursue. That way they can focus their energy and really excel in that area, instead of being too busy trying to be competent at everything.

As educators, we also must be mindful of what the students want out of the experience. This is different at every college. A journalism-school student at a large public school with an established daily college paper may need, or want, skills that a student editor at a weekly tabloid at a fine arts college may not. Likewise, students at that large university may expect different things from their college newspaper than students at the small art college. Advisers are always preaching audience appropriateness, and it should extend even to this area. If your student body wants in-depth, multimedia coverage of campus events, give it to them. But if your students aren’t interested and would just like a newspaper to read while they’re waiting for class to start, you shouldn’t do unnecessary work that won’t be appreciated. Worse, this work could divert effort from your existing product and further harm your relationship with your readers. The readers come first, and in college student media, it’s easier to find out what they want. As advisers and educators, though, we must also uphold certain standards of quality. If students aren’t building their foundational skills as writers and reporters, we must be the ones to help them focus. As Keen says, having more monkeys at more typewriters (and in this case, additional monkeys with typewriters, cameras, video cameras, audio recorders and web design software) doesn’t necessarily mean quality work will come out of it. Quality is the top priority here, and producing quality work should be every student media organization’s goal.

Even if you don’t agree with everything Keen has to say, hopefully you can agree with this. Keen says, early on, “Talent, as ever, is a limited resource, the needle in today’s digital haystack.” Is our obligation as college media advisers to make those needles, or the haystack of mediocrity? Is our obligation to make signal or noise? Is our profession going to end up, literally, in the hands of the typewriting monkeys? We have the power to stop this from happening. We have the power to train the new generation to be needles instead of hay, and, more importantly, to know the difference.

Wednesday, August 1, 2007

internet personality

Today, while I was on Facebook, I came across an interesting application. Recently, Facebook has added tons of applications: little programs you can add to your Facebook profile. I added one called "Scrabulous," which lets me play Scrabble through Facebook with my friends there, and another called "Personal DNA." Basically, Personal DNA is a personality test. It's a well-designed (i.e. pretty), interactive test that lets you rank personal preferences and then gives you an outcome. The scary thing is that it's dead accurate.

My results are here:


If you click through to read them, they basically say I'm exactly like myself: self-reliant, generous, practical and hard-working. What I'm wondering, though, is: did I know this before the Internet told me?

In high school career ed class, we took the Myers-Briggs type indicator. I was an INTJ. This was in 1998. I re-took it today, on the Internet, and I'm still an INTJ, which feels pretty accurate. To further test it, I had a friend take it. I've known this person for about seven years, and he's never taken a test like this. He's also an INTJ, and the description fits him too. It's strange, though, because I recognize these traits in him and in myself, but I never really thought of us as similar. I probably could have come to this conclusion without the Internet as well.

So, I did all this, and realized that it's just another way, like Facebook or MySpace, to read and write and talk about myself. Now I feel like I was swindled by the Internet. It tricks me into thinking I'm special and interesting, when really, I'm just another person taking an online personality test. While this isn't exactly Web 2.0, it's sort of a stepping stone between the plain old Web and Web 2.0. It's less committal than MySpace or Facebook, because you aren't forced to share your results, but it still engages you in the kind of self-focus that so many Web 2.0 applications encourage.

Tuesday, July 31, 2007

Keen and The Liquid Library

I think one of the reasons I’m having a hard time dissecting this Keen book is the sheer amount of information crammed into it. For example, I just read a short section (not even a chapter, like a sub-chapter) called “The Liquid Library” that was about four pages, but it gave me so much to think about and discuss.

The section is about Kevin Kelly, who Keen calls a “Silicon Valley utopian.” “Kelly wants to kill off the book entirely,” Keen writes, “as well as the intellectual property rights of writers and publishers.”

Doesn’t that sound like Keen is being a little harsh? Well, it’s pretty true. In a 2006 New York Times Magazine article, Kelly basically said he wants all of literature to be on the web and “cross-linked, clustered, cited, extracted, indexed, analyzed, annotated, remixed, reassembled, and woven deeper into the culture than ever before.” Wow. To Kelly, it means “a web of names and a community of ideas.” To Keen, it means “the death of culture.”

Both of these arguments are hard to swallow whole because both of these guys are playing in the extremes. I think that’s where all of these arguments for and against Web 2.0 break down. When you say something as incendiary as “no more intellectual property rights,” it’s easy to attract a deluge of naysayers, and it’s equally true if you say something as incendiary as “Web 2.0 will kill culture.” Haven’t these people ever heard of balance?

I’m guessing this is where so much of the criticism and anti-Keen backlash is coming from. He’s one of those all-or-nothing kinds of guys. He’s throwing out the baby with the bathwater, along with the bathtub and maybe even the whole bathroom. And even if Keen is wrong, or a blustering Goliath, he’s successfully stirring up debate and thought about what the long-term effects of the shift from the regular old Internet to Web 2.0 will be. That can’t be discounted.

I can’t wait to read who he attacks next.

Monday, July 30, 2007

Wikipedia + Jessica = love/hate

In Andrew Keen’s book (which I am still reading, not because I’m a slow reader, but because I’m trying to enjoy it), he talks a lot about Wikipedia. And why shouldn’t he? Everybody else is. I got an email from my eLearning professor about not using Wikipedia as a source, in her class or anyone else’s. Stephen Colbert is making up fake words about Wikipedia. Even the old media are talking, like a 2006 New Yorker article. It’s almost impossible not to have an opinion of Wikipedia, even if you’ve never used the Internet.

Last summer, at our family beach house, my grandmother (who has barely used a computer, let alone the Internet) wanted some information. My cousin, a web-savvy teenage boy, printed her a Wikipedia page.
“So it’s like an encyclopedia on the Internet,” she said.
He should have just nodded.
“It’s open source, anybody can edit it,” he said.
“So it’s fake?”
I tried to help.
“It’s democratized.”
I did not help. This was the beginning of my Wikipedia opinion, and at this point I was skeptical.

The arguing point here is reliability. This really is the thousand monkeys sitting at a thousand computers producing something that, while it may not be brilliant, is popular. According to Wikipedia’s own Wikipedia page, the site receives between 10,000 and 30,000 page requests per second. Another Wikipedia page, titled “Reliability of Wikipedia,” details tests done by third parties to assess the site’s accuracy. Newspapers and magazines have done numerous studies and assessments to see how reliable it really is. Most of these tests found that it was reliable; however, it still doesn’t have much authority. Authority is connected to reliability, but isn’t the same thing. Jessica Clary saying that Wikipedia is possibly unreliable is one thing, but The New York Times saying the same thing is more significant, because The New York Times has more authority. It’s older. It’s bigger. It’s been writing and publishing longer.

Then I read a lot of Wikipedia, and I started to like it. It was good for general information, and answering those tip-of-your-tongue questions. What was that TV show where Alan Ruck was a TV writer? Oh yeah, “Going Places.” What show was on before it? “Perfect Strangers.” When was it on? 9:30 p.m. on Friday night on ABC. This is information you couldn’t find in Encyclopedia Britannica.

But then I started using Wikipedia for other things. I wanted to write an essay about Cotard’s syndrome, and in class I was complaining that the Wikipedia page was lacking. I realized it wasn’t Wikipedia’s fault, but mine. I did a search of Google Scholar (Google’s search engine for scholarly papers and journal articles) and found plenty of information. This is the type of thing Wikipedia isn’t good for.

I agree with my eLearning professor that Wikipedia should not be used as source material in academic work, not because of its peer-editing system, but because it’s too generalized, just like any encyclopedia. It’s good for background information or pop-culture trivia, but not for real research. This realization brought me into my phase of loving Wikipedia. Because I knew what it was good for, I used it constantly to solve the pop-culture questions I got in my head. What’s that song in the new Jetta commercial? Oh, it’s the Silversun Pickups song “Kissing Families.” Where are the Silversun Pickups from? Silver Lake in Los Angeles. What other bands are from there? Rilo Kiley and Elliott Smith. Who’s that guy in Rilo Kiley that was a child actor? Blake Sennett. Wikipedia reading flows like a choose-your-own-adventure book. This was the height of my love affair. Then things started to go downhill.

The New Yorker’s article on Wikipedia was published July 31, 2006, and featured a prominent Wikipedia user known as Essjay. The article said that Essjay held a Ph.D. in theology and a degree in canon law, had contributed to sixteen thousand entries, had served terms as chair of the Wikipedia mediation committee and patiently edited articles for errors and obscenities. This Essjay person sounds like your model Wikipedia contributor. Well, good for him, because he’s fake. Essjay is a persona developed by a 24-year-old named Ryan Jordan. In March 2007, the deception was revealed, and Essjay retired from the site. Keen devotes two pages of his book to Essjay. Jimmy Wales, Wikipedia’s founder, is okay with it. He told The New Yorker that he regards the Essjay name as “a pseudonym and I don’t really have a problem with it.”

So where does that fall? In the realm of reliability, sure, Ryan Jordan’s contributions to Wikipedia were factually accurate. But what kind of authority do they have? Not much. Essjay’s degrees are made up, so they lend his work no authority at all.

The scary thing here is that we may be entering a period when informational authority doesn’t matter. This would be the real heartbreaker. If informational authority means nothing, then why am I working for my MFA degree? Why is anyone working for any sort of advanced degree? If Dr. Harvard Professor and Ryan from Kentucky are considered equally knowledgeable, what’s the point? And if that division is erased in fields of research and writing and journalism, what’s next? Will the guy operating on me be a real MD, or just some guy that knows a lot about surgery? Yes, it’s an extreme example, but think about it. What's information worth if authority is passé?

Thursday, July 26, 2007

headlines

A few links of interest that are somewhat related to this course, maybe:

A New York Times story about the Junie B. Jones books is here. Apparently, parents are just now discovering that these children's books have creative spelling and grammar, and they think kids shouldn't read that. Guess they shouldn't be reading anything else with incorrect grammar, like run-on sentences (there goes Faulkner) or phonetic spelling (bye bye, Mark Twain).

An interview on MediaBistro with Jesse Thorn is here. The answer to #6 is particularly interesting.

And my favorite entry for today is here. I like it so much I'm going to write a little commentary about it. Ready? Here we go.

In short, I am part of a wack-job demographic. A demographic of people that want flexibility and fun from work. A generation that has to be tutored not to text message senior partners at work to ask if they should wear a bra. Great.

But I don't think I'm one of these people.

Sure, I believe that there are people out there in the 18-24 demographic who are, for lack of a better word, spoiled. This is the generation that first had "helicopter parents." This is the generation that grew up with cell phones and Napster and doesn't believe in intellectual property rights. The article calls it Gen Y, and it makes me want to vomit. When the kid in the article says job interviews are for you to tell your potential boss what your requirements are, it makes me so mad. I'm not like this, and I don't want to be thought of as one of these people just because I'm their age.

Not to brag, or whatever, but starting next month, I'll have a full-time job (where, in September, I'll probably work a 60+ hour week and most weekends), a part-time job (teaching at another college), plus two classes and regular freelance writing work, and a thesis to get cracking on. I take work very seriously. I'm on time. I try hard not to spend my workday sending text messages. I leave my iPod at home. I dress like I'm 35. I don't want to work at a place that would allow my coworkers to be what I'd call slackers. Apparently, that means in 15 years, I won't want to work anywhere.


I'm still reading the Keen book. I'm so mad I didn't find it before this class started; reviewing it could have been one of my assignments! Hmm, maybe if I can change my next assignment some, I can work it in. Just reading the first two chapters made me want to quit blogging and using Wikipedia. I've been blogging since 2000. Almost seven years. That's more than a quarter of my life.

Wednesday, July 25, 2007

cult of the amateur, part one

My library hold was taking way too long, so I went to Barnes and Noble and bought Andrew Keen's book, and something telling happened while I was in the store.

At first, I couldn't find it. I scoured the "cultural studies" section (which was mislabeled, and what was labeled as "cultural studies" was actually authors A-G in "African-American studies"). I scoured the "current affairs" shelves and the "new nonfiction" area. Nope.

I go get in line at the information kiosk. The guy in front of me, a 14- or 15-year-old kid with his mom, is ordering Uncanny X-Men books. Does he not know you can order anything you want from the Internet? And they bring it to your house? And usually it's cheaper? Guess not. Maybe the information age hasn't quite reached everyone yet. I wonder if that kid has a MySpace page.

The guy helping him is probably my age; at the oldest, he's maybe 27. He finishes with the other guy and motions me up to the counter. I'm hoarse from this cold/sinus infection I've had for 10 days, so I have to lean in to be heard. "I'm looking for a book by Andrew Keen, K-E-E-N." "What's it called?" he asks. I tell him. He types it into his book-locator computer.

"Let's go take a look in our Science and Technology section." And there it was, on the bottom shelf, three hardback copies under the "Chemistry for Dummies" selection. He picks up a copy and hands it to me.

"If you read it, come back and let me know what you thought of it. I just wrote a review," he said.

"Oh. I read that this guy was a has-been that's kind of wrong and really full of himself," I said. I mean, I had read that. It's not really what I think, but I wanted to see what kind of reaction I'd get out of this guy.

"Maybe you're not old enough," he says, and purposefully walks away. I hope he meant he thought I was a teenager, and that I was part of the Googlegeneration, with a Wikipedian world view that pays no mind to intellectual property rights or editorial hierarchy. I hope he meant that he thought I was so young I couldn't be helped (I mean, I guess I look younger than I am, and it probably didn't help that I was there in the middle of the day with a cold in jeans and a t-shirt). I hope he didn't mean that I looked stupid.


Anyway, I've read the first chapter, and he's already said a lot of things I've said here in this blog, plus some other things. He's mentioned the monkeys with typewriters and the dogs on the Internet. He's talked about Huxley and Big Brother and Facebook and MySpace, all in one chapter. I wonder what the rest of the book could hold. I mean, it feels like he's already covered all the bases.

I'll keep you posted.

Oh, and while I was typing this, the library called. They have the Keen book for me. Crap.

Tuesday, July 24, 2007

Revised television article

I'm sick today. So instead of reading, I worked on the second draft of my television article, which is here, for your enjoyment. I'll be back to reading tomorrow.

Hopefully, the library will call me about that Keen book too!


Dude, where’s my TV show?
How ratings miscount television viewers, no matter how they’re watching


In the 2006-2007 television season, the much-hyped program “Studio 60 on the Sunset Strip” premiered on NBC at #22. This means that for that week (Sept. 18), it was the 22nd most-watched regularly scheduled television broadcast. Almost 13 million people watched it, roughly the population of Illinois. For 2006, it was the No. 1 program recorded on digital recording systems (like TiVo) for “timeshifted” viewing. Three months later, however, in December, NBC announced the show would share its timeslot with another show, and in February 2007, announced that the last episode would air mid-month. Why the sudden change? How did a fairly popular show go from 13 million viewers to cancellation? One answer: ratings.

From 1928 (when the first experimental television images were broadcast) until 1951 (when Nielsen started publishing ratings), television operated without much real, concrete knowledge of who was watching. In 1951, Nielsen announced that the Texaco Star Theater, a variety show starring Milton Berle, was the most-watched program on television. Why was this data important? Well, aside from satisfying the curiosity of broadcasters, it met a need for advertisers. Broadcasters could more effectively price their advertising space. If more people were watching Texaco Star Theater than anything else, then advertising on that show should cost more. Ever wonder why commercials during the Super Bowl are so expensive? Super Bowl broadcasts are some of the most-watched telecasts; in fact, of the top 20 prime-time telecasts of all time, half are Super Bowls. In 1982, more than 40 million households watched Super Bowl XVI, and a 30-second commercial cost $324,000. The prices are set by analyzing previous ratings, projecting how many people will be watching when the commercial airs and then charging accordingly. And the shows that nobody’s watching? Well, they get cancelled, since the network can’t sell advertising on shows without audiences. It really is a popularity contest. But how do they determine which shows are being watched?
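
(A quick aside for the numbers-minded: the pricing logic above is simple enough to sketch. This is just a back-of-envelope illustration in Python, using the Super Bowl XVI figures from the paragraph above; the cost-per-thousand framing and the projected-audience example are my own illustrations, not Nielsen's or the networks' actual formulas.)

    # Back-of-envelope sketch of ratings-based ad pricing (illustrative only).
    # The Super Bowl XVI numbers come from the paragraph above; the
    # cost-per-thousand-households framing is mine, not Nielsen's formula.

    households = 40_000_000   # households watching Super Bowl XVI (1982)
    spot_price = 324_000      # dollars for one 30-second commercial

    cost_per_thousand = spot_price / (households / 1_000)
    print(f"Cost per thousand households: ${cost_per_thousand:.2f}")  # ~$8.10

    # A network prices a new spot the same way, in reverse: project the
    # audience from past ratings, then multiply.
    projected_households = 13_000_000  # hypothetical: a hit drama's premiere
    projected_price = cost_per_thousand * projected_households / 1_000
    print(f"Projected spot price: ${projected_price:,.0f}")  # ~$105,300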

In the beginning, the system was simple. Nielsen sent you a diary and you wrote down what you watched. Nielsen would send diaries to enough people to produce a statistically accurate sample to estimate the nationwide counts. But, in spite of its simplicity, there were numerous problems with this system. People didn’t send the diaries back on time, or didn’t fill them out at all, or lied when they filled them out. Why did they lie? Sometimes to make themselves seem smarter (it’s been published that people say they watch a lot more PBS than they really do), or to try to protect certain shows from being cancelled. In 2004, Nielsen came under fire for testing “Peoplemeters,” special remote controls each household member used to track demographic use, and a special box that sent all the information to Nielsen. You couldn’t lie and you couldn’t cheat, but somehow, the Peoplemeters were giving drastically different information than the diaries were, and most of the changes involved television shows with minority casts. The Peoplemeters were scrapped after testing revealed their flaws, and Nielsen has gone back to the (also flawed) diaries. The bottom line is that the ratings numbers aren’t right. They’re not only susceptible to fibbing viewers and bad counting, but samples are only measured for home viewing. Plenty of televisions are watched in airports, bars and restaurants, and those viewers aren’t accounted for. And what about all the rest of the ways we watch television programs, like on DVD or the Internet? Can they affect a show’s fate? Why, yes, they can … at least sometimes.
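
(Another aside: the statistics behind the diary system are worth a sketch, because they show both how the extrapolation works and why dishonest diaries break it. All the figures below are hypothetical placeholders, not Nielsen's real panel sizes.)

    import math

    # Hypothetical diary panel, extrapolated to all U.S. TV households.
    # Every figure here is a made-up placeholder, not Nielsen's actual data.
    panel_size = 5_000            # returned diaries
    panel_watching = 550          # diaries that logged the show
    tv_households = 110_000_000   # rough U.S. TV household count

    p = panel_watching / panel_size   # sample proportion (0.11)
    estimate = p * tv_households      # ~12.1 million households

    # Standard error of a sample proportion, and a ~95% interval.
    se = math.sqrt(p * (1 - p) / panel_size)
    low = (p - 1.96 * se) * tv_households
    high = (p + 1.96 * se) * tv_households

    print(f"Estimated audience: {estimate:,.0f} households")
    print(f"95% interval: {low:,.0f} to {high:,.0f}")
    # The interval only covers sampling error; diaries returned late,
    # left blank or padded with extra PBS introduce bias it can't see.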

The animated comedy show “Family Guy” proved that networks are paying attention to the DVDs we buy. “Family Guy” was cancelled twice, in 2000 and 2002, but DVD sales figures and high syndication ratings convinced the FOX network to bring it back in 2005. According to USA Today, this was the first time a show was reinstated after cancellation because of DVD popularity. So the networks are paying attention to what we buy, but how about what we rent? Or what we download? The simple answer is that they aren’t, as evidenced by the fact that “Studio 60” was the No. 1 “timeshifted” (that means TiVo-ed) program of 2006 but was still cancelled. The reason is simple: you can skip the commercials on TiVo, so advertisers don’t care whether those viewers are counted. Why pay for advertising on a show most people don’t watch ads on?

Some networks are capitalizing on the “timeshifting” idea and profiting by showing streaming video of shows on their Web sites and charging for advertising there. For example, you can watch the season finale of “Studio 60” on NBC.com, brought to you by the T-Mobile Sidekick. The episode is split into five short streaming video segments, each preceded by a commercial for T-Mobile. According to a recent article in TV Week, total advertising revenue for streaming video like this should hit $1.31 billion this year. It’s clear that advertisers know who’s watching, or they wouldn’t be paying more than a billion dollars for advertising on streaming video. The networks are counting too. NBC announced that it had delivered more than 300 million streams this season alone. It still isn’t enough to save some shows. Shows with steep production costs (like “Studio 60,” or another NBC hit “ER”) have to make up their costs in ad revenue, and sometimes they just can’t. Spending $40 on a DVD box set is one thing, but paying $1.99 to download a single missed episode from iTunes just won’t add up the same way. And there are no profits at all for illegally-downloaded episodes. Even though you can build a strong and dedicated fan base that way, to the networks, it’s just more lost revenue.
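
(One more back-of-envelope sketch, because the revenue gap is the whole story here. The $40 box set and $1.99 download figures come from the paragraph above; the per-stream ad revenue, the two-episode purchase and the 22-episode season are hypothetical placeholders I made up for illustration.)

    # Rough revenue-per-fan comparison across viewing habits.
    # $40 and $1.99 come from the text above; the $0.10-per-stream ad
    # revenue and the 22-episode season are hypothetical placeholders.

    box_set_buyer = 40.00          # buys the season on DVD
    casual_downloader = 1.99 * 2   # buys two missed episodes on iTunes
    streamer = 0.10 * 22           # watches the season free, with ads
    pirate = 0.00                  # illegal downloads earn the network nothing

    fans = [("DVD box set buyer", box_set_buyer),
            ("Occasional iTunes buyer", casual_downloader),
            ("Ad-supported streamer", streamer),
            ("Illegal downloader", pirate)]

    for label, revenue in fans:
        print(f"{label:24s} ${revenue:6.2f}")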

The good news is that, according to Nielsen, new methods are being developed to measure what is being watched on portable media devices like iPods and cell phones. This is called “placeshifting.” Nielsen plans to study ways to accurately track not just what people are downloading to watch on these devices, but what they are actually watching. One of the many potential problems here, which would affect more than just one show, is that there’s no accidental viewing. When you’re watching a television show on television, there’s a chance you can accidentally happen upon another show (the show before or after what you’re watching, or on a nearby channel). You might find your new favorite show this way, or at least something else to watch. With downloading, however, you only download the shows you want, so there’s less of a chance you’ll ever try anything new, which only adds to the possibility of new shows being cancelled for low ratings, because people won’t download them on a whim. Of course, there are many more potential problems, and I’m sure the Nielsen researchers will try to overcome them, but it just might not be enough.

Of course, there are still the old-fashioned ways of trying to protect your favorite show from cancellation, letter-writing campaigns and petitions, but they aren’t working anymore. Letters and petitions aren’t dollars, and when there are so many media competing for our eyes and ears, money is the only thing that talks. In the 50s, the original Nielsen company didn’t have to deal with cable television, satellite radio or the Internet stealing away its television viewers. Even show quality doesn’t necessarily matter. “Studio 60” was just nominated for five Emmy awards (in casting, directing, cinematography, and two for guest actors), but even that won’t save it from cancellation. Sadly, shows that aren’t winning Emmy awards are consistently rated at the top. For 2006, the top three most-watched shows were Tuesday and Wednesday’s “American Idol” and “Dancing with the Stars.” It seems like quality is being sacrificed left and right for high viewership numbers, and it’s all about the money. Being a Nielsen test subject and watching the show is the only way to make a difference. Just like always, you’ve got to put your money where your mouth is.

Monday, July 23, 2007

Hard reading

Today, I read a pretty difficult essay. Partly it was difficult because the book it’s in uses 8-point type and no pictures or graphs. My new-media-spoiled brain craves interactivity and visuals. Or maybe I’m just lazy.

The essay, McKenzie Wark’s “The Weird Global Media Event and the Tactical Intellectual,” is complicated. It’s about the strange paradoxes in the concepts of “Global Media” and “Global Media Events.” Wark uses the events of Sept. 11, 2001, global media events that are still playing out in the media, to illustrate most of her points. A few of the most interesting ones:

Wark writes, “It proves remarkably difficult to think back from one’s experience to the causes of the event itself.” She explains that even in The New Yorker magazine, where some of the “most distinguished writers in town” were charged with recording their experiences, the results were “banal.” The writers, including “stars” like Jonathan Franzen and Adam Gopnik, provide “richly detailed versions of their whereabouts on the day, connected to nothing but trivial remarks about the more abstract forces at work.”

Is this paradox a product of the technosphere’s influence? We have the capability to be more connected to more people and more information than ever before, yet, when “Global Media Events” happen, we stake a claim on our individuality by writing about where we were and what we were doing when we found out. This makes me think of something Chuck Klosterman wrote in his book “Killing Yourself to Live.” When people want to go into detail about what they were doing on 9/11, he writes, “You have to listen, because that person is actually trying to show you that they can talk about life without the safety of ironic distance. September 11 is one issue every American can be completely earnest and unguarded about.” When I read this for the first time, in 2005, I was thinking about it in context. It’s in a chapter about a Radiohead record, I think. But now I wonder if there’s some technological voodoo at work here. This sort of relates to what I think is Wark’s second interesting point:

Theydom and Wedom. This comes from John Hartley, along with Slavoj Zizek and Edward Said. The “they” in Theydom is the other. “They” are not like “us.” Zizek explains that we don’t like “them” because “they” either want “to steal our enjoyment (by ruining our way of life) and/or has access to some secret, perverse enjoyment.” But now “Theydom” and “Wedom” have started a war. Isn’t this also opposed to the “Global Village” mindset? If we can contact these people and access information about them from the comfort of our laptop computers, does that make us more or less likely to blow them up? Wark says it like this: “The frightening paradox of September 11 is how this attack on actual human lives in New York and Afghanistan is at the same time merely an attack on abstract signifiers of Wedom and Theydom.” This is an interesting, albeit scary, point.

I guess both of these points kind of make an argument for the futility of the “Global Village” nomenclature. Things, at least things like this, haven’t really changed. We aren’t really a village at all.

Tomorrow, hopefully, I’m going to read another hard essay, “Imperceptible Perceptions in Our Technological Modernity” by Arvind Rajagopal.

In other news, over the weekend I watched all three “Back to the Future” movies. In the fictitious 2015, there were fax machines and hoverboards and videophones and flying cars and “Jaws 19,” but there wasn’t an Internet. I don’t know if I would have even noticed that if I hadn’t been working on this course.

Thursday, July 19, 2007

Social Networking essay - Revised

Our Networks, Ourselves

Two years ago, a student I work with told me something terrifying. A recruiter at a job fair had told her she’d never get a job without a MySpace page. She wanted to know how I got a job without a MySpace page. I told her the truth: connections. I knew the person that hired me. I’d worked with him before.

“So, you networked,” she said.

“Um… yeah, I guess.”

She explained that this is what MySpace was for: networking, meeting people and getting jobs. I guess I didn’t get it then, and I still don’t. I don’t get the “networking” value of social networking sites. Social networking sites have given me something valuable: a replacement for my outdated address book, where the entries update themselves and I can always find who I’m looking for. They haven’t connected me with my dream job, or my future soul mate, or even any new friends, so why are they such a big deal?

People with titles like “ethnographic sociological researcher” are trying their best to figure it out, but most of us are out of the loop and behind the times. For example, Danah Boyd just published an essay on class issues in social networking Web sites. In her essay, she gives the history and shows how it set up the class struggle. In 2003-2004, MySpace went from being a 20-30-something site to a teen site that helped bands promote themselves. In 2004, Facebook started at Harvard University. She doesn’t really touch on the why, though. Why are Facebook and MySpace so popular? Before them, Friendster and other social networking sites could show you who you knew, and who they knew, so you could connect with new people (that’s what networking is, right?). But MySpace and Facebook took off because they had plenty of space for users to talk about themselves. The theory is that you can list your favorite hobbies, or movies, or books on your profile and make friends with people with the same interests. It doesn’t work that way in practice, though. Most people just write about themselves. These sites are called “social networking sites,” but very little social networking really goes on. I can list my friends and you can list yours, and if we have friends in common, it will tell us.

But how is that really changing anything? Are we meeting new people? Are we finding people with similar interests? We can find them, but then we aren’t really making the effort to get to know them. And aren’t these things we could do before the Internet? Meeting people is pretty much the same. You have to make the effort. But now, we can make the effort in a passive way, and a much more creative way. We aren’t exactly tied to the physical and “real-life” attributes of ourselves, and can craft a persona better equipped to “social network” on the Internet.

I don’t know if this is true for everyone, but I’m not exactly the same person on the Internet that I am in real life. It’s not like I’m really a middle-aged man from Indiana in real life, but I’m less polished in person than I am on the Web. I’m sure everyone fibs a little bit, even those who say they aren’t fibbing. The Internet affords one luxury of communication you don’t have in person: time. I have time to cleverly concoct and assemble my lists of likes and dislikes for my social networking profile. It’s like writing your résumé, only it’s not to compete for a job, it’s to compete in the hierarchy of hip. You get to work on it for months, but your interview might only last 10 minutes. Even though I’m not any smarter, I can sound smarter on the Internet.

But it’s not all for the better. If people were trying to network with me based on my interests, they might be out of luck. Sure, my Facebook profile may list my favorite movie, but you know what? I lied. My favorite movie is “Caddyshack”; the profile says it’s “The Royal Tenenbaums.” “Tenenbaums” is one of my favorite movies, so it’s not a complete untruth, but it changes my entire online persona. Stoned frat boys like “Caddyshack,” but cool people like “Tenenbaums.” Did I do this consciously? Kind of. I thought of the movie that would give me something in common with the people I know. I didn’t pick the movie because it would help me get a job, or help me meet new people. I didn’t pick the movie that would help me “network.” I picked the movie that would prove I already knew the people I was supposed to know. And I picked the movie that would make me look cool.

But was it the Internet that influenced the decision to fib? Kind of. If I were at a dinner party with my hip, educated friends, I probably wouldn’t admit to being the type of philistine that enjoys the story of golfers and an overzealous gopher. At a sports bar with my college-football-watching friends, I wouldn’t admit to being the hip poseur that likes “The Royal Tenenbaums.” Everybody fibs. Everybody bends the truth to their advantage, whether it’s in a job interview or a résumé or a MySpace profile. On your résumé, though, it’s to get a better job; on your social networking profile, it’s just to be cool. It’s like that saying: On the Internet, nobody knows you’re a dog. On the Internet, nobody knows you’re not cool. On the Internet, you can cover up all your insecurities. In her research, Boyd found that the popularity of these sites peaked with teenagers. Isn’t that when you’re feeling the most insecure about yourself? Isn’t it nice to be able to go on the Web and make yourself cool?

And that’s the draw. That’s the attraction. You can talk endlessly about yourself, and make yourself so cool. And that’s also exactly why these sites aren’t promoting networking, social or professional or otherwise. People are too busy making themselves look cool: showing off their intelligent and fascinating interests and hobbies, posting great photos of themselves where they look thin and pretty, and listing how many friends they have. The features that made MySpace and Facebook popular (photo sharing, detailed personal profile pages, etc.) are the exact things that are preventing people from meeting anybody new, from actually “networking.”

So really, the social networking site is the new Narcissus’ pond. I see how it can be so addictive to some people like plastic surgery and makeup and fashion are addictive. It’s just not for me, I guess. I’m content to wait it out, and meet people the old-fashioned way, network the old-fashioned way and even get a job the old-fashioned way. For everyone’s sake, let’s try not to drown ourselves.

Web Politics

I think I get an email a day from Barack Obama, or his people, or his e-mail minion David Plouffe. (When you Google Mr. Plouffe, you get the phrase "one of the most experienced and successful strategists in the Democratic party.") Apparently, his strategy now is to e-mail me all the time and blog. Mr. Plouffe's blog is just one of the many ways politicians are trying to use the web to attract young Democratic voters. Good luck to them, because it didn't really work last time. I just re-read Stephen Elliott's book "Looking Forward To It," a Hunter-S.-Thompson-esque romp through the 2004 Democratic primaries. It includes such salacious tales as riding a bus with Howard Dean and eating buffets with second-rate media correspondents. Its point, however, is that no matter what weird, wacky things you do, you will not win an election on new voters. Period.

It isn't stopping the candidates from trying though. Yesterday, the kind-of-political (and well-named) site BarelyPolitical.com (possibly not safe to open at work) crashed due to traffic for the "Obama Girl vs. Giuliani Girl" video. Their clothes stay on, and they sing-debate. Of course, it's also on YouTube.

Since I started typing this entry (a whole six minutes ago), I've been trying to think of a word to describe YouTube. It's weird. Where else can you go to watch 30-second clips of dogs on skateboards and teenage girls talking? (Well, I guess you could go to Google Video.) But right now on YouTube, you can watch a lot of videos about politics. You can watch more than 26,700 of them by searching for "Obama." And next Monday, you can watch a debate: specifically, the CNN-YouTube debate, coming to you live from Charleston, S.C. And there's CitizenTube, the YouTube political vlog.

Here's the point Elliott proved in his book. Yes, you can draw potential voters in with snazzy bells and whistles aimed at them. You can get them to donate money. You can get them to buy a T-shirt with your candidate's face on it. But in this bells-and-whistles society, that doesn't always translate into getting them to actually drive to some weird municipal building with total strangers and vote for you. Maybe this election will be different. Maybe this will be the turning point. American presidential elections are only 219 years old, which is practically infancy as far as governments go, so maybe we can enter our rebellious teenage years and people 18-24 will actually vote.

The "Obama Girl" video has more than 2 million views on YouTube. In South Carolina, where this CNN-YouTube debate will be held, there are about 2.5 million registered voters, and only 45% of them voted in the last election. Does this mean viewership of the "Obama Girl" video could sway the electoral votes of a whole state? By the numbers, sure, but in practice, probably not. Definitely not. No way. In the 2004 election, 64% of eligible voters nationwide voted (way to bring down the average, S.C.). While I'm sure getting the other 36% to vote for you in the next election would definitely sway things your way, getting the 64% of people that actually have a proven track record of voting might be a better gamble.

So who has a proven track record of voting? Old people. Citizens age 65 and up have a 79% turnout rate! People 45 and up have a 70% turnout rate. Your voter turnout rate also increases the more education you have. Of course, you also have to consider who old, educated people tend to vote for: Republicans.

While I guess bombarding YouTube and e-mail inboxes and other hip new media with your candidate can't hurt you, time will tell if it can actually help you.

Wednesday, July 18, 2007

Keen on it

I found Andrew Keen's book and put a library hold on it. Hopefully I'll get it this week and have some interesting responses next week!

Harry Potter and the e-book pirates

Let me preface this by saying I've never read any of the Harry Potter books. Not because I'm a philistine or a snob, but because I don't read much fiction, barely any fantasy fiction, and haven't read a children's book since I was a child. It just doesn't interest me. It's not because I'm heartless.

Anyway, the seventh Harry Potter book comes out on Saturday. And, like many media products with devoted fan bases, it will have plenty of 12:01 a.m. releases around the country. But it's too late. The thing's online already. And the publisher spent about $20 million to keep it secret. Money well spent? Hmmm.

So what happened? Well, someone got hold of a copy, took a photo of every page and uploaded the images to a BitTorrent site. Then someone else ran those photos through optical character recognition software, proofread the results and posted a text version. Who are these people? It's simple: they're fans. And mostly, they're the same dedicated fans who will line up Friday night at bookstores around the country to fight for the right to take the book home and stay up all night reading it. How come the publisher didn't decide, hey, maybe this is the time to release an official e-book? How come e-books aren't nearly as popular as downloaded music? I think the answer is human conditioning on intake methods.
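
A side note, because the scan-to-text step sounds more exotic than it is: here's a minimal sketch of that kind of pipeline in Python. The Pillow and pytesseract libraries are my stand-ins for illustration; nobody outside the uploaders knows what software they actually used, and "page_photos" is a hypothetical folder name.

import glob

from PIL import Image    # pip install Pillow
import pytesseract       # pip install pytesseract (requires the Tesseract OCR engine)

# Turn a folder of page photographs into raw text.
# The crucial step the post mentions -- proofreading -- still has to be human.
pages = []
for path in sorted(glob.glob("page_photos/*.jpg")):
    pages.append(pytesseract.image_to_string(Image.open(path)))

with open("book_raw.txt", "w") as out:
    out.write("\n\n".join(pages))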

I was born in 1981. I grew up watching television on a television, watching movies in a movie theater (or on Betamax at my rich friends' houses), listening to music on the radio or on a cassette deck or record player, and reading books, you know, out of actual books. After 25 years, really only one of these methods has changed drastically, and another has changed mildly. The mild change: I still watch movies in movie theaters, but the DVD player has eclipsed the Betamax. The drastic one: I listen to music on my computer mostly, but sometimes still on a record player. Most importantly, I still read actual paper-and-ink books.

Partially, I think, it's because I have easily tired eyes, and reading on a computer screen for long periods of time makes me sleepy. Second, maybe, it's because I sit at a computer for 9 hours a day, 5 days a week, and when I get home, I just don't want to sit at a computer anymore. But overall, it's conditioning. I learned to read books. If I were to read the Harry Potter book, I'd be buying it in a bookstore (or on Powells.com). But I guess, if I were desperate to read it, I might download it, print it out and read the pages that way. I feel like an official e-book could have made the Harry Potter publishers a lot of money, could have alleviated the leaking problem and could have said to fans "We appreciate you," instead of "We don't trust you."

News coverage of this event: LA Times, Boston Globe, Salon.com.

Thursday, July 12, 2007

New media and television essay: DRAFT

Dude, where’s my TV show?
How ratings miscount television viewers, no matter how they’re watching


In the 2006-2007 television season, the much-hyped program “Studio 60 on the Sunset Strip” premiered on NBC at No. 22, meaning that for that week (Sept. 18), it was the 22nd most-watched regularly scheduled television broadcast. Almost 13 million people watched it, roughly the population of Illinois. For 2006, it was the No. 1 program recorded on digital recording systems (like TiVo) for “timeshifted” viewing. Three months later, in December, NBC announced the show would share its timeslot with another show, and in February 2007 it announced that the last episode would air mid-month. Why the sudden change? How did a fairly popular show go from 13 million viewers to cancellation? In a word: ratings.
From 1928 (when the first experimental television images were broadcast) until 1951 (when Nielsen started publishing ratings), television operated without much real, concrete knowledge of who was watching. In 1951, Nielsen announced that the Texaco Star Theater, a variety show starring Milton Berle, was the most-watched program on television. Why was this data important? Aside from satisfying broadcasters’ curiosity, it met a need for advertisers: broadcasters could more effectively price their advertising space. If more people were watching Texaco Star Theater than anything else, then advertising on that show should cost more. Ever wonder why commercials on the Super Bowl are so expensive? Super Bowl broadcasts are some of the most-watched telecasts; in fact, of the top 20 prime-time telecasts of all time, half are Super Bowls. In 1982, more than 40 million households watched Super Bowl XVI, and a 30-second commercial cost $324,000. The prices are set by analyzing previous ratings, projecting how many people will be watching when the commercial airs and then charging accordingly. It really is a popularity contest.
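A quick aside on the pricing math: expressed as a cost per thousand households (the “CPM” yardstick ad buyers use), that 1982 Super Bowl figure works out as follows. The arithmetic is mine, using the numbers above:

# CPM (cost per thousand) arithmetic for the Super Bowl XVI figures above.
ad_cost = 324000        # price of a 30-second spot in 1982, in dollars
households = 40000000   # households watching

cpm = ad_cost / (households / 1000.0)
print(f"About ${cpm:.2f} per thousand households")   # ~$8.10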
But how do they determine which shows are being watched? In the beginning, it was simple: Nielsen sent you a diary and you wrote down what you watched. Nielsen would send diaries to enough people to produce a statistically accurate sample and estimate nationwide counts from it. But there were numerous problems with this system. People didn’t send the diaries back on time, or didn’t fill them out at all, or even lied when they filled them out. Why lie? Sometimes to make themselves seem smarter (it’s been published that people claim to watch a lot more PBS than they really do), and sometimes to try to protect certain shows from being cancelled. In 2004, Nielsen came under fire for testing “Peoplemeters,” special remote controls each household member used to track demographic viewing, paired with a box that sent all the information back to Nielsen. You couldn’t lie and you couldn’t cheat, but somehow the Peoplemeters were giving drastically different information than the diaries were, and most of the changes involved television shows with minority casts. So the Nielsen rating system is flawed: it can produce wrong numbers, it’s susceptible to fibbing viewers, and it only samples home viewing (plenty of televisions are watched in airports, bars and restaurants).
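And a note on what “statistically accurate sample” actually requires: the sample-size math is standard survey statistics, not a Nielsen secret. A rough sketch, where the confidence level and margin of error are my own illustrative choices:

# Standard sample-size formula for estimating a proportion:
#   n = z^2 * p * (1 - p) / e^2
# Illustrative choices: 95% confidence (z = 1.96), worst-case p = 0.5,
# and a margin of error of plus-or-minus 3 points.
z = 1.96
p = 0.5
e = 0.03

n = (z ** 2) * p * (1 - p) / (e ** 2)
print(f"Households needed: about {n:.0f}")   # ~1,067, a tiny fraction of all TV homes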
But what about all the rest of the ways we watch television programs? Can they affect a show’s fate? Why, yes, they can … at least sometimes. The animated comedy show “Family Guy” proved that networks are paying attention to the DVDs we buy. “Family Guy” was cancelled twice, in 2000 and 2002, but DVD sales figures and high syndication ratings convinced the FOX network to bring it back in 2005. According to USA Today, this was the first time a show was reinstated after cancellation because of DVD popularity. So the networks are paying attention to what we buy, but how about what we rent? Or what we download? The simple answer is that they aren’t, as evidenced by the fact that “Studio 60” was the No. 1 “timeshifted” (that means TiVo-ed) program of 2006 but was still cancelled. The reason is simple: you can fast-forward through ads on TiVo. Why pay for advertising on a show most people don’t watch the ads on?
Some networks are capitalizing on the “timeshifting” idea and profiting by showing streaming video of shows on their Web sites and charging for advertising there. On NBC.com, for example, you can watch the season finale of “Studio 60” brought to you by the T-Mobile Sidekick. The episode is split into five short video segments, each preceded by a T-Mobile ad. And the network is counting who’s watching. According to a recent article in TV Week, advertising revenue for streaming video should hit $1.31 billion this year, and NBC announced that it had delivered more than 300 million streams this season alone. But it isn’t enough to save some shows. Shows with steep production costs (like “Studio 60,” or another NBC hit, “ER”) have to make up those costs, and sometimes they just can’t. Spending $40 on DVDs is one thing, but paying $1.99 to download a single missed episode from iTunes just doesn’t add up the same way. And there are no profits at all from illegally downloaded episodes; you can build a strong and dedicated fan base that way, but the revenue is lost.
The longer answer, though, is that processes are in development to measure portable media devices like iPods and cell phones. The ratings companies call this “placeshifting” and plan to study ways to track it accurately. The problem here is that there’s no accidental viewing. When you’re watching a television show on television, there’s a chance you’ll accidentally happen upon another show (the one coming on before or after the show you’re watching, or on a nearby channel while you’re flipping during commercials). You might find your new favorite show this way, or at least something else to watch. But with downloading, you only pick the shows you want, so there’s less of a chance you’ll ever try anything new. That only adds to the possibility of new shows being cancelled for low ratings, because people won’t download them on a whim.
Of course, there are still the old-fashioned ways of trying to protect your favorite show from cancellation (letter-writing campaigns and petitions), but they aren’t working anymore. Being a Nielsen test subject and watching the show is the only way to make a difference. Just like always, you’ve got to put your money where your mouth is.

Tuesday, July 10, 2007

the web giveth

So, today, on BoingBoing (which I read through the Google Reader RSS application), Clay Shirky, a blogger at Corante (which also hosts one of my favorite blogs, Copyfight), called this guy Andrew Keen, whom I had never heard of, a Luddite. Well, kind of. He accused him of having Luddite views.

So now I'm researching Mr. Keen. He wrote a book, which came out last month, called "The Cult of the Amateur." From the reviews I've read, notably this one in The Sunday Times, Keen basically says that the Internet is adept at flooding the intellectual market with low-quality information and entertainment. He invokes the old idea of monkeys at typewriters, but argues there's no chance of even an infinite number of monkeys out there creating anything brilliant. He argues that the old media did a good job of picking out what was good or important and making it popular.

Shirky says Keen's argument is Luddite because Keen isn't anti-technology; he's anti-technology-replacing-the-way-we-used-to-do-things. That's Luddite because, well, it's what the Luddites thought. Shirky explains that in 1811, a group of weavers broke mechanical looms. Their issue was that the new mechanical looms made it cheaper to make fabric, which hurt their livelihoods. They were counter-revolutionaries. Of course, we all know what happened to the weaving industry, as I'm sure nothing I'm wearing right now was woven by hand.

So far, the most interesting idea for me is this bit from the Sunday Times' review:

Even if they had the talent and the connections, no blogs could afford to conduct investigations comparable to the great newspaper campaigns of the past. So the idea that content on the web is “free” is mistaken: the hidden cost may be the demise of old media and entire art forms on which the free content depends.

Not to toot my own horn, but earlier in this blog, I mentioned that we might be paying a price in humanity for using the Internet for simple tasks. Are we willing to pay these prices? I need to read Keen's book and get back to you. Hopefully, I can get it from the local public library today.

Monday, July 9, 2007

the iPhone vs. the iPhone buzz

I didn't buy an iPhone last week. However, I read so many articles about them, I felt like I did. The front page of the business section of the Saturday (June 30) Myrtle Beach Sun News even had an article picturing people in Charleston, S.C. (a 2.5-hour drive away), waiting in line for the iPhone. They looked vaguely similar to the people I'd seen in newspaper photos waiting for the iPhone in New York, L.A. and everywhere else hip enough to have its very own Apple store.

I recognize the power Apple holds over us, but after reading a bunch of articles and books about poverty in America, it's kind of hard for me to plunk down $599 for a telephone. And I actually wouldn't be plunking it. I'd be charging it to my medium-high-interest credit card, where it and months of other purchases wait to actually be paid for.

But I have read some very interesting things about the iPhone, and some very interesting things about the iPhone buzz. David Pogue, at the NY Times, says it "matches most of its hype." Pogue also said, in another article, "With its new iPhone, Apple pulled off two masterful feats: creating the machine and creating the buzz around it." This reminds me of so many technical advancements in new media and other technologies in the past few years. So many things are announced with so much fanfare, but when the product or the technology actually arrives, it just doesn't live up to the hype. Ten years ago, the Internet was going to change our lives. The only thing it's exceptionally good at, though, is selling me things.

But the creation of hype, of buzz, of hope, is really what a lot of industry is about. Getting people excited about things is big business, and being able to get people excited about new media is hot big business. Maybe that's the secret of Apple's popularity: they can really back up their claims. Sure, the iPhone took about three years to go from rumor to product, but they really nailed it when they did ship it.

Buzz, though, is an interesting creature. "Viral" marketing has tried to sell buzz, but nothing can really replace word of mouth. Nothing can replace magazine and newspaper articles discussing your product. Nothing can replace news segments on television. The advertising that can't be paid for, the buzz, is the most valuable. But the buzz is fickle. Burn the buzz once and the buzz won't take kindly to you next time. You don't even get the shame-on-me second chance to burn it again. And the buzz-savvy consumer is very intelligent. He or she can always tell the difference between real buzz and paid-for hype.

How have buzz (the un-buyable word-of-mouth marketing) and hype (the paid-for marketing) changed since the shift to the Web? Well, you could say they've both just found more audience members. You can barely visit a Web site without seeing an ad for something. I can't even check my Gmail without seeing an ad for a shoe store. I can't check the weather online without seeing an ad for refinancing my mortgage. I can't check my e-mail without getting offers for drugs. But the buzz has been infiltrated. People are paid, well paid, to sit around and type up ads on messageboards and forums. Posing as regular users, these shills try to scam you into buying things. The savvy consumers see right through it, and the buzz is protected. It has an even bigger audience, too. No matter what product you want to discuss, there's probably a place on the Web to do it. Your real-life friends aren't into talking about the iPhone? That's okay; just find a million new friends on the Web who want to nit-pick every nuance. The buzz has grown, because people can talk to more people.

What will be the long-term effects of new media on buzz? We'll have to wait and see. Short term though, I haven't bought an iPhone, but I'm eager to try one out in person. If that's all the hold the buzz has over me, maybe it needs to work harder.

Thursday, July 5, 2007

Commentary Essay: DRAFT

Two years ago, a student I work with told me something terrifying. A recruiter at a job fair had told her she’d never get a job without a MySpace page. She wanted to know how I got a job without a MySpace page. I told her the truth: connections. I knew the person that hired me. I’d worked with him before.

“So, you networked,” she said.

“Um… yeah, I guess.”

She explained that this is what MySpace was for: networking. I guess I didn’t get it. And I still don’t. I don’t get the networking value of social networking sites. Social networking sites have given me something valuable: a replacement for my outdated address book, where the entries update themselves and I can always find who I’m looking for. They haven’t replaced much else though, so why are they such a big deal?

Danah Boyd, an ethnographic researcher, just published her essay on class issues in social networking Web sites. First, though, she gives the history. In 2003-2004, MySpace went from being a 20-30-something site to a teen site that helped bands promote themselves. In 2004, Facebook started at Harvard University. Before them, Friendster and other social networking sites could show you who you knew, and who they knew (that’s what networking is, right?). MySpace and Facebook took off because they had plenty of space for users to talk about themselves. The theory is that you can list your favorite hobbies, or movies, or books on your profile and make friends with people with the same interests. It doesn’t work that way in practice, though. Most people just write about themselves. These sites are called “social networking sites,” but very little social networking really goes on. I can list my friends and you can list yours, and if we have friends in common, it will tell us. But how is that really changing anything? Are we meeting new people? Are we finding people with similar interests? Aren’t these things we could do before the Internet? And was it better then or is it better now? It’s pretty much the same, only now, we’re not always sure who we’re meeting.
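
(To be fair, the friends-in-common trick is the one thing these sites do mechanically well, and it’s trivially simple. A toy version, with made-up names:)

# The friends-in-common feature boils down to a set intersection.
# All names are made up.
my_friends = {"Alice", "Ben", "Carla"}
your_friends = {"Ben", "Carla", "Dev"}

print(my_friends & your_friends)   # {'Ben', 'Carla'} -- the friends we share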

I don’t know if this is true for everyone, but I’m not exactly the same person on the Internet that I am in real life. It’s not like I’m really a middle-aged man from Indiana, but I am less polished in person than I am on the Web. The Internet affords one luxury of communication you don’t have in person: time. I have time to cleverly concoct and assemble my lists of likes and dislikes for my social networking profile. It’s like writing your résumé. You get to work on it for months, and your interview might last 10 minutes. Even though I’m not any smarter, I sound smarter on the Internet. But if people were trying to network with me based on my interests, they might be out of luck. Sure, my Facebook profile may list my favorite movie, but you know what? I lied. My favorite movie is “Caddyshack”; the profile says it’s “The Royal Tenenbaums.” “Tenenbaums” is one of my favorite movies, so it’s not an outright lie, but it changes my entire online persona. Stoned frat boys like “Caddyshack,” but cool people like “Tenenbaums.” Did I do this consciously? Kind of. I thought of the movie that would give me something in common with the people I know. But was it the Internet that influenced the decision to fib? Kind of. If I were at a dinner party with my hip, educated friends, I probably wouldn’t admit to being the type of philistine that enjoys a story about golfers and an overzealous gopher. At a sports bar with my college-football-watching friends, I wouldn’t admit to being the hip poseur that likes “The Royal Tenenbaums.” Everybody fibs. It’s like that saying: on the Internet, nobody knows you’re a dog. On the Internet, nobody knows you’re not cool.

And that’s the draw. That’s the attraction. You can talk endlessly about yourself, and make yourself look so cool. And that’s also exactly why these sites aren’t promoting networking. People are all too busy making themselves look cool: showing off their intelligent and fascinating interests and hobbies, posting great photos of themselves where they look thin and pretty, and listing how many friends they have. The features that made MySpace and Facebook popular (photo sharing, detailed personal profile pages, etc.) are the exact things preventing people from meeting anybody new. The social networking site is the new Narcissus’ pond. Let’s try not to drown ourselves.

Wednesday, July 4, 2007

media journal

For a course I took last year, my book was Media Journal by Joseph Harris, Jay Rosen and Gary Calpas. I try to keep my books after courses are over, in case I need them again. Some I didn't keep because I knew I'd never need them. I kept this one, though, and I'm glad I did. Over the (very) long weekend, I found a few essays that go along well with the themes in this course, and I'm going to post about each of them.

First, M. Kadi's essay "The Internet is Four Inches Tall" (reprinted from h2so4 issue 3) explores the idea that the Internet is somewhat like the emperor with no clothes. It's this big, shiny new toy with plenty of possibilities, but the biggest one, at least for now, is commercial. First the essay examines the commercial situation of charging for access; then it explores what you're paying for. Back when this was written, in 1994, most of it was bulletin boards and chat rooms. Her conclusion is that the Internet, while it has bells and whistles and speed, is still just another way of transporting information, which, so far, isn't anything new.

I like this in relation to the "Mythinformation" essay. We aren't really connecting. We just have another tool to communicate; it's not like we have new things to do. This will really play into my upcoming commentary essay on the pros and cons of social networking Web sites. Basically, through my research and experience, I'm compelled to say that yes, we are somewhat cheating ourselves out of "human" experiences. And maybe someday, the number of Facebook friends you have will change the way you live. I have 151 Facebook friends. I'm not on MySpace. But I'm not really friends with most of these people. I could call only 2 or 3 of them and ask them to go to the movies with me, or to help me move, or anything else vaguely friend-like. I'm "networked" to them. It's a pretty good way to comprehensively list my acquaintances (or the ones that are on Facebook), but there's not much qualitative information there. Someone I've known and been close with for almost 8 years is on the same level as someone I met once 3 years ago.
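
(A toy illustration of that missing qualitative information, with made-up data: the friend list is a flat set, while real relationships carry weights the list throws away.)

# A Facebook-style friend list is flat: everyone is equally a "friend."
friend_list = {"friend of 8 years", "person I met once in 2004"}

# Real relationships are weighted; the flat list discards this.
closeness = {
    "friend of 8 years": 9,          # could call to help me move
    "person I met once in 2004": 1,  # could not
}
movie_buddies = [name for name, score in closeness.items() if score >= 8]
print(movie_buddies)   # ['friend of 8 years']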

Ok, next essay. This one, by Andrew L. Shapiro, titled "Street Corners in Cyberspace," expands on the notion that the Internet provides the uniqueness the user wants. "Don't like antiabortion activists, homeless people, news reports about murders? No problem–you need never encounter them," he writes. This is spot on. The Web is so customizable, and it's so easy to get only what you want. For instance, I personally subscribe to several RSS feeds: The New York Times arts section, BoingBoing, The Onion and the NYT front page. It would be just as easy to ditch any one of them and keep the rest.
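
Reading feeds this way is also about the easiest thing in the world to script. Here's a minimal sketch using Python's feedparser library; the URL is a placeholder, not a real feed:

import feedparser   # pip install feedparser

feed = feedparser.parse("http://example.com/rss.xml")   # placeholder URL
print(feed.feed.get("title", "(no title)"))
for entry in feed.entries[:5]:   # the five newest items
    print("-", entry.title, entry.link)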

He uses two phrases I like: Cyberbia and Cyberkeley. These terms juxtapose the Internet that's full of home shopping and advertisements with the Internet that's full of really intelligent information. But the places in the middle, the street corners, are where there's a possibility for real growth and interesting developments on the Web.

There are a lot of interesting things out there, but I'm finding a lot of the same theories echoing over and over, and I'm wondering if I'll ever find something different. A few years ago, it was trendy to brag about hating television or calling it worthless. Maybe right now, the trend is to discount the Internet the same way. There are certain television shows right now, though, that prove there are dynamic, innovative and amazing minds out there creating them. Maybe it'll just take time for the Internet to get there too.

Tuesday, July 3, 2007

sorry

Sorry for the lack of posts. I was out of town with my family for a bit. They're still dealing with stuff about my granddad's estate. It's kind of a nightmare. But, as I am trying to keep personal stuff out of this school blog, I will refrain from giving you the boring, inane details.

Regular posts will return tomorrow!

Thursday, June 28, 2007

part of the problem

Today’s reading:
The Tales They Tell in Cyber-Space by John Katz

First, let me say I hate the prefix “cyber” and I hate how it got attached to everything in the 90s (cyberspace, cybernews, cybersex), like a lower-case “i” got attached to everything in the 2000s (iWork, iLife, iPod). Since this was written in 1994, for The New York Times, though, I’ll allow it. In 1994, prefixing stuff with “cyber” was undeniably hip. Isn’t it strange how even the trendy nomenclature changes? Cyber and virtual became Web 2.0, the current trendy way to say “interactive Internet features.” I digress.

Katz’s essay is about how creativity and communication have been affected by the Web. Instead of the power-communicators (TV networks, movie companies, etc.) choosing what gets communicated, the masses can discuss things directly with the masses. And though what they’re discussing lacks the polish of an editor’s hand or a newscaster’s inflection, it has something going for it: honesty. Katz excerpts posts from bulletin boards about emergency roadside births, homophobia and AIDS. These are sad stories, and funny stories, none of which are of the magnitude required to be a Movie of the Week or a novel, but they’re still true stories of real life, without the gatekeepers.

In journalism school, back in 1999-2002, we barely used e-mail or the Internet. It’s strange, really, to think about. I’m sure other departments on campus were using e-mail (computer science, definitely, and other sciences I bet, probably the business school too). I don’t know if journalists as a whole realized the possibility that we’d be obsolete someday. So much effort is put now into “convergence” newsgathering. It’s not enough to get an interview and write a story; you have to have photos, video, interactive graphics on the Web and a thousand other bells and whistles to make your story reach as many people as possible.

That’s harder, too. Since people get their news from the Web, it’s easier to pick and choose. You can read the New York Times arts pages without reading the front page. You can watch a clip on CNN.com and not have to watch the actual cable channel. You can watch clips on NBC.com and never see an entire Nightly News with Brian Williams newscast. Now, this doesn’t seem so bad when you think about excising stories about celebrities or pop culture from your news diet, but what if you’re excising something “important”? What if you stopped reading any news about the war? Or any news about global warming? And is that really any worse than cutting out stories about Paris Hilton?

A well-balanced news diet is a thing of the past. Gone are the days when people read a morning paper and an evening paper and watched the local and national news every night. The only people I know that still do this are over 65 or journalism students. I don’t even do it anymore. I might watch an hour of CNN, read the New York Times RSS feed for the front-page headlines and read a magazine every day, though. But these are choices I make, not necessarily to shield myself from certain stories, but to read what I like to read.

This is bad. I’m one of the people making myself obsolete. How terrifying.

Tuesday, June 26, 2007

Social Networking and class issues

So, I'm starting to research my commentary essay on social networking, and apparently, a new study released recently shows class distinctions between different types of social networking sites.
According to this article, Facebook is for rich kids that are going to college and MySpace is for kids that aren't, all based on this paper.

This is huge.

She basically says there's a class war going on with these sites, and that the aesthetics (another writer compared Facebook to the look and feel of an Ikea store and MySpace to the look and feel of Las Vegas) are becoming part of it. Sure, your typical MySpace page does often look like a giant war of pixels and sound, and your typical Facebook page is cleaner, but are these really reflecting class distinctions? Like Ikea and Wal-Mart?

I just started reading the actual paper, but so far, it's very interesting. This woman knows her stuff. I can tell already this article will directly influence my commentary essay!

She also cites this: The Lost Children of Rockdale County, a PBS Frontline special about a 1996 syphilis outbreak in Rockdale County, Ga. I remember seeing this on PBS about a hundred times when I was in high school (in the next county over). It shows how bored, middle-and-upper-middle-class white teenagers can really get into trouble. (Incidentally, when this was on PBS for pretty much five years straight, my friends and I would joke that when you turned on PBS, you either got kids with syphilis or Wishbone, a children's television program about a dog that wore hats and was commonly mistaken for historical figures. Our joke was "Is that a kid with syphilis? Nope, just a dog in a hat.")