Category: Social Commentary


Music, Consensus, and Rock Greats

With rock greats like the Beatles and the Rolling Stones dominating lists of the best musicians of the past century, I’ve often wondered who will remain relevant decades from now. The other day, I read an old article written by journalist and author Chuck Klosterman about comedian and late-night TV legend Johnny Carson after the celebrity’s death. Klosterman said that Carson was a funny person, but that his true significance lay in the fact that he was essentially the last great cultural icon.

The connection between these two ideas may seem far-fetched, but trust me, they’re more related than you’d think. What Klosterman was getting at, I think, is that there was once a sense of a shared, collective culture and a perceived consensus, probably because media sources were limited. As he said, “There will never again be cultural knowledge that everybody shares, mostly because there is just too much culture.”

We’re led to believe that back in the ‘60s, everyone in America was watching when the Beatles performed “I Want To Hold Your Hand” on the Ed Sullivan Show. That’s probably not far from the truth. In the days of a handful of television channels and no internet in sight, the outlets through which popular music reached people were few, at least compared to today. Magazines such as Rolling Stone would put rock stars like Jim Morrison on their covers and tell us he was worth caring about. The majority seems to have listened.

Today, we have a seemingly infinite number of choices, an idea that Klosterman also draws on. With blogs, social media sites and the like telling us about this band and that band, this obscure singer and that under-the-radar rapper, it’s hard to keep up. I’m not saying there aren’t cultural figures that many of us still know and agree on, as there were in the past. After all, I think the passage of time explains why many of us can stand back and say that Jimi Hendrix was an amazing guitarist or that Led Zeppelin paved the way for the metal bands that followed.

I think, however, that the idea of a consensus is harder to believe when two-way communication dominates our interactions with the media. Today, there are millions of blogs and sites like Pitchfork or Live Music Guide devoted solely to music and reviews – individual listeners can argue back against mainstream pronouncements. So what if many magazines named Kanye West’s recent album the best of 2010? We don’t have to listen. With a click of the mouse and a flick of our fingers across the keyboard, we can tell the world that we disagree or agree or are completely confused or that, frankly, we couldn’t care less. The media is still telling us what to believe, but we can just as easily drown out its messages with our own opinions and voices. Now, we are the media.

It’s hard to compare the current cultural and music environment to that of the ‘60s and ‘70s. It can be argued that rock ‘n’ roll dominated much of those decades, basking in its newness and unfamiliarity. Today, the assortment of genres and subgenres that receive radio airplay has vastly expanded – hip hop and pop are more likely than rock to reach number one on top 40 charts. In that vein, we don’t even need to listen to the radio. We have Pandora, Grooveshark and the like. We can choose what we want to listen to and when we want to listen to it.

Perhaps these are some of the reasons the idea of the “greats of yesteryear” seems unattainable for our generation. To some extent I have always believed there was something of a consensus about the great musicians of all time, even if I personally disagreed with the choices. For example, even if I used to be a little less than crazy about Queen (don’t worry – I have since reformed), I feel like I understood their place in rock history. I knew better than to expect that every single person liked every single popular band, but still. So, here’s the question: years from now, will a list of the great rockers of right now (the future past, if you know what I mean) develop? Or is our cultural identity becoming less and less homogeneous by the day, to the extent that a consensus will seem as ancient as cassette players? It may be a little clichéd, but only time will tell.

-Margot Pien

Favorite Music-Centric Movies

Today, I offer a rundown of some of my favorite music-centered movies. My humble list doesn’t include popular films and documentaries like the Beatles’ “A Hard Day’s Night” or the Martin Scorsese flick about Bob Dylan, “No Direction Home.” I think, though, that the picks below show how interestingly music and film can blend, whether through fictional bands or semi-biographical retellings.

“Almost Famous” (2000)

Cameron Crowe’s flick about a teenage journalist who follows a rock band called Stillwater on the road during the ‘70s has resonated in the hearts and minds of many. With the unforgettable scene in which the bus full of musicians, managers and friends bursts into Elton John’s “Tiny Dancer,” the movie has cemented itself into pop culture history (did anyone catch that Super Bowl ad?). Apparently based on Crowe’s own adventures hanging with the Allman Brothers Band, the film has the perfect mix of an amazing soundtrack, lovable characters and skilled acting. Patrick Fugit’s portrayal of William, the teen who is eager to become pals with his rocker friends, and Kate Hudson’s role as Penny Lane, who leads a group of self-professed “band-aids” (not groupies, she claims), are both memorable.

“That Thing You Do” (1996)

I remember seeing this film, which tells the story of the quick rise and fall of a rock band in the ‘60s, at a young age. The poppy tracks that the fictional group – “The Wonders” – played caught my ear, and I remember loving the vintage nostalgia of the costumes and sets. As the small-town musicians rocket from obscurity to Billboard-hit fame, tensions among members inevitably rise and success is short-lived. Catch winning performances from Tom Hanks as the savvy manager, Tom Everett Scott as the sunglasses-sporting drummer and Liv Tyler as the spurned girlfriend of the egocentric lead singer.

“Blues Brothers” (1980)

Building on the success of an SNL skit, cast members Dan Aykroyd and John Belushi took their act to the big screen to make the hilarious “Blues Brothers” film. After Jake (Belushi) springs from a stint in jail, the brothers go on a mission to reassemble their old band, win a competition and save the Catholic home where the siblings were raised. My favorite scene has to be when the band, which dabbles in – you guessed it – blues and rock ‘n’ roll, books a gig at a country-western bar. After initially playing their usual material and being booed, the band switches to a rendition of the “Rawhide” theme and a Tammy Wynette hit called “Stand by Your Man.”

“I’m Not There” (2007)

Surprisingly, this is the only film on my list that profiles a real-life musician. This ambitious semi-biographical film takes a look at various stages of Bob Dylan’s life, along with different facets of his personality and music. A slate of actors portrays Dylan across various scenes, settings and interpretations; the list includes Heath Ledger, Richard Gere and Cate Blanchett, among others. The soundtrack is almost a character of its own, with a range of musicians covering some of Dylan’s best tracks. Some of my favorites are Roger McGuinn (of The Byrds) and Calexico’s “One More Cup of Coffee,” Cat Power’s “Stuck Inside of Mobile with the Memphis Blues Again” and Stephen Malkmus (of Pavement) and The Million Dollar Bashers’ “Ballad of a Thin Man.”

“Flight of the Conchords” (2007-2009)

OK, so I know this isn’t a movie, but I can’t leave out the unforgettable HBO show about a duo hailing from New Zealand. The show follows Bret and Jemaine as they go about their daily lives, trying to book gigs and penning hilarious songs along the way. The comedians/band members blurred the lines between fiction and reality, releasing two albums and going on tour. It’s hard to listen to their songs with a straight face, and they frequently parody different musical styles. Some of the best include the rap battle of “Hiphopopotamus vs. Rhymenoceros,” the song praising a girl for her mediocre looks in “Most Beautiful Girl in the Room,” and the tribute to fashion and hipsters, “Fashion is Danger.”

-Margot Pien

Horror Films Make Me Laugh

Another sector of the American media that continually befuddles me is — you guessed it — horror flicks. I guess one part of it is that I’ve never really understood the point of them. To scare you? I don’t like to be scared — is that such a weird thing to admit aloud? It seems like a common sense thing to me. “That movie was so good, I had nightmares!” …Right.

But it’s a moot point. I don’t find anything to scare me in most American horror films anyway. The only horrifying thing about them is their storylines (or lack thereof). No, that’s not true; the acting may be worse.

But as a writer, I automatically have the worst possible perspective on horror films. Find me a horror film that isn’t riddled with poorly-woven plots and unrealistic characters, and I’ll find you one without a saggy romance stuffed full of cheesy dialogue. Horror films are by nature sensationalist; they play to your emotions first and answer questions later (or not at all); and I cannot condone such behavior in storytelling.

The horror genre also falls into extremely predictable ruts. I feel like a seer every time I watch a horror flick because I can tell five minutes in advance when a scary, jump-out-of-your-seat moment is coming. The eternal army of netherworldly characters — zombies, vampires, werewolves, ghosts and so on — has lost its fear factor, as far as I’m concerned. Originality is hard to come by with horror flicks, but that doesn’t stop producers from squeezing out contrived crowd-pleasers year after year.

Consider another genre: war movies. Some would say that war movies are all the same: stoic stereotypes leading their motley but brave troops into battles in which one or all of them will die. True, war movies may all have similar plot lines, but at least they have good messages (as long as that message is “war is hell”), and their subjects deserve some respect. Plus, war actually happened. Horror films, on the other hand, have no point other than immediate entertainment.

“Serial killer” horror movies have some redeeming value because at least their subjects really exist (and thus are more frightening, but, wouldn’t you know it, that’s not a turn-on for me). True, not all horror films are equally awful, but I can’t deny that I notice the same trends in every horror flick.

I won’t say I’ve never been afraid or jumped up during a scary movie — “The Sixth Sense” scared the living poop out of me when I was little, and yes, I will jump once or twice if a horror film is any good whatsoever. But I think scary music is the one thing that really rattles me – and quick, high-pitched sounds can make anyone jump.

But as a rule, I’m done sitting there shaking helplessly during horror movies. Instead, I laugh at them. I admit, it can be pretty funny trying to predict what jump is around the next corner, or what uninspired character will die next. I just can’t take it seriously anymore.

– Tim Freer

Okay, so this year has been pretty tough on movies. We had a long series of summer flops saved only by the box-office success of “Inception,” and the fall-winter lineup didn’t look to be much more impressive. I was just about to throw in the towel and give up on 2010 as a year for movies when “The Social Network” came along. I’ll admit it, I was more than a little skeptical that the “Facebook movie” was already here, but from all accounts, it was supposed to be a good movie.

Well, all accounts were wrong. “The Social Network” isn’t a good movie; it’s a great movie, and possibly one of the most culturally relevant films to come out in the past several years. The movie follows Mark Zuckerberg, the creator of Facebook, from his days at Harvard trying to get into a Final Club through the creation of Facebook, its success and, ultimately, his legal battles with those who helped get him there, including his best friend and co-founder, Eduardo Saverin.

The movie moves quickly and keeps audiences captivated with some of the wittiest dialogue I’ve heard in years and a deft use of flashbacks and flash-forwards as narrative devices. The acting is solid, and the film boasts one of the most impressive scores ever. The crew race, set to a rock adaptation of “In the Hall of the Mountain King,” is quite possibly one of the best scenes I have ever seen and will make any film fan geek out with joy.

The movie tracks not only Zuckerberg’s path to becoming the youngest billionaire in the world, but also the growing social impact of Facebook itself. Let’s face it, it’s difficult to remember a time before Facebook came around. If you’re a senior, you were a sophomore in high school when it began; freshmen were still in middle school. Since then, it has changed the way people, especially college students, socialize.

Social networking used to be limited to personal blogs and sites such as MySpace; Facebook, however, took the whole game to a new level and now dominates not only the social networking scene, but the entire internet. Google recently released a list of the top 13 websites visited in 2010. Facebook was No. 1 with over 570 billion page visits, consuming more than 35 percent of all Internet use in the United States. The No. 2 website, Yahoo, received 70 billion page visits – a mere 12 percent of Facebook’s traffic.

Anyone who has a Facebook account should make a point to go see this movie, because whether you’d like to admit it or not, this movie is in part about you and how one little website created by a Harvard computer science geek changed the way you share your life with your friends, family and the world.

Also, buy the soundtrack. It’s amazing.

– Samantha Ryan

Commercials are Porridge

Before I begin, let me state publicly that I know how important advertising is to our economy. If you’re trying to attract business, you have to attract crowds — the larger the better, of course; hence the advertisements crammed onto every open wall and website across this fair nation of ours. We’re a capitalist nation. I get that.

But does anyone else find it the least bit insulting how mind-numbing commercials are these days? Let me be the first to admit that I am not materialistic by nature; I don’t look to commercials to be told what I want.

All the same, I find it difficult to see how people can overlook the constant spew of demographic-seeking, controversy-censored porridge that all our most memorable advertisements slop in the bowl for us.  That people are actually inspired to go buy these products is often beyond me.

I’ll admit it. The Geico commercials amused me for a while. The Emerald Nuts puns of yesteryear were entertaining while they lasted. But on another level, I acknowledge that, for what it’s worth, these are little more than crafty 20-second salesmen, only as sincere as their polished white smiles.

Padma Lakshmi for Hardee's

All latent stereotypes, overused jokes, and inexplicable celebrity references aside, advertising is supposed to sell a product for what it is. I have no qualms with straightforward, boring commercials that tell consumers up front what so-and-so’s services/products are and why they are better. At least they are honest.

No, it’s the sexy, curvy stereotype holding the hamburger (think Hardee’s), the dumbed-down joke that I’ve heard in five different movies (think cheap beer), the constant correlation between any product and being cool (think pretty much anything), that really bugs me. It bugs me because these ads affect people on levels beneath what they see on their screens, making them associate material goods with self-esteem. And the worst part of it is that all these ploys work so well.

– Tim Freer

This past Sunday, May 9 – Mother’s Day – marked 50 years since the female oral contraceptive pill hit the market. Today, there’s little argument about the widespread social and cultural effects of the “pill.”

The pill allowed women of childbearing age at least some control over their reproductive lives, and it provided women the opportunity to seek fulfilling careers by delaying childbirth to a time of their own choosing. But with widespread use came widespread debate. Proponents of the pill felt empowered; to them, the tablet equaled freedom. Opponents preaching chastity, especially those with religious affiliations, feared an increase in promiscuity and broken relationships. Despite the arguments, it was obvious that the pill had separated sexuality from reproduction forever.

This past week, UNC researchers released early data showing the promising effects of using ultrasound as a male contraceptive. The initial study, performed on rats, is a promising first look at whether ultrasound waves could successfully and reversibly deplete sperm in the male testes.

James Tsuruta, PhD, assistant professor at UNC’s Laboratories for Reproductive Biology and Paul Dayton, PhD, associate professor and director of graduate studies in the Department of Biomedical Engineering, led the study.  The Bill & Melinda Gates Foundation donated $100,000 to the research study.

Of course, most people know ultrasound machines are used for fetal imaging. It’s a complex process, but the general concept is simple. Ultrasound waves work essentially like sonar on a submarine. Imagine yourself in a deep canyon. You call out, “Hello, there!” Your sound bounces off the canyon walls and returns, “Hello, there!” as an echo. Ultrasound works the same way.

If the technique is eventually approved by the FDA, doctors would use an ultrasound probe, or “wand,” moving it over the body part – in this case, a man’s testicles – to create an effect entirely different from medical imaging. The procedure would most likely be painless; however, because sound waves travel poorly through air, a special lubricant (even K-Y jelly) may need to be applied to the skin to allow the waves to travel quickly and effectively. At this early stage, I have no idea whether that will be necessary, since the procedure is therapeutic rather than diagnostic in nature.

Inside the scrotum, the testes house numerous tubes called seminiferous tubules. These, in turn, contain the immature germ cells responsible for producing sperm. Using ultrasound waves, the scientists were able to deplete those immature germ cells, thereby eliminating sperm. Most interestingly, they were able to render the rats infertile for up to six months! After that span, the tubules repopulate with new germ cells and once again produce large amounts of sperm. And, unlike the oral contraceptive, ultrasound contraception is non-hormonal and, as of right now, appears harmless.

In fact, most research shows ultrasound waves are inherently harmless as a tool for physiological imaging. At the hospital where I work, it’s generally accepted that ultrasound scans won’t harm patients. But given the widespread opportunities this study could create and the relatively unknown territory scientists now find themselves in, further testing is warranted.

But this new research raises the question: Could we be seeing the beginning of a male sexual revolution?

Representatives from the Bill & Melinda Gates Foundation made it clear that they hope ultrasound contraception can be used in the first world and, perhaps more importantly, the developing world.

The implications of this discovery are widespread and could be tantamount to the empowerment the pill provided, offering developing countries a way to keep the dangerous economic effects of overpopulation in check. The use of contraceptives remains uncommon in many developing countries, and men in particular would need to be well educated about the benefits of the technology for it to catch on.

To recap, it appears that an ultrasound contraceptive could provide men in first-world and developing countries with an easy (receiving an ultrasound requires little preparation and would be like a routine checkup), effective “out” over a period of six months that is, at this point, harmless. These routine ultrasound scans, if obtained at the office of a general practitioner, could also give men a good excuse to receive full-body checkups, increasing the chance of detecting early signs of disease, including prostate cancer. Ultrasound machines are also relatively inexpensive, especially if they are refurbished. That should be useful in providing the therapy to developing countries, which are often the recipients of second-hand imaging and medical equipment. And the list goes on.

But what are the negative social and cultural effects that could follow the use of this discovery?  Could they be similar to the ones that followed the approval of the pill?  Would men around the world become more promiscuous, and are men in need of a sexual revolution in the first place?

And despite the ease of receiving these sterility treatments, could men be relied upon to obtain them on time? When I questioned a few of my female co-workers, the consensus was that they wouldn’t stop taking the pill, out of mistrust of their male partners to remember to get the treatment.

So, what do you think?

– Jonathan Michels

The Upfront Series is a series of blog posts leading up to the eponymous annual May ritual conducted by the television networks. Part 1 investigates the fallout of the WB/UPN merger in 2006.

Since it’s almost April, it’s really not too early to be looking ahead to May Upfronts. Of course, by that logic, it’s really not too early to be looking ahead to exams, summer school, internships, vacations and Harry Potter Status Day on Facebook (May 3rd. Be there.) But I digress. What are Upfronts, you ask? Why, let me explain.

Every year in the middle week of May, the television network executives gather in New York City to unveil their new fall programming schedules. Often, this is when primetime shows find out whether they are being renewed, getting the boot, or moving to a new timeslot.

For a television fanatic like yours truly, Upfronts are a wild, emotional ride.

Some years are more exciting than others. Take, for example, 2006, the year that the WB (owned by Warner Brothers) announced its merger with UPN (owned by CBS) to create the CW. I don’t know how many of you were emotionally attached to shows on UPN – I know I was addicted to Kristen Bell’s feisty “Veronica Mars” (yeah, that’s right, K. Bell played a teen sleuth before we all forgot her as Sarah Marshall). But I was more enamored with what the WB had to offer – at the time, “Gilmore Girls,” “Supernatural,” “7th Heaven,” “Everwood,” “One Tree Hill,” “What I Like About You” and “Reba” were the ones that stuck out most.

More importantly, though, the WB had concocted a legacy of what it means to be a teen drama. “Buffy the Vampire Slayer,” “Felicity,” “Angel,” “Dawson’s Creek,” “Roswell,” “Gilmore Girls,” “Charmed” and “Popular” have done more for network television than they’ve ever been given credit for.

2006-2007: Fan Favorites?

Fans of each of these shows worried whether they would make the cut to the CW. Who knew what UPN President Dawn Ostroff would turn her nose up at when she took over as head of the CW?

So fans rallied the way that fanatics of the Aughts often do: they created online campaigns, pelted the WB’s mail slots with physical mementos of their shows and bought DVDs in hordes. For the most part, it worked. Ostroff was too chicken – and still is – to experiment and create television that viewers may actually enjoy. She opted to keep every single WB show except “Everwood”; she renewed the already-cancelled “7th Heaven”; accepted “All of Us,” “Everybody Hates Chris” and “Girlfriends” – the most popular UPN comedies – into the CW clan; and strung along UPN ratings-darlings “America’s Next Top Model” and “Friday Night Smackdown.”

One new show debuted on the CW: “Runaway,” a drab family drama with an even drabber premise. “Runaway” made it to episode 3 before cancellation. I have a saying I picked up from a friend in high school: I’ll try anything twice. Usually, this means I’ll give any show at least two episodes before deciding whether to stick with it or dump it.

I turned “Runaway” off halfway through the pilot and never looked back.

2007-2008: One Year Later

The 2007-2008 season was far more interesting for the CW.

“Gilmore Girls” had ended a season too late. “Veronica Mars” had been cancelled a season too early. New comedy “Aliens in America” looked promising and comedic, in the vein of “Everybody Hates Chris.” “Reaper” looked halfway decent, a potentially modest hit that could find a cult audience. And book-adaptation “Gossip Girl” was garnering a lot of buzz.

“Gossip Girl,” of course, became one of the most talked-about shows of the season, due to its so-called racy sexual content. I know Josh Schwartz, the man behind “The O.C.,” also created “Gossip Girl,” but I find “The O.C.” to be much more daring than “Gossip.” Obviously, Fox aired the four seasons of “The O.C.,” and Fox has less-strict standards than the C Dub. But by christening “Gossip” as this daringly bold show when it was glaringly obvious that its predecessor had already broken that ground, Ostroff came off as desperate for viewers.

Well, then again, she had reason to be. After one year, her network had half the audience the WB did. On the WB, “Supernatural” hit a ratings high in January 2006 with 6 million viewers and averaged about 3.81 million for the season. Now, “Supernatural” flailed around the 3.14 million mark, a loss of nearly 20 percent. Worse, it slipped from #165 among all primetime shows in its first season to #216 in its second. The other shows were hardly faring any better.

“Gossip Girl” has only ever managed about 2 million viewers on average. It is frequently beaten out by “Supernatural,” “Smallville” and “One Tree Hill.” “America’s Next Top Model” averaged 5.4 million viewers in its spring 2007 cycle on the CW; by spring 2008, that number had declined by more than a million viewers to 4.23 million.

2008-2009: Ladies First

By the CW’s third year in existence, 2008-2009, it became clear that the low viewership meant the network’s best bet at eyeballs would be to become a niche network. In other words, Ostroff decided to target teenage girls as her dominant audience and hope the advertisers played along.

During the 2008 summer hiatus, sophomore series “Gossip Girl” took off as the epitome of the network’s creative ability. Marketers played up the sexual encounters, revealing clothing and amoral attitudes of the 17-year-old protagonists. Their efforts got the attention of the Parents Television Council, which protested the “inappropriate” OMFG campaign.

Dawn Ostroff smiled. Protests meant attention.


Ostroff’s fall development slate accordingly echoed the network’s newfound ideals:

  • A poorly-implemented rip-off/reboot of “Beverly Hills, 90210”
  • “Privileged,” an escapist but fickle drama about two spoiled teenage girls
  • “Stylista,” a reality show in which fashion enthusiasts vied for an internship with Elle magazine

Reliable Thursday night rocks “Smallville” and “Supernatural” received far less marketing attention because, as male-driven dramas, they did not jibe with the network’s new façade. The fact that their ratings were greater than those of any of the female-targeted dramas was hastily overlooked.

So, what were the results of this new attempt at female-audience domination? Well, “Stylista” aired all 9 of its episodes, after which the CW declined to order more due to low ratings. The more promising “Privileged” limped through its freshman year before the CW axed it. And “90210” pulled in an average of 2.24 million viewers throughout its first season – a colossal bomb that would have been pulled after the first airing on any other network.

What did the CW do? Renew it. And decide to revive “Melrose Place” from the ashes.

2009-2010: Potential?

With the CW closing in on the end of its fourth season, I think it’s okay to say “Melrose Place” is a bomb. But who really expected a success?

The one “creative” decision that has paid off for the CW is the book-to-screen adaptation of “Vampire Diaries.” Riding the coattails of the “Twilight” trend, “The Vampire Diaries” is actually the most successful show on the network right now, pulling in between 3.5 and 4 million viewers a week.

“Gossip Girl,” which still defines the netling, is struggling to meet the 2 million-viewer mark.

Recent midseason replacement “Life Unexpected” performs fairly well, sometimes crossing the 2 million mark or else settling comfortably into the audience level of its time slot co-habitant “One Tree Hill.”

“Supernatural,” which was meant to be a five-season run from its inception, has been renewed for a sixth year. Series creator and executive producer Eric Kripke stepped down when the announcement was made, handing over the reins to Sera Gamble. “Supernatural” draws about 3.2 million viewers a week.

“Smallville” has been renewed for – who saw this coming? – its tenth season! There’s really no killing Superman. Still, the sci-fi drama pulls in the viewers, despite being shafted into the Friday Night Death Slot this year.

“America’s Next Top Model” continues to struggle, but as long as the viewership remains above 3 million (a far cry from the 6 million it averaged in its first cycle in 2004), the CW drags its existence along.

One significant alteration the CW made to its schedule this year was to drop its half-hour comedy block. The CW now stands as the only one of the five broadcast networks (the others being CBS, NBC, ABC and Fox) not to air a single half-hour comedy. Even more importantly, though, this signals the decline of the “black sitcom.”

From 1995-2006, UPN served as an outlet for the black audience, while most networks were either too proud or too scared of failure to cook up a show starring black people. UPN’s greatest hits included: “All of Us,” “Eve,” “One on One,” “Girlfriends” and “Half & Half.”

Let’s play a little game, shall we? Let’s take a moment to count the number of television programs on network primetime today whose casts are primarily black.

Take a look at thefutoncritic’s Spring 2010 Primetime Grid and we can count together:

… I came up with zero. You?

Obviously, this is a big step backward for the minority trend on television. Think back to the 1970s and 1980s and early 1990s, when shows like “Sanford & Son,” “Diff’rent Strokes,” “The Cosby Show,” “The Jeffersons,” “Family Matters,” and even “The Bernie Mac Show” and “The Steve Harvey Show,” were popular with all audiences. You could go so far as to say “Cosby” and “Jeffersons” helped define pop culture of their eras. What’s going to define black television now? BET?

But I digress. The CW is no more to blame for the failure of minority television than any other network.

So what can we learn from all of this?

  1. The CW’s attempts to market itself primarily to the advertising demographics of 18-34 and 12-34 are a bust. Yes, the WB was successful at creating soulful dramas that attracted predominantly teenaged female audiences. But the WB didn’t necessarily target these audiences; it kept them in mind. That is a key difference. When the CW creates an original drama that has male leads, we will know it is taking a step in the right direction. Until then, I trust the network will continue churning out crap like “90210” and “Melrose Place.” Which brings me to my next point…
  2. Original thinkers are sparse at CW headquarters. How many shows are still leftovers from the WB and UPN? 4: “Supernatural,” “Smallville,” “America’s Next Top Model” and “One Tree Hill.” How many shows were adapted from book series? 2: “Gossip Girl” and “Vampire Diaries.” How many shows are reboots of old series? 2: “90210” and “Melrose Place.” So that’s the network’s entire primetime schedule: leftovers from the WB/UPN, book adaptations and reboots.

    How many original drama series are on the network at the moment? 1: “Life Unexpected.”

    If the CW wants to follow the success of the WB, which I personally believe would be a great step for them to take, they need to think more originally. “Dawson’s Creek” was hardly the first teen drama on TV, but is still a cherished series today, more than a decade after it began airing, because its legacy is one that teenagers of any generation can relate to. It broke new ground in its protagonist, its storytelling, its love triangles, and most significantly, its characters. Which brings me to my next point…

  3. Characters. Want to know why I can no longer stand “Gossip Girl”? None of the characters are likeable. I grew tired of Blair’s selfish antics, which were fun for a season but monotonous and unimaginative two years later. I’m sick of “I’m Chuck Bass” and Serena’s bad decisions; of little Jenny’s popularity contest; of Rufus and Lily; and of Nate, even though I can’t remember what the hell his character even does for the show, which is bad enough. The CW has yet to create a memorable original character for me. Lux on “Life Unexpected” has a shot at becoming memorable, but until the show improves upon its own plots – I feel like I’ve watched the same episode three times by the point of episode 5 – I have no encouragement to keep watching. None of the “90210” or “Melrose” girls are likeable, enviable, or someone I’d want to be friends with or keep up with. I think characters should be people you want to know in real life. Who on the CW do I want to know? Dean Winchester on “Supernatural” may just be the only one. (Gasp… a male?! Run, CW execs, run!)
  4. Audiences enjoy laughing. Yes, the CW’s attempts at comedy failed. But since they seem stuck on the hour-long format, why not give a dramedy a shot? “Gilmore Girls” was equal parts drama and comedy, and look at how well that blend worked out. With the success of ABC’s “Modern Family” this season, all of the networks’ development slates include more comedy than they did a year ago. Take advantage, CW. People are willing to give the half-hour laugh fest another shot.
  5. Girls like to watch guys. This is not groundbreaking science, CW. Try creating some male-centered dramas, à la “Supernatural” and “Smallville.” Take a cue from “The Vampire Diaries,” even. Just please stop throwing out nonsense like “90210” and “Melrose Place,” with their bitchy girls and no likeable characters.

Well, there you have it. Tune in next week for more network analyses and 2010-2011 predictions from the Upfront Series.

– Sonya Chudgar

Why I love midterms

Midterms officially kicked into high gear this week, so I would like to take a moment and explain why I love them so freaking much:

  • Multiple hours of slaving away inside Davis Library. It’s inspiring to know you’re studying in the same building where babies are made on the 8th floor.
  • Professors may deny their vanity over the course of the semester, but midterms unveil how vain they really are: They all try to one-up each other by scheduling as many midterms for you as possible in the same week. Sly dogs.
  • I lose sleep. The recommended amount is 7-9 hours a night, and I usually average 5-7. Midterms bring me down to 3-5 hours. But wait! This is a good thing! According to the economic law of diminishing marginal utility, the more you do something, the less satisfaction you receive from it, so less sleep = lower opportunity costs! Yay!
  • My coffee consumption triples, due to all-nighters and more studying on campus. I (sarcastically) told a friend a few weeks ago that I wanted to pick up a coffee habit, so… mission accomplished.
  • You have a greater chance of getting sick, since your mind and body are stressed so much. We have a pretty exciting epidemic to catch this year. A good “swine flu” story is so much more badass than an “I got a cold” story.
  • There’s very little space to study on campus. I especially love this, because I can then spend multiple hours wandering from Davis to the UL to Graham Memorial in an attempt to find an open study spot. From Graham Memorial, I often go to the Union. If the Union is full (it usually is), I’ll try to find an open classroom. But if it’s after 9, the buildings are locked. At this point, I check the Caribou Coffee on Franklin Street, but all the tables are taken there, too. Thus, I must cross the street and enter my nemesis, Starbucks, looking for an empty chair. Of course, there is none. I wander back onto campus, this time into Hanes Art Center, but I don’t know why I bother; there’s hardly any study room in this building, and oh yeah, it’s locked. So I get in my car and drive back to my apartment and go to sleep. Studying is exhausting.

-Sonya Chudgar

Have you Googled yourself today?

I realized I was finally growing up when I developed a new obsession. Gone are the days of outrageous shopping (I used to live at Nordstrom), constant TV watching (I totally forgot about the Grey’s Anatomy season premiere), and spending my Saturday afternoons enthralled by hours of college football (even Carolina games are trying my patience – Hakeem Nicks, we need you back). Instead, I find myself constantly tweeting, looking for connections on LinkedIn, and (you guessed it) blogging.

I confess: A year ago, I didn’t even know what Twitter was. Even a few weeks ago, I didn’t have any real interest in blogging. But as of Tuesday, I am officially captivated. My name is Amy Dobrzynski, and I am a social media addict.

It all started when Kelly Giles (former president of Blue & White) came to talk to the Carolina Public Relations Student Society of America. And it probably didn’t help that I was surrounded by seniors who were all starting to freak out about getting (or not getting) jobs. But when I learned that it’s not really your impressive resume that lands you the job, I was floored.

So, humor me for a second. Open up Google. Type in your name. What kind of results show up?

I am one of the lucky ones. There is only one Amy Dobrzynski out there, and you’re looking at her. My Google results are pretty straightforward (Twitter, LinkedIn, Facebook, my blog, various entries from The Daily Tar Heel and Blue & White, etc.). But I’m guessing most of you aren’t having the same luck. And chances are that your future employers are going to be Googling you as well.

So how do we change the lack of Google results and increase our chances of getting hired? By developing an online presence through social media.

Here are a few tips from an (albeit fairly new) social media enthusiast:

1. Facebook – keep it private. You are not going to stop your friends from posting inappropriate things on your wall, and you definitely don’t want to risk your employers seeing it. Facebook should be your social outlet, so let’s keep it that way. (Side note – University Career Services friended me on Facebook the other day. I didn’t even know they knew what that was.)

2. Twitter – I know we all love bashing Lindsay Lohan, but what does that say about you as a person, besides the fact that you are up-to-date on your celebrity gossip? Tweet about something that interests you (and that future employers will find interesting, too). Business major? Retweet something from the Wall Street Journal. English enthusiast? Tweet about the great book you just read.

3. LinkedIn – You have tons of connections at your fingertips, and I bet you didn’t even know it. Have you had an internship? There’s a good chance your employers are on this networking site. I’ve even found a couple of teachers on LinkedIn. But make sure you actually know someone before you start going connection crazy. No one likes random friend requests on Facebook, and professionals definitely don’t like them on LinkedIn.

4. Blogs – This is your chance to show off your own personal style, so have some fun with this! Keep it professional, of course, but make sure you show off your creative side.

5. Stay classy – If you’re going to use pictures on any of these sites (and you should), make sure they represent you, and how classy you are. No red cups, please.

6. Start Early – I had a quarter-life crisis this summer. I turned 20 and could no longer use “but I’m a teenager!” as an excuse. Now, I’m a junior and I’m already freaking out about finding a job in this economy. Don’t wait until the last minute to get caught up on social media – you will only put more pressure on yourself!

7. Branding – Create a brand for yourself. For non-PR majors, a brand is something you use to market yourself to employers. Pick a few adjectives that describe you and then incorporate them into all of your social media outlets. Make sure you use a consistent name. Employers don’t have time to go searching around for you. If you need help with branding, UCS is having a Personal Branding seminar Sept. 29.

-Amy Dobrzynski

Bad blood


Vampires. Love ‘em or hate ‘em, the beasts have sunk their fangs deep into pop culture and don’t intend to let go anytime soon. Vamps have been the “it” factor for more than a year now, but for me, vampire fatigue set in long ago. I’m tired of hearing about the overhyped “True Blood.” I deride The CW for launching yet another book-to-television revival in the form of “The Vampire Diaries.” And any mention of Robert Pattinson or Edward Cullen makes me want to chew my face off.

So how do we cope with pop culture failing us when we most need a new trend to talk about? Through mockery, of course. Here’s a rundown of eight ways to use the vamp trend to reinvigorate television.

Show Most in Need of a Vampire:

“Grey’s Anatomy”
This snoozer proved last season that it doesn’t mind scripting undead characters (hello, Denny). Ellen Pompeo’s whiny Meredith checks out early this season due to maternity leave, and Katherine Heigl recently announced she’ll skip five episodes to film the movie “Life As We Know It.” So what better time to throw a fanged villain into the mix? McBloody, anyone?

Runner up: “American Idol”

What you didn’t know was that Fox passed up this guy when it offered Ellen DeGeneres Paula Abdul’s chair:

Count von Count

Tough loss.

Show Most in Need of a Vampire Slayer:

“The Vampire Diaries”
For obvious reasons.

Runner up: “Bones”
It’s the perfect solution. Buffy slays Dr. Temperance Brennan, and then she and David Boreanaz’s Agent Booth (a.k.a. Angel) end up together at last.

Show with a Vampire that just needs to come out of the closet already:

“Gossip Girl”
I’m talking to you, Chuck Bass.

Runner up: “Lost”
Sawyer’s really an escapee from the “True Blood” clan. Just listen to his accent.

Network Most Likely to create a Vampire Crime Drama:

CBS
They already did it once: “Moonlight” (cancelled after just one season in 2008)

Runner Up: CBS
I hear “CSI: Forks” is already in the works.

Character Most Likely to Date a Vampire:

Sam Winchester, “Supernatural”
Sammy’s past hook-ups include Ruby, a demon who ultimately convinces him to open the doors to hell and thus jumpstart Armageddon; Madison, a girl who unknowingly morphs into a werewolf at night; and Jessica, Sam’s fiancée who was murdered by the same demon that killed his mother. Dating a vampire would be a tame choice for a change.

Runner up: Dwight Schrute, “The Office”
A) Dwight already believes in vampires (see season three “Business School” episode)
B) He is only 99 percent sure Ben Franklin is dead. 99 percent fact, 1 percent imagination.
C) He could defend himself against any possible attacks: He has the strength of a grown man and a little baby.

Character Most in Need of a Vampire Boyfriend/Girlfriend:

Vince Chase, “Entourage”
Anything to raise his faltering actor status, right?

Runner up: Annie Wilson, “90210”
Annie’s dumb. And this show is boring. It could use some “life.”

Character Most Likely to Kill a Vampire:

Jack Bauer, “24”
Terrorist or not, that bitch is going down.

Runner up: Echo, “Dollhouse”
Eliza Dushku’s Echo can be programmed to take on any persona and thus perform any action. So, obviously, the Dollhouse need only program Echo to become a vampire slayer, and Faith is back in action.

Show Most Likely to Introduce a Singing/Dancing Vampire:

“Dancing with the Stars”
Hey, if they’ll cast Tom DeLay, they’ll cast anybody.

Runner up: “Glee”
As long as it meets the criteria: It sings! It dances! It’s a misfit in high school!

-Sonya Chudgar