If you think that shows are getting better, then how about reality shows, where people's worst traits are paraded on television? I absolutely hate those shows. Sure, some shows try to be "intellectual" and have decent story lines, but in general, the ones that get publicized the most are the ones without much substance. Take America's Next Top Model, for instance. What does it teach us? It teaches us that women should be thin, tall, and beautiful. It teaches the next generation of girls to be superficial and to judge only by appearances. Then you end up with a bunch of girls trying to mimic top models. I thought magazines did enough of this propaganda, but apparently that isn't enough; television has to do its share as well.
Other shows, like Survivor and The Amazing Race, do a great job of portraying people's worst sides. If you ever sit down and watch those shows, it's always people arguing, fighting, and swearing, and in the end, what's their motivation? To win money.
The trend toward distinct storylines is becoming more apparent, but what I don't get is this: despite shows getting better and probably using better language, the majority of the population still doesn't talk that way! Maybe people just aren't absorbing the best parts of television shows. That's rather disappointing.
I may be biased, and probably am, but The West Wing was probably the best show ever. If you want to talk about strong plots and distinct story lines, it had it all. And if you wanted to learn how to speak proper English, that show was the right fit for you. It also showcased many of the societal and political issues facing America. For me, no show has really been able to match up to it. Perhaps its successor is Studio 60 on the Sunset Strip. We shall see.