I watch a lot of movies, and I've got a question about something I see all the time. I'll be watching along, enjoying myself, until a certain kind of character shows up. This person has bad hair and dresses one step above homeless. They're socially maladjusted and sexually frustrated. They're sarcastic, smart and bitter, and probably addicted to some non-glamorous drug like nasal spray or Chapstick. The odds they suffer from a nervous GI disorder—usually IBS—are good enough that no sane man should take that bet.
I start cringing right away. "Wait for it, wait for it," I'm thinking, and then someone onscreen asks this character, "So what do you do?"
And this poor sucker answers, "I'm a writer." And the audience nods knowingly, chuckles, and "gets it." Of course they're a writer.
I was complaining about this not too long ago, and my son made an observation. "Uh, yeah, but Dad, a writer wrote this movie. So why do writers always portray themselves as losers?"
Indeed. Who else gets to control their image so completely? Athletes, politicians, movie stars ... none of them get to control what gets said about them. We writers literally control the levers of our whole culture. There's an entire industry of public relations people dedicated to influencing what writers say and think. So when it comes to writing about ourselves, you'd think all fictional writers would be taut, fresh-faced, nubile, morally flawless, and dentally perfect geniuses. Instead, we get Charlie Kaufman and Liz Lemon.
So I'm curious: Why do writers always portray themselves as the kind of schlub you warn your kids away from?