May 31, 2010

Unlike other toys, video games belong to a certain generation rather than an age group

A new study by market analysts NPD shows that the average video game player is 32, up from 31 last year. That is a dramatic change from the Nintendo days when the audience was mostly pre-pubescent kids. Curious, I searched the NYT for "average age" and "video game," and it turns out that the average video game player is whatever age someone born around 1978 would be at the time of the survey, from 1990 to 2010. This is only for home console game systems like Nintendo, Sega Genesis, PlayStation 2, etc. Computer game players tend to be 5 to 10 years older, but still their average age is however old someone born in the late '60s or early '70s would be.

Contrast this with the age profile of people who play with action figures -- their average age stays virtually the same across the decades, somewhere between 5 and 13 I'd guess. If a boy is too young, he doesn't have the motor coordination to manipulate the action figures, and he'd probably get too scared by dragons, monsters, and so on. Once he hits puberty, though, "toys" in general are no longer cool and he turns instead to music and movies.

Thus, action figures stay fairly constant in their basic features, and individuals adopt them or junk them as their own nature changes through life's stages. Video games, however, adapt themselves to whatever the 1978-born game player would most enjoy; they follow him through life's stages.

For instance, when Nintendo ruled the video game world, games focused mostly on imagination and exploration, true to what a 9 to 11 year-old boy would dig. (He was too young to play a bunch of terrible Atari games in the early '80s and become soured on video games forever as a result.) By the time he turns 15 in 1993, video game developers are making games that allow him to pretend he's a badass in an adolescent way -- more cursing, more defiant attitude, getting into bloody fistfights like he wishes he could with the popular guys in high school, etc. Thus is born Mortal Kombat.

When he goes off to college at 18 and begins in earnest to try getting laid, video game makers release Tomb Raider. The protagonist is a butt-kicking babe whose reliable presence consoles him on those many dateless nights. He's also at the age now when he'd think about joining the army to go shoot a bunch of shit up, and so the first-person shooter games begin to proliferate on home consoles with 1997's Goldeneye. (Wolfenstein 3D and Doom were computer games, and recall that computer game players are 5 to 10 years older.)

Throughout the rest of his 20s he still has a taste for shooting people, joining gangs, and committing thrill-seeking crimes. This accounts for the explosion of first-person shooters and play-as-the-criminal games like those of the Grand Theft Auto series during most of the 2000s. For those 15-24 year-old males who prefer daydreaming, the cornucopia of role-playing games in the late '90s through the mid-2000s offered an escape.

Still, as he approaches and then passes 30, his adolescent / young adult taste for psychopathy begins to fade and he thinks about settling down. He starts to complain that "all first-person shooters are the same old boring game," that Grand Theft Auto IV wasn't all it was hyped up to be, and so on, to rationalize his own dwindling taste for psychopathy-oriented games. He now turns more to games that mimic the activities he'd be engaged in if he'd actually bothered to have children, such as go-kart racing, hunting and fishing, sports that can be played in a park, and guitar lessons, mostly available on the family-oriented Wii. And now that he's not young anymore, he tries to keep in shape with fitness "games" -- also on the Wii.

If he had had kids, this is the age during which he'd think about what toys and such to get them, so he naturally recalls the entertainment of his own childhood. Game developers respond by reviving previously dead franchises in the original style that he discovered them in, such as Megaman 9 and 10, or perhaps just made prettier by current technology, such as Bionic Commando Rearmed. An avalanche of classic games that he played as a 10 year-old are re-released for download on current systems, such as the Wii's Virtual Console.

Looking ahead, we can predict a rise in the 2020s of golf simulation games, in the 2030s of gardening-themed games, and from the 2050s onward of virtual reality games where he gets sponge-bathed by large-breasted blonde nurses.

I can't think of another major form of entertainment -- billions of dollars a year in revenue -- that shifts its shape to so closely match the tastes of a cohort that's at most 3 years wide (about 1977 to 1979). Movies, TV shows, and popular music all have niches that tag along with this cohort or that -- metal reunion tours that cater to increasingly older audiences -- but in no other case does the entire industry attend to a single generation, no matter how old its members grow. As long as video game developers can adapt their products to the changing tastes of this cohort, they'll remain profitable. Still, they would become even more profitable if they diversified their audiences to the extent that other industries with niches do. They could still release games that suited the 1978-born game player, but they'd also have a steady schedule of imagination-based games for the large number of 9 year-olds who enter the market every year. Once more Nintendo appears to be leading the charge here, with plenty of games on the Wii and especially on the handheld DS designed with children in mind.

Fundamentally video games are toys, and so the best video games will tend to be made when the actual audience for these toys is close to the ideal audience for toys, namely 5 to 13 year-old boys (not girls). Everyone who has experienced video games from the 1980s through today recognizes the period from the mid-'80s through the early-mid-'90s as the golden age. That includes Nintendo, Genesis, Super Nintendo, TurboGrafx, and the arcade games of the same period. The quality of the average video game, regardless of outliers, started to slide once video games were targeted at the audience of adolescent losers and then college-aged shut-ins. The average quality has only tumbled further as they've begun to appeal to men who should be raising a family. If this pattern continues, things will only get worse -- just think of what action figures would look like if the average owner was in his 40s. Worst. Toy. Ever.

May 28, 2010

Where does food poisoning really come from?

There's pressure on the USDA to monitor meat for more strains of E. coli than it currently does, lest there be a repeat of the Jack in the Box burger outbreak of 1993. But anyone who actually pays attention to what foods have been recalled for poisoning knows that meat contributes very little, and that this is just more animal-phobic balderdash from the government.

For the most recent year of available data, here is the CDC's press release about food poisoning across different types of food. Ignore the number of outbreaks, since that conflates an "outbreak" of 3 people with an outbreak of 3,000 people; focus on the number of cases instead. Here is a one-page PDF with all the numbers so you can see how all 17 categories of food contributed.

The leading contributor is poultry, in particular baked poultry that was clearly not cooked long enough or at high enough temperature to kill bacteria. So much for red meat being the main killer, and so much for the superiority of baking to frying. When you bake chicken or turkey, you typically don't carve it up beforehand, so the surface area affected by the heat source is pretty small compared to when you cut up the chicken into pieces before throwing them in some oil in a skillet. And the primary infection that comes from undercooked poultry is C. perfringens, whose symptoms don't sound so bad as far as food poisoning goes.

Just about as common, though, were cases caused by leafy green vegetables and fruits/nuts -- and here we're talking E. coli and Salmonella, respectively. Combining the prevalence and severity of symptoms, plant foods were far more harmful to Americans than poultry was, let alone even more harmless sources like beef, dairy, and eggs. More recently, this month there was a major recall of romaine lettuce products containing E. coli, and the homepage at FoodSafety.gov currently displays a picture warning about Salmonella in alfalfa sprouts.

Were the spinach, peanuts, lettuce, and sprouts responsible for these outbreaks heated long enough and at high enough temperature to kill off bacteria? Of course not -- then they would no longer count as "fresh," the euphemism for "raw" in the context of non-animal vs. animal products. Eating deep-fried alfalfa sprouts would make you look like a weirdo, not the eco-friendly saint that you want to be.

As the numbers show, food poisoning is somewhat rare, with fewer than 1 in 10,000 people affected, although unrecognized cases would drive that estimate up. But if we do consider that rate and the average severity of symptoms to be bad enough that "something must be done," we should remember who the real culprits are. Contrary to vegetarian propaganda, plant foods -- especially those "fresh" kinds we're supposed to eat more of -- are the worst offenders.

May 27, 2010

Government welcomes Apple to big boys club with kick in the balls

Now that Steve Jobs' company is worth more than Bill Gates', the antitrust army has grown bored of lobbing bricks through Microsoft's windows and decided to poison Apple instead. (Here and here.)

Modern antitrust suits as a general rule are bogus; the companies under attack are typically responsible for falling prices and improving quality, all the way back to the first railroads and Standard Oil. They serve instead as a ritual to show whoever the fittest companies are that they'd better watch their behavior or else. In a market economy, this is unnecessary because firms that harm consumers will be driven out of business by competitors, and to the extent that these lawsuits tie up the companies' time, money, and manpower, they parasitize economic health.

But in a society ruled by competitive politics, government agents can only get away with what the public will allow, and will pursue what the median voter desires. If they don't, they'll get thrown out of office, or the elected official who appointed them or their party will be punished. The average person has so little faith in the ability of market competition to discipline how companies behave that they're willing to tolerate -- and often cheer on -- the antitrust "watchdogs" who only screw up the products that consumers love.

The price of going to the movies didn't get any cheaper when the government banned Hollywood studios from owning theaters in which to show their movies, and this is the source of the high price of popcorn and other concessions at the movies that consumers always complain about. Pretty soon it'll be harder or more expensive to get mp3s from iTunes, but people have such a deeply rooted fear of any big organization that they're willing to pay that higher price if it means someone gets to knock Apple around so that they don't get too big for their breeches.

May 25, 2010

Your first celebrity crush?

Looking through all 24 pages from this forum topic on who your first celebrity crush was, I'm struck by how few "high school sweetheart" types there have been in popular culture since the society became safer and duller after 1991. The earliest quasi-crush on an actress I had was for Stephanie from Full House, but she and I were too pre-teenish for it to last. My first head-over-heels crush was for, you guessed it --


God damn... now that's gonna leave a mark. What about yours?

Even though I didn't see many of the other girls from the '60s through the '80s listed in that forum topic when they first became famous, I can still appreciate them since they're all cut from the same mold. I hardly noticed her when I saw The Karate Kid as a small child, but if I had been older I totally would've fallen for the all-American Elisabeth Shue, barely 20 years old, springing around in those gym shorts and powerless to hold back her smiles. But then in the early-mid '90s the "it" girls start to get more frigid and annoying, masculinized in the face, and tending toward the butt-kicking babe of geek fantasy (now that the mainstream guy has geek tastes). There are a couple of exceptions, like that Alex Mack chick who my younger brother had a huge crush on, and later Jennifer Love Hewitt. But overall the fragile, energetic sweetheart who was born to make babies has faded from the cultural landscape.

It seems that in dangerous times, we're in hurry-it-up mode and want females who look like they actually want to have children -- sooner rather than later, and more of them -- and who appear faithful and nurturing. It's only during safe times, when our time horizon moves much farther out, that guys come to like bitchier and colder women whose careerism and mercenary attitude toward sex mean that you'll be lucky to squeeze any children out of her at all, and even then much later and not so many in number. If she's not that maternal and caring, meh, so what -- the world's safe enough that she won't need to protect them from that much anyway. In a safe world, kids basically raise themselves, so mommy has more free time to go beat up aliens.

It makes you feel bad for guys who were born after 1984 or so, since they got their first huge crushes (at age 9, let's say) around 1993 or after, when the girls available for them to imprint on were pretty bland, like Topanga from Boy Meets World, Pamela Anderson, and that chick from Resident Evil. We could test this by surveying guys of all ages and seeing how much variability in responses there is based on when you were born. If you came of age when there was an abundance of girls who naturally fascinate guys, your cohort will have lots of girls listed and each one will command a decent share of the cohort's affection. There might be 10 "it" girls each with roughly 10% of the share of affection. However, if you came of age when there were slim pickings, your cohort would latch onto the few pure specimens of femininity. There might only be 2 "it" girls, each with roughly 50% of the share of affection.
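To make that survey test concrete, here is a minimal sketch in Python of a Herfindahl-style concentration index over each cohort's answers -- low values mean affection was spread across many "it" girls, high values mean a couple of girls dominated. The names and counts below are made up for illustration, not actual survey responses.

    from collections import Counter

    def concentration(responses):
        # Herfindahl-style index: sum of squared shares of each named crush.
        # Ten girls at 10% each -> 0.10; two girls at 50% each -> 0.50.
        counts = Counter(responses)
        total = sum(counts.values())
        return sum((n / total) ** 2 for n in counts.values())

    # Hypothetical cohorts; the names and counts are placeholders, not data.
    born_mid_1970s = ["Shue", "Gibb", "Ringwald", "Shue", "Gibb", "Ringwald",
                      "Fawcett", "Curtis", "Basinger", "Alt"]
    born_late_1980s = ["Topanga", "Anderson", "Topanga", "Anderson",
                       "Topanga", "Anderson", "Topanga", "Anderson"]

    print(round(concentration(born_mid_1970s), 2))   # low: spread across many girls
    print(round(concentration(born_late_1980s), 2))  # high: a couple of girls dominate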

That looks like the real world, with the explosion of make-you-lose-sleep girls in the '70s and '80s, all of them about as popular as the others, compared to the tiny handful of category-dominating girls in the '90s and 2000s. The '60s didn't have too many either, but I attribute that to the under-developed state of mass pop culture at the time. If the markets for TV, movies, and pop music had been as mature as they became during the 1980s, I think the '60s would have had a lot more girls like Andy from The Goonies. Still, it wouldn't have been as great as during the '70s or '80s because the crime wave was only getting underway during the '60s and people had not totally converted to the "life is short, let's have fun" mindset.

May 24, 2010

Making sense of the hook-up culture

Most of the earnest, worried stories I read about "the hook-up culture" I immediately tune out. After all, these are stories about how supposedly common it is for teenagers or college students to hook up -- to engage in some unspecified level of sexual activity -- at a time when kissing, pawing, jerking, fingering, and fucking have all but gone the way of the drive-in movie theater, especially moving from one partner to another.

However, there's an article in the current New Yorker that I skimmed through, and it doesn't make any such bogus claim about how prevalent promiscuity has become; it asks instead how strong or weak the emotional and romantic bonds are between those teenagers who are sexually active. (The author is not aware, though, that these are a shrinking minority of teenagers in general.) Her concern is over the impression that those who hook up aren't bound to each other psychologically, that they don't see anything magical or liminal in embarking on a sexual voyage, and that instead their mindset is more like, we might as well just get this outta the way, i mean it's not like it's a big deal or anythinggg.

I can see that -- the culture as a whole is a lot less romantic than it was during wild and dangerous times (roughly the late '50s through the very early '90s), when looming threats turned people's minds toward reproducing before something out there did them in. During safer times (from 1992 and counting) people feel secure enough to delay the various rites of passage that lead to baby-making. This fall in promiscuity and early sex is what shows up in the statistics I linked to above. BTW, the 2009 data will come out on June 3, and I'll formally predict that the promiscuity measures will show no significant increase from the 2007 data.

These data would make the worry-warts ecstatic, if they would bother to look them up -- "Oh thank god my little girl isn't messing around! And hey, whatever it takes to achieve these plummeting levels of promiscuity, let's do it." What they don't realize is that when people are less hormone-crazed, that has effects not only on how carefree -- or reckless -- their behavior will be, but also on how able they are to fall irrationally in love, to feel as though each act were a vehicle for leaving behind the world of the ordinary, and to want to suck all the juice out of each encounter rather than do it just to get it over with. (These differences show up even in porn made during safe vs. dangerous times.)

Anxiety-stricken parents and social workers could realize this by reflecting on the stage of life at which a person is most capable of forming an uncontrollable emotional bond to a sex partner -- you are most likely to fall head over heels in love when you are also the horniest, namely as an adolescent or young adult. The adorable type of love that older couples have is nowhere near as intense, and they're probably glad that it isn't. They lost enough sleep and sweated enough buckets through their palms when they themselves were young. There's nothing cute and innocent about young love, though -- just remember how out-of-your-mind you were and how desperately you would have acted if that's what it took to win over that cute girl who sat across the aisle from you in science class.

I think this is also behind the perception that popular culture, at least music, has become skankier. The content is no less risque than it was during the '60s through the '80s, and female performers have been wearing suggestive clothing since at least the '70s. Perhaps what people are sensing is the performer's lack of soul and mercenary attitude toward sex and her own body. And ditto male performers lacking soul and looking at women as strippers. Both trends come together in that terribly boring and undanceable song "My Humps" by the Black Eyed Peas. Back when people actually were promiscuous, pop songs such as "Like a Virgin" and "Little Red Corvette" showed girls who were eager to give in to their desires, rather than frigid women from whom sex can only be wrested through a financial transaction. Sex in those songs was something uniquely capable of sending the couple into a state of ecstasy -- not like someone's lunchtime routine of inserting a dollar bill into a vending machine, pecking at a keypad, and picking up the candy bar that gets dropped.

If the wild or tame strategy paid off more than the other no matter what, then one strategy would gradually replace the other. Instead, we see cycles from mostly wild to mostly tame and so on again. That must mean they pay roughly equal rewards on net, the wild strategy having dazzling benefits but greater costs, and the tame strategy having more meager benefits but smaller costs. But no one wants to accept that there are trade-offs for important matters in life, that the universe appears to conspire to keep them from having too much of a good thing. When times are dangerous and people wild, worry-warts crusade to curb promiscuity, which predictably results in shallower emotional attachments between lovers as people become less hormone-crazed. Then the crusade moves to curbing psychologically distant hook-ups, which will predictably be accompanied by a rise in how horny people are, thus driving promiscuity back up again.

As long as the audience members don't call them on it, professional and lay moral crusaders have a guaranteed full employment plan. Only by telling them to grow up and accept that life has many trade-offs will we clear our minds of the noise of their incessant hand-wringing.

May 23, 2010

Cover songs: A case study of greater competition and specialization failing to improve quality, which applies to cultural markets generally

Since the 1991 death of fun pop music, you'd think that new artists, unable to make quality songs of their own, would at least be able to make enjoyable covers of ones from a better era. Yet even recording a great cover song is hard -- just ask any 13 year-old kid with a guitar who mostly plays other people's songs. It's not just a matter of skill, since some of the originals do not showcase a virtuoso performance. It's more a matter of getting into the right state of mind and being able to channel the original in your interpretation. When the current zeitgeist clashes too much with the one in which the original was written, it is nearly impossible for the new artist to get it right.

How can we see this? An objective way is to look at the chart-topping songs of some year and see whether they are covers or not. Sticking just to the Billboard charts, 1987 saw quite a few cover songs that were good enough to make it to #1, whereas no covers in 2005 were that good. (I chose those years because 1987 is a local peak before the 1991 shift to bad music, and 2005 is a local peak after the shift.) The fraction of all chart-topping songs that are covers is always going to be low, sometimes 0, but this measure points us to where that minority of great cover songs lies.
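To make the tally concrete, here's a minimal sketch of the measure in Python; the song entries are placeholders, not the actual Billboard chart data.

    # Sketch of the measure: what fraction of a year's #1 hits were covers?
    def cover_share(number_ones):
        # number_ones: list of (title, is_cover) pairs for one year's chart-toppers.
        covers = sum(1 for _, is_cover in number_ones if is_cover)
        return covers / len(number_ones)

    # Placeholder entries standing in for a year's worth of #1 hits.
    year_x = [("Song A", True), ("Song B", False), ("Song C", True), ("Song D", False)]
    year_y = [("Song E", False), ("Song F", False), ("Song G", False)]

    print(cover_share(year_x))  # 0.5 -- some covers were good enough to reach #1
    print(cover_share(year_y))  # 0.0 -- no covers topped the chart that year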

Here is a subjective list of 50 great cover songs from The Telegraph, and the truly good ones are almost entirely from the late '50s through 1990 period. In Wikipedia's entry on cover songs, again notice the attention given to ones from that period, and how little focus is given to covers from roughly the past two decades.

Good cover songs don't benefit from being close to the originals in time or style. Some like Elvis' "Blue Suede Shoes" or The Byrds' "Mr. Tambourine Man" were released shortly after the originals and are fairly similar stylistically. But plenty of others like Tiffany's "I Think We're Alone Now" or Soft Cell's "Tainted Love" were released 15 to 20 years after the originals and belong to different genres. What they share is a wild and fun-loving cultural and social environment, which has effects on both the supply and demand sides: the music-makers can easily step into the mindset of one another, and the audiences will have similar tastes from a bird's-eye-view. When the zeitgeist is ruled by self-conscious killjoys, artists will find it impossible to enter into the mind of some other good songwriter, and the audiences wouldn't want such a product anyway -- that would be too retro.

This shows how little the quality of music means to most consumers. They would rather listen to newer garbage than older hits. During fun-music times, there is no trade-off between originality and quality, but in boring-music times there is. If quality mattered more, the pop music industry would be like that for classical music. Rather, all you hear on the radio, in clubs, in Starbucks, on TV, and what gets downloaded on iTunes is new garbage. (In addition to mopey and sappy stuff from the '90s and 2000s, Starbucks also plays music from the boring period after the Roaring Twenties but before the late '50s. They are exact in their aim to avoid playing exciting music.) You're lucky if the dance club plays the originals of "Blue Monday" or "Personal Jesus" rather than the sleep-inducing covers by Orgy and Marilyn Manson.

It's not an original insight to point out that most people in modern societies use music as a tribal membership badge, but it looks like this is not just one function of music -- it is the primary function for most people. If an item has already been used widely as a membership badge by some other tribe, and if people today are generally aware of its has-already-been-used status, then it will not be special and unique to a current tribe. That's why they draw their membership badges from current movies, TV shows, songs, etc. -- no other tribe has already used them.

In safe and boring times, tribal turf wars are more frequent and petty, unlike during wild and dangerous times when people set aside some of their differences to stand up to a larger common threat. (We can use the homicide rate to measure how safe vs. dangerous an environment is.) When tribalism increases, it's even more useful for them to use bad cultural products as their membership badges -- anyone who's not very committed to the tribe can enjoy good culture, but only those who are serious will pay the high cost of listening to Fall Out Boy and watching Family Guy.

These dynamics are what keep most cultural products from constantly improving in quality as market competition and specialization proceed, as opposed to things like batteries or light bulbs that people don't use to signal tribal membership; there, prices fall and quality improves steadily over time. Of course, as more and more things begin to fall under the list of junk that people use to "say something about their personality and lifestyle" -- really, what tribe they belong to -- the gains in our standard of living will stall out or even reverse, as with our deteriorating health, which has accompanied the gradual shift, starting about 30 years ago, to a mostly vegetarian diet. It is now low-status, backward, and not respectable to eat the mostly animal-based diet that homo sapiens is adapted to.

Economists are right to say that more competition and specialization tend to lead to things of greater value to consumers, but that neglects these increasingly large domains where what's of value to consumers is not objective-although-fuzzy quality but rather tribal affiliation.

May 20, 2010

Movies lied to me about Los Angeles

After spending a few days in the San Fernando Valley to attend a graduation, I'm struck by how unprepared for it I was, based on my cultural conditioning. Sure, I knew the statistics that Los Angeles has the largest concentration of Mexicans and illegal immigrants in the country, and that while some areas are nearly 100% this way, even the other areas (aside from really rich ones) are close to a tipping point, etc. But I didn't have any mental images or vivid experiences to this effect.

Here is a list of movies set in Los Angeles -- see how many of them from more recent years even touch on Mexican lifestyles, illegal immigrants, and so on. Maybe two or three, despite the astronomical rise in these populations in the wake of the 1986 amnesty. The resulting baby boom produced a generation of teenagers during the 2000s, and normally you'd expect a baby boom to command attention from the culture-makers. Look at how fascinated (or horrified) the media were during the Valley Girl's heyday of the early 1980s, when the 15-24 age group swelled to its largest share of the population in recent history.

And since the media are only giving their audiences what they're interested in, we infer that unlike the average news reader of the early 1980s, who hungered for information about Valspeak -- whether the better to imitate it or to confirm his fears about how low the culture was sliding -- the news consumers of today find Mexican young people incredibly boring. This is not an effect of whites viewing a white vs. non-white group, since whites were and have been fascinated by the black culture of L.A. and other major cities, and this regardless of the black people's class -- upper-class families like those of The Cosby Show and The Fresh Prince of Bel-Air were just as popular as the middle-class one in Family Matters and the lower-class ones of the gangsta scene in music and movies.

Africans are simply more culturally creative and dynamic than Central Americans. We can judge the success of each group by how widely their cultural products are adopted throughout the world. Even within Latin America, no one cares about ranchera music outside of Central America, while the more African-influenced music of the Caribbean Hispanic groups -- who are numerical underdogs -- thrives throughout the region. Outside of Latin America, I don't think anyone could identify a single genre of Central American music by name (mariachi perhaps) -- all those Spanish names they'd rattle off in hopes of getting it right would actually be Caribbean, or maybe Brazilian (another country with substantial African influence). "Cross-over" Hispanic musicians also come nearly exclusively from Caribbean groups or those from other places of heavy African influence (like Shakira's area in Colombia).

African-area Hispanics are also more exciting to interact with for the same reasons. Both sexes take more of an interest in their appearance, and it's an appearance worth taking an interest in. The girls have more curvaceous bodies and asses that are as armamental as they are ornamental -- in case she needs to knock that other bitch out of her way on the dance floor. And the guys have less body fat and more muscle mass. The typical Central American you see in Los Angeles (or I presume elsewhere) has a much more doughy build and the females have no hips, which is not unusual for Amerindian groups. Turning again to the judgment of the world's peoples, how often do you see Africans vs. Amerindians as models or in advertisements generally?

Really the only area of culture that Central Americans dominate is junk food like tacos, bean & rice burritos, and other carboholic garbage. These mostly meatless meals reflect the lack of large game animals in the New World after the present-day Amerindians arrived and wiped them all out. In Africa, big game animals co-evolved with humans, becoming ever better at evading our (ever better) methods of attack, and thus avoided extinction. Even in Europe, where big animals were not hunted but raised as livestock, they still contributed to the diet (as they did for African pastoralists). So where Latin American culture is mostly a fusion of European and African influences, the food befits a human being and is full of animal flesh, while the more indigenous cuisines are hunger-causing heaps of grains, pulses, and vegetables. During the anti-fat, anti-animal-products scare of the past 30 years, this junk has become more widely tolerated and Mexican food has exploded in popularity as a result. But deep down everyone knows that, however guilty they (wrongly) feel about it, chewing over Brazilian churrasco is more satisfying than shoveling Mexican grains down your throat.

Getting back to my main point about cultural depictions of Los Angeles, most of those image-makers come like I do from the east coast where Central American influence is tiny and African influence is huge. When Jewish directors who grew up in New York go out to film in L.A., they bring with them their expectations about who the non-white ethnic groups are. This renders Central Americans invisible in their movies, and thus invisible to the average American movie-goer, some of whom will go on to direct more movies set in L.A. that ignore Central American influences. Really the only way to break out of this is to just go there and see real life for yourself. It's no longer the mecca for energetic and sanguine white teenagers that you see in Fast Times at Ridgemont High (an accurate portrayal back in 1982), but increasingly a cultural no-go zone of embittered and demoralized Mexicans.

This lack of energy and palpable sense of resentment is something you have to go there to feel for yourself. All tribes say that their group is superior to the many out-groups, and in most cases they appear to believe it or at least behave as if they believed it. It fosters a certain cockiness, as when the young females of Tehrangeles unabashedly talk about how don't worry if you like us, i mean c'mon everyone knows persian girls are hotter. i mean i'm not trying to sound stuck up, but i'm not gonna lie, that's just how it is. Going back to Brazilian restaurants, their almost flirtatious enthusiasm is hard to miss -- they're confident that their food is going to make you orgasm.

But Mexicans serving you their culture (at least in the US) have a barely concealed look of resentment on their face, like it's curling their toes to have to interact with you. Again this is not an effect of being a Non-Asian Minority since any black-owned restaurant has the typical feel of a group that believes it's better than the others. There's that sense of pride and the confidence that they're going to knock your white-boy socks off. Just look for genuine, not forced, smiles. It must put a real chip on a Mexican's shoulder to know that non-Central Americans only eat their food when they want some quick junk rather than a fulfilling and civilized meal, and to always get asked whether they like the music of Shakira or Jennifer Lopez or some other superstar from a non-Amerindian region. That's the real sting -- to be beaten out by blacks. That's why they look at you like you're a traitor -- "Hey, you're white, you're supposed to be on our side, not the Africans'."

This expanding smoggy cloud of bitterness and dejection threatens to suffocate our culture's nervous system, but because it's being spread by high-fertility Mexican immigrants rather than a handful of white dorks who listen to Marilyn Manson, we cannot bring ourselves to talk about it -- or to see it -- at all.

May 19, 2010

Graduation speech hypocrisy

At a graduation today, the first speaker blathered on and on about how human beings' evolutionary history was fairly egalitarian -- no one could hoard much wealth -- and how we were as social as honeybees. He then went on about how little material wealth seems to matter after a certain point for longevity and happiness, whereas income inequality seems to curtail them. He wasn't dispassionately noting a curious correlation but speaking as though inequality causes shorter lifespans and less happiness, as a warning to the graduates.

The speaker, decked out in academic regalia whose bright colors distinguished his rank from those of the other faculty present, then called our attention to the students who were to receive honors for hard work and achievement, who were picked out by a different set of colorful sashes or ribbons or whatever, so why don't we all give these elite students a round of applause.

And he led this specialifying ritual without any apparent regard for draining the happiness and cutting short the lives of those students who received no distinction, who could only look on in envy at those who were smarter or worked harder. Even worse, imagine the deleterious effect this ritual will have on the invisible students' family and friends, whether in attendance or who hear the non-news later on. This second-hand envy will spread the epidemic of short lifespans and little happiness even further.

We wouldn't let cigarette companies get away with this level of polluting public health. While we all perceive the evils of Big Tobacco, the threat of little academics is no less pervasive in the aggregate. For the psychological fitness of future generations, therefore, we must shut up the distinction-drawing blabbermouths at graduation ceremonies. The honor roll has already killed enough children -- don't let your kid become just another statistic.

May 17, 2010

Porn during tragic vs. trivial times

Returning to the post and comments about what art is like when created in a violent vs. a safe environment, it's worth emphasizing that this difference shows up even in low culture. Take porn. Actors in scenes from movies made before the 1992 fall in crime are sincere, psychologically abandoned, and convey a sense of urgency -- like, let's enjoy this magical act before something does us in. The sounds they make are mostly moans and sighs, with the occasional "do you like that?" and "ooooh, that feels so good." You can't help but notice how much they laugh and smile. Sex was just one part of a larger culture of merriment, and you can tell both the guy and girl are totally into it.

Shot during the height of the last violent crime wave, here are two scenes of wild and crazy young people enjoying the pleasures of the flesh (obviously NSFW): Tori Welles with Peter North and Megan Leigh with Tom Byron.

During the mid-late '90s, as all forms of wildness start to decline, we see a steady transition to the porn that is typical of the past 10 years. The performances are not sincere but so overly exaggerated that you wonder whether in a previous lifetime they were professional wrastlers. They are not mentally lost in the moment but incredibly self-conscious, often hamming it up for the camera. The guy never feels any magic or looks like he's experiencing the full range of emotions that we hear, for example, from the speaker in "Little Red Corvette." The girl looks even less psychologically invested -- she might as well be scrolling through her backlog of text messages. (Notice how often you see this in nightclubs nowadays, when the guy grinding a girl from behind gets less attention than her cell phone.)

And there's no sense of urgency -- it's like, "Well we're both being paid to be here, so I guess we might as well have some sex." It's no surprise, then, that they rarely show genuine smiles or laughter; if they laugh, it's always one of those meta-ironic laughs that just worsens the off-putting self-aware feel of the scene. But these minor visual offenses are nothing compared to the assaults on the ear -- no moans or sighs, but instead a bunch of affected grunting. And imagine hearing this in a clockwork rhythm with a deadpan inflection, aside from a Valley Girl rise at the end: "oh fuck yeahyeahyeahyeahyeahhhhh..? oh fuck my little tight pussy...? oh fuck yeahyeahyeahyeahyeahhhh...?" It makes you scramble to find the mute button. Who knew naked chicks could be such boner-killers?

Thrown together sometime in the past couple years, when wildness has been at all-time lows, here's a scene of some couple of soulless retards (again NSFW). Is it over yet?

While I don't consume much porn, from now on it's pre-'91 stuff only. Oddly enough those are the scenes I first saw as a teenager, even though they were made 5 to 10 years before. During a typical snooping-around adventure, I discovered an unlabeled VHS tape of my dad's buried in a filing cabinet -- wow, hidden treasure! I wonder what kind?! It was a video encyclopedia hosted by Hyapatia Lee with clips from the '80s used to illustrate sexual terms. It even had that scene linked to above between Tori Welles and Peter North -- never thought I'd find that one again. I couldn't begin to guess how many times I rewound that scene (or others on the tape) during high school, whereas the '90s movies that I saw when I turned 18 never drew me in enough to grow fond of them. And of course all the junk I downloaded in college in the early 2000s proved disposable.

Just think of how bad the poor teenage males of today have it -- not only are relations between the sexes a lot less wild than before, they can't even escape to the refuge of quasi-mystical, let-it-all-hang-out porn. Guys of a certain age should feel grateful for at least coming of age during the transition after wild times and not during the doldrums of the past decade and counting.

May 16, 2010

A quick proof of innate male superiority in musical ability

Men dominate the field of music, no matter what genre. In Charles Murray's book Human Accomplishment, only one female shows up on the list of eminent Western composers and ranks near the bottom. All the way down in popular music we see the same pattern. The obvious answer is that there's some cognitive skill regarding music that men are naturally more endowed with, given that females seem to do OK writing novels.

Still, many whose explanations don't rely on overt discrimination against women tell stories about how women could excel as composers or rock musicians -- they just allocate their time differently and thereby don't log all those necessary hours of practice to make it big. Child-bearing is an easy example of something that would keep them from focusing as single-mindedly on musical creation as men do.

But tonight at the dance club I noticed something very strange -- there was a female DJ pinch-hitting for one of the regulars. It never occurred to me before, but there are almost no female DJs on the radio or in nightclubs. The argument about not being able to devote most of your time to creating music due to other constraints, while not persuasive, at least gets off the ground. But forget about composing and performing -- women can't even pick music that's already been made!

Sure you need to put in the time to build up a mental library of songs in order to piece together a good playlist, but this isn't so labor-intensive that it's either that or raising kids. It all depends on how good of an ear you have for music -- what songs to include, and in what order, is not a simple matter for most people (as you'll find out by listening to the playlists on their iPod). Those who have a knack for music, though, can churn out a decent playlist without much effort. They're the ones who get gigs as DJs; they typically have a day job, which drives home how little of your time is required if you've got a good ear.

The market for DJ services is viciously competitive, so discrimination will be close to non-existent. If a nightclub owner favored a poor DJ just because he was male, he'd go out of business in a couple of weeks. There's also no old boys' club for DJs in most genres -- techno, industrial, hip-hop, etc., which rely more on DJs than on live bands, all began a decade or more after the women's liberation movement of the early 1970s. There's no cabal of dudes dressed like the Monopoly guy who control entry into the gothic DJ syndicate. Moreover, most of the audience on the dance floor is female, so it's not a matter of having to serve the tastes of the opposite sex who you don't understand as well as your own. And the DJ booth is insulated from the rest of the room, mitigating stage fright.

We conclude that men tend to have a better sense of what sounds good in isolation and especially what songs and in what sequence will create the most pleasing gestalt. Some girls get it -- they typically wind up working at used record stores -- but in general they're going to pay too little respect to the Rolling Stones relative to the Beatles, to T. Rex relative to David Bowie, and to Schubert relative to Chopin.

Women overall have a tougher time than men stepping outside of what immediately and directly speaks to them and appreciating culture that isn't exactly their thing but that anyone with good sense can still tell is good. The evolutionary cause of this is men's involvement in very wide social networks, while women have specialized in smaller, more closely knit circles. If you're schmoozing and making nice with people from other groups outside your close kin, you'd better be able to enjoy a wider range of culture, lest you offend them with your chauvinism and lose out on the fruit of all that social politicking.

The female DJ tonight did manage to slip this one into a somewhat contemporary-oriented playlist, which was a savvy move, though she was wearing a shirt and tie, so she probably has a masculine brain.

May 12, 2010

Hair and eye color in adult film actresses

All right, enough talk about art and violence. Time to focus on bigger questions like "blondes or brunettes?" Adult Film Database lets you filter people by hair and eye color, although they list lots of shades of hair color. I collapsed those groups into red, blonde, dark blonde / light brown, and dark brown or black. There's going to be more error in making fine-grained distinctions based on photographs, and these are the categories used in a study of how common hair and eye colors are in the overall population. The girls are not weighted by how popular they are (e.g. by how many movies they've been in), but I showed before that among the most sought-after ones, darker colors prevailed.

Here are the results:


For comparison, here's how frequent these colors are in Northern European populations. For eye color, green is about 2/3 as frequent as we'd expect, blue / grey is under half as common as we'd expect, while hazel / brown / black is over 5 times as frequent as we'd expect. Relative to the population, red hair is just over half as common, blonde is 1.8 times as common, dark blonde / light brown is 1/5 as common, and dark brown / black is 2 times as common.
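For readers who want to reproduce the arithmetic, here's a minimal sketch in Python of how those "times as common" figures are derived -- a representation ratio of the share among the actresses over the share in the reference population. The percentages below are placeholders, chosen only so the ratios land near the figures quoted above; the real numbers are in the linked chart.

    # Representation ratio = share of a color among the actresses
    #                        / share in the reference population.
    # These percentages are placeholders, not the actual chart values.
    population_share = {"red": 0.06, "blonde": 0.20,
                        "dark blonde / light brown": 0.50,
                        "dark brown / black": 0.24}
    actress_share = {"red": 0.03, "blonde": 0.36,
                     "dark blonde / light brown": 0.10,
                     "dark brown / black": 0.51}

    for color in population_share:
        ratio = actress_share[color] / population_share[color]
        print(f"{color}: {ratio:.1f} times as common as in the population")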

The over-representation of the more extreme colors -- lighter blonde and darker brown -- and an under-representation of in-between colors also showed up in Maxim cover girls. It could be due to what different raters code as light blonde, dark blonde, light brown, and dark brown. It could also be that light blonde and dark brown really are over-represented because if you're a blonde-loving guy, you want them blonde, and brunette-loving guys want them dark. The more intense signals in either direction are what's chosen for.

This approach based on over-representation really misses what the data are saying, though -- namely, that men favor brunettes and dark-eyed girls, although not universally, because a higher fraction of porn girls are dark-colored than light-colored. How that distribution relates to the distribution of colors in the population is irrelevant. Over-representation only tells us indirectly about the traits a group possesses.

For example, Ashkenazi Jews are over-represented in academia, but that doesn't mean those who hire academics have a taste for Jews -- if so, shouldn't they be close to 100%? Rather, it says that Jews must be more likely than other groups to have the traits that academia truly selects for, such as high intelligence. That is where we do see close to 100% representation -- everyone in academia scores pretty well above average in IQ, especially in the departments that don't exist to house quotas or to give dumb students something to major in.

Similarly, the fact that blonde hair and blue eyes are a minority in porn girls shows that most guys don't have a thing for blondes or blue eyes. If we found that blondes were over-represented, that only means there's some other group of traits that blondes are more likely to have than others -- like maybe they have cuter faces or smoother skin (just to make something up, as I think paler people would have more damaged skin), which are things that all men truly obsess over in evaluating good looks. It's these signs of health and youth that men focus on, not so much what color her hair and eyes are. In fact, Adult Film Database has categories for bald, grey, salt-and-pepper, and white hair -- yet there are virtually no women from these groups! Like IQ for academics, having color still in your hair is a necessary condition for females to look good. What particular color it is doesn't seem to matter so much, although darker is better, so long as you don't look old or sickly.

These data cast further doubt on the idea that blondes are more likely to have whatever other traits are truly selected for, as this idea requires that the women listed as blondes be natural blondes. It's strange that blue eyes are only half as common as in the overall population, while light blonde hair is twice as common, given that light color goes together across the eyes, hair, and skin. This suggests that a lot of the blonde girls are natural brunettes -- they can disguise their hair color easily but not really their eye color. (They could use color contacts, but that doesn't keep the maintainers of a database from easily learning what their natural eye color is.)

So the picture looks like, compared to the population, lighter colors are either as common (natural light blonde hair) or less (blue eyes), suggesting that darker-colored females are more likely to have whatever other traits that really matter to males. It could be something as simple as better skin, as people with dark hair and eyes also tend to have darker skin, which doesn't age as poorly as pale skin does. Also it seems like darker hair is more voluminous and lustrous, while pale blonde hair tends more to fall flat against the scalp. And based on the wider animal world, darker pigmentation reflects a less tame nature; domesticated animals tend to get lighter eyes and coat colors. So a more wild and fun-loving nature could be another trait that brunettes are more likely to have.

It is these signals of youth and health that men truly care about -- perfect skin, vibrant hair, and an exuberant disposition -- rather than the color of hair or eyes per se. To the extent that any color preference exists, it is clearly for darker colors, especially in the eyes.

May 10, 2010

How waves of violence lead to better artistic production

If the "wild times" theme I've been exploring here for awhile is on the right track, then not only should we be able to guess what cultural differences we'd see between safe vs. dangerous times, but we could predict how much more dangerous one environment was compared to another based on their differences in cultural output. To simplify perhaps too greatly, it seems like the culture treats deeper matters, in a sincere way, and with emotional appeal during dangerous times and grows obsessed with superficial things, in an ironic way, and with a distaste for emotional appeals during safe times. At least that's the picture from the most recent rise and fall in crime rates (roughly the '60s through the '80s for the rise and the '90s to the present for the fall).

I asked readers to use that picture to guess what the other two major reversals of the downward trend in violence have been since about 1500. Everyone except for Sid wimped out, but he came pretty close in guessing the Thirty Years War and the Napoleonic Wars. The first major increase occurred from roughly 1580 to 1630, and the second from about 1780 to 1830. Data from that far back don't allow a precise carving of boundaries, so there might be a margin of error of at least 5 and maybe 10 or so years, but there is a consistent pattern for many European countries where the homicide rate shoots up in the late 16th / early 17th C. as well as in the late 18th / early 19th C. See here for graphs (just the ones for England, Scandinavia, and Germany / Switzerland).

What two major artistic breakthroughs would have led you to guess those time periods? The hints about deep subject matter, sincere treatment, relative lack of caricature, timeless and universal appeal, packing an emotional punch, etc., obviously point to Shakespeare, but the phenomenon includes the Elizabethan period broadly. And aside from some boring stuff by Ben Jonson and the like, that period arguably lasts through the Metaphysical poets as well (at least for the traits mentioned before). Paradise Lost might throw some people because it belongs too, but it was more anomalous for its zeitgeist, whereas Hamlet, Doctor Faustus, and the Holy Sonnets all fit into a larger phenomenon. The second period is even easier to guess because it was so much more widespread, recent, and productive (in sheer volume) -- namely the Romantic revolution. In the previous post, I also noted that there was an apparent surge in violence during the 14th C. (it's not as clear as the other cases), which you could easily have guessed using Chaucer, Dante, Petrarch, and Boccaccio.

Isn't it remarkable -- compared to the periods above -- how little work in this style came out during the 15th and most of the 16th centuries, for the majority of the 17th and 18th centuries, and from the mid-19th to mid-20th centuries? Most college graduates with a liberal education would struggle to name anything that has stuck in their minds between Chaucer and Shakespeare. In a typical survey course, you'll read a fair amount of post-Metaphysical yet pre-Romantic literature, but aside from Milton it's pretty boring to the average reader. By and large it's too snarky of a period, and while Swift is too hilarious not to still enjoy, I remember almost falling asleep while reading Dryden, Pope, et al. for my intro English class.

It's no coincidence that during these safer times, following an unusual period of surging violence that made people focus on the big picture for once, forms like the mock-heroic dominated. That's basically every comedy movie made since the post-1991 fall in danger and wildness. Sure there's the occasional success like The Big Lebowski, but overall the form is dull, predictable, instantly dated, and just annoying to anyone outside of the tiny in-group that's satirizing some other minuscule rival tribe of theirs. Jesus Christ, give us something we can sink our teeth into! You can still take an unlikely hero, point out his shortcomings, and make the story more farcical by having the fate of the world depend on his efforts -- just ditch the endless mockery, make the character sympathetic, and have fun with the story. Suddenly, you get a movie like Stripes instead of Will Ferrell as a NASCAR racer.

After the Romantic movement died, the return of intra-elite status-jockeying didn't employ the more refined attacks of the previous safe-time era, but most of that Naturalist and Realist stuff is old wine in new bottles. What elite cabals were competing over in their status contests had changed -- no longer how ridiculous so-and-so's wig looked, or how poorly somebody struggled to master diction. Now the elites fought over who could best take paternalistic care of the benighted masses and provide a voice for the voiceless -- that other elite group is just so clueless about policy, I mean when will they ever recognize how brilliant and noble our side is?

It's true that during the recent crime wave, there was a lot of talk on this topic, continuing an unbroken trend since at least the beginning of industrialization, but it still had a more Romantic ring to it. During the '60s through the '80s, people were going to cast off the chains of the oppressed, make "Hey buddy, I got your back" into policy, and in general do what naturally felt good doing. None of that impulsive idealism comes across in Dickens. It's more of an extended political cartoon, replete with caricature, to savage his rivals the political economists. During the '90s and 2000s, we've returned to this mode of political debate as a petty contest over which side is clever vs. clueless, more than a grand debate about who is moral vs. immoral. Before, Republicans were supposed to be evil and world-conquering -- during safe times, they're supposed to be stupid and provincial.

I think this also explains why, apart from the artists whose popularity remains constant (either high or low), some go through fashion cycles. The most obvious example is Jane Austen, the lone holdout against the larger Romantic era that besieged her. Judging by how often she's mentioned in the NYT, her popularity began shooting up in the mid-1990s, just as the culture was making the shift from dangerous to safe times. One half of the rejection of the urgency-driven culture was to act like a spoiled brat -- hence alternative music and The Daily Show. But the other half wanted to laugh and smile while still rejecting emotionalism, and not primarily by snickering at a rival tribe, and Jane Austen filled that niche (not just her own works, but popular adaptations like the 1995 movie Clueless, which is a lot less heavy than teen movies from wilder times).

So in general, it seems like periods when violence swings upward produce more enduring works of art. My hunch about the mechanism is just that when you perceive life to be cheap and short, you shift your priorities to focus on short-term survival and reproduction rather than long-term security and stasis. No time to chortle at the other tribe being lampooned on TV -- we've got to protect each other against a common threat and then go do what it takes to make some babies! It is clear which approach to art later generations have found more exciting.

May 9, 2010

Even Generation X ditches the '90s, joins '80s revival

Here's an NYT essay on Generation X's midlife crisis. As a cohesive generation, they could make sense of the phrase "traitor to my generation," unlike those born from 1958 to 1964 or from 1980 to about 1986 or so. And a lot of their mannerisms and styles of dress are still stuck in a 1992 media studies seminar. Still, I've been surprised by how much they are coming to realize how great the pre-1991 culture was, given how fiercely they vilified it during their heyday.

I haven't seen Hot Tub Time Machine or Greenberg, so I'll have to take the author's word for it that they paint the '80s in a charitable light, especially when the characters see how little the Millennials (born after 1987 or so) appreciate Duran Duran, chasing girls, and so on. Last year there was Adventureland, and while the actors' performances belonged more to our own meta-ironic age, the setting and the soundtrack captured the culture pretty well. The same is true for Donnie Darko. I struggle to keep up with movies, so there may be others in this vein, but that's a good number right there for such a specific kind of movie.

Although as young adults they mocked the wild culture of the '60s through the '80s, Generation X has had enough time to reflect on it and realize that it was a lot more fulfilling than masturbating about the politics of identity, cracking sarcasm, and pretending to be too cool to fuck. They're going to cherish Heathers and the better John Hughes movies more than Reality Bites or Boyz n the Hood, post-punk and college radio over alternative and indie, and -- though few will admit it consciously -- policy under Reagan (not so much Bush) over policy under Clinton.

I have noticed an exception to that general trend, though: the youngest Gen X-ers, born between about 1975 or 1976 and 1979, are more likely to cling stubbornly to alternative or gangsta rap, to third wave feminist ideas -- such as believing that chasing pussy is a waste of time or a human rights violation -- and to staying inside rather than going out. That's just an impression, but it's based on very few of them joining in the above recollection of what was great about the pre-'91 culture. The rest of Generation X, born between 1965 and 1974 or '75, has vivid first-hand memories of coming of age during carefree times. They might not be able to recreate it on film these days, as we saw with Adventureland, but you can tell that those who actually lived through that zeitgeist would love to do it again.

Those in the tail-end of Generation X, however, hit puberty right as the culture wars were erupting. They may have memories of what childhood was like in an easygoing, pre-helicopter-parent environment, but they don't have any first-hand experiences of that world as adolescents or young adults. (For the same reason, these late Gen X-ers do have the best taste in video games, for whatever that's worth.) That must make it even easier to caricature the pre-'91 culture as merely the excesses of a bunch of hippies, disco dancers, and material girls. It's harder for the rest of the generation to believe that, because they have plenty of counter-examples piping up from within their own minds.

But for the most part, even the pioneers of meta-irony are growing aware of how lame it is to hurl sarcastic barbs all day long in a status-seeking contest among the hip, and how much more satisfying it was when people didn't crack wise so much because they were too absorbed in living their exciting lives. That shows a good deal of maturity for the generation of allegedly perpetual slackers.

May 8, 2010

If playlists are so great, why don't compilation CDs sell?

In this list of arguments against mp3s, I argued that searching for, buying, and listening to mp3s degrades the quality of the listening experience because it pushes you to focus on individual songs in isolation from one another, rather than on albums, where the songs hang together as a larger whole.

During any episode of listening to music, you might cover up to two hours, which is enough time for about 40 songs. If you are drawing those from the tens of thousands in your iPod, how can you be sure that those 40 were the best complements to each other, and how can you be sure that the order you listened to them in made this gestalt effect as strong as it could be? Obviously you can't. In fact, most people are just putting a huge number of songs on shuffle, or choosing the 40 on the fly, one after the other, with no thought.
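To get a feel for the scale of the problem, here is a minimal back-of-the-envelope sketch in Python. The song length, session length, and library size are illustrative assumptions on my part, not figures from anywhere in particular:

```python
from math import perm

# Back-of-the-envelope numbers (assumptions, not measured data):
# an average song of ~3 minutes gives roughly 40 songs in a two-hour session.
session_minutes = 120
avg_song_minutes = 3
songs_per_session = session_minutes // avg_song_minutes  # 40

# Drawing an *ordered* list of 40 tracks from a 10,000-song library:
library_size = 10_000
possible_playlists = perm(library_size, songs_per_session)

print(songs_per_session)              # 40
print(len(str(possible_playlists)))   # roughly 160 digits' worth of possible orderings
```

The astronomical count is just the point made above in numbers: a shuffle, or an on-the-fly pick, has essentially no chance of landing on one of the rare orderings that actually hang together.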

The collection of songs and the order they appear in on an album, by contrast, are the result of painstaking labor on the part of the musicians themselves and recording industry professionals. They sink or swim based on these decisions, while you don't at all, so their crafted album is going to destroy your slapdash playlist.

That's all obvious enough, but is there empirical evidence that it's true? Sure: if throwing together a bunch of good songs were better in quality than making a focused album, then compilation CDs would sell better than albums. The medium is the same, the search time is the same (in the same store), and the price is often lower for compilations! When the only real difference is the quality of the listening experience, almost without exception people prefer albums to compilations.

An iPod owner's 15-song playlist will be even worse than a compilation CD, though, because the issuers of the latter at least have some incentive to think about which songs to include and in what order. Generally the songs on a compilation won't work well with each other, because they weren't composed with the others in mind, drawn as they are from so many different musicians, teams of recording professionals, and zeitgeists. But still, the compilers have thought about it more than you have. If compilation CDs don't sell well, your playlist burned onto CD would sell even worse. The objection that pre-made compilation CDs don't sell well because they can't be customized by the individual the way a playlist can is also bogus. Recall that no one actually customizes their playlist, in the sense that people used to spend hours thinking about which songs to include on a mix tape and in what order. Generating a playlist, by contrast, is a thoughtless process, not one of fine-tuning the list to your unique tastes.

I think everyone realizes this because no iPod owner, aside from a handful of indie dorks, is arrogant enough to see themselves as a playlist magician -- like, "Hey, I'm so good at this, I should turn it into a hobby or make some money off it!"

Greatest hits CDs by a single artist do a lot better, of course, for the simple reason that the songs come from a common source and will naturally mesh together better than if they came from all over the place. I'm not making the hardcore fanboy argument that greatest hits albums are for housewives and little girls -- which they are -- but simply that the original albums sound better. Although the greatest hits come from a common body, they don't necessarily come from the same mind. Artists change over time, and songs from too-different periods can clash -- and it's even worse if the artist declines over time, so that listening to the greatest hits leaves you depressed from watching their energy fade away. Madonna's multi-platinum greatest hits album, The Immaculate Collection, is like that -- it all goes to hell starting with "Express Yourself," and it's a real downer having to halt a great stream of music in the middle of the CD.

So for mp3 slaves who insist on a lower-quality listening experience, a good rule of thumb would seem to be restricting the pool you're drawing from to a single artist. That's still not the same as a greatest hits CD -- which songs to include and in what order are still left up to someone who has little incentive to get it right -- but it's a lot better than putting the entire archive on shuffle or cobbling it together as you go.
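For what it's worth, here is a minimal sketch of that rule of thumb in Python. The toy library, its (artist, album, track, title) layout, and the single_artist_queue helper are all hypothetical, just to make the idea concrete:

```python
import random

# Hypothetical toy library: (artist, album, track_number, title) tuples.
library = [
    ("Madonna", "Like a Prayer", 1, "Like a Prayer"),
    ("Madonna", "Like a Prayer", 2, "Express Yourself"),
    ("Madonna", "True Blue", 1, "Papa Don't Preach"),
    ("The Smiths", "The Queen Is Dead", 1, "The Queen Is Dead"),
]

def single_artist_queue(library, artist, shuffle=False):
    """Restrict the pool to one artist; by default keep album/track order,
    since that sequencing is the part somebody actually thought about."""
    tracks = [t for t in library if t[0] == artist]
    if shuffle:
        random.shuffle(tracks)
    else:
        tracks.sort(key=lambda t: (t[1], t[2]))
    return tracks

for _, album, _, title in single_artist_queue(library, "Madonna"):
    print(f"{album}: {title}")
```

The default of keeping album order, rather than shuffling within the artist, is deliberate: it preserves as much of the original sequencing decisions as the format allows.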

May 7, 2010

Historical cases of wild times making longer-lasting culture

An idea I've been pursuing a lot here lately is that during times when the world seems like it's going to blow itself up, people stop dwelling on petty crap and focus more on the big questions like love and death, good and evil, coming together to fight a common threat, the exuberance and pardonable recklessness of wild and crazy kids, man's reach exceeding his grasp, and other timeless and universal themes. That's why Ghostbusters and the first Star Wars trilogy will forever outlast The Daily Show and "frat pack" movies in popularity.

I've been using the homicide rate as the proxy for when times are getting wilder or tamer, and then looking at how cultural changes track changes in how dangerous the world is. But having established this rough pattern, we could test the idea in some completely different time period. And to show that I'm not just shoe-horning the cultural zeitgeist into a "wild" category when I already know the homicide rate was rising or steadily high, let's make the prediction in the other direction.

We have pretty good homicide rate data for several European countries going back to 1200 or maybe a little later, depending on the country. In general the rate has been steadily falling since sometime around 1400, 1500, or 1600, again depending on the country. But there are three major reversals of this downward trend -- one is the recent crime wave lasting from roughly the 1960s through the '80s. Try to guess the other two. They occur at about the same time in England, in Germany and Switzerland, and in Scandinavia, sometime between about 1500 and the third reversal around 1960. Each reversal lasts about 40 to 50 years before returning to the downward trend.

Again, the prediction is that the culture produced during those reversals will not focus on backbiting among squabbling elite tribes, because its creators will be obsessed with more important matters, and it will be more sincere and emotional than mocking and detached. For these reasons, those are probably the periods whose culture is much more likely than others' to be passed down -- at least to, say, high school students in a literature class, as opposed to grad students in English lit who have to plow through everything. You might encounter an item or two from falling-crime periods, but these two periods when the crime rate shot up for over a generation will be much better represented in a standard curriculum.

I'll post the answer on Monday. You can answer with dates or with the names of a period or movement.

As an aside, it seems like the 14th C. saw an increase in the homicide rate, but the data don't go back very far before this, so it's not really an easy call to say that it was increasing rather than merely not falling. Still, it at least looks that way for England, and probably for the rest of Europe, given how hellish we know that century was across the continent. And sure enough that's when we get Chaucer, Dante, Petrarch, and Boccaccio. In a typical lit class, hardly anything is covered before 1300 or during the 1400s, at least compared to the jam-packed 14th C. And when life appeared so cheap and brief, is it any wonder that late medieval writers dwelt so much on teenage beauty? Since the boringization of the culture in 1991 we've hardly seen any adolescent heart-throbs in the spotlight, aside from Alicia Silverstone in the early-to-mid '90s. So that might be another clue to look for in predicting when the homicide rate temporarily reversed its downward trend.

May 3, 2010

Following up

- Not that it was a tough prediction to make, but sure enough the new Nightmare movie stinks. Rotten Tomatoes has 101 reviews, only 15% of which are "fresh," with an average rating of 3.9 out of 10. Metacritic has used 24 reviews to come up with their rating of 35 out of 100. Even at IMDb, where voters are a lot friendlier (because they expect that readers haven't seen the work yet), it only gets up to 6.1 out of 10, based on over 3,300 votes. In contrast, the original movie gets a 95% "fresh" rating and an average score of 7.7 at Rotten Tomatoes, 78 at Metacritic, and 7.4 at IMDb. No more terrible overblown remakes -- re-release the fucking original.

- I've been trying the paleo hair thing (AKA beach hair) for a few weeks now, and things have never been better. You only have to shampoo about 3 times a week, your hair is fuller, and it just stays in place better. The recipe is just two teaspoons of sea salt dissolved in 8 oz of water in a spray bottle. Towel dry your hair and wait until it's mostly dry but still somewhat damp, spray two or three sprays onto each major area of your head (left side, right side, etc.), run it through to the ends with your fingers, and let it dry by itself. As it's drying, you can run your fingers through it again just to help it dry and get the basic shape there. The only downside I've noticed is a few of those red proto-pimple things on the very top of my forehead. Probably nothing a little acne cream couldn't fix, but they're not bad enough for me to bother.

In general, mother nature knows better than you do. Girls smell better when they refrain from splashing on Chanel and let the natural perfume in their sweat waft through the summer air. And just as a reminder of how great paleo hair looks: