July 26, 2011

Attitudinal changes in popular songs about drugs

Unfortunately when Amy Winehouse kicked the bucket, no one learned anything about the dangers of drug addiction: they could never put themselves in her place, and so they believe it couldn't happen to them or any other normal person among their friends and family -- only to a complete and total freak like her.

The next time the zeitgeist enters a wild phase, they will be among the first to recklessly experiment with hard drugs, convinced that they're beyond the reach of addiction's hand. They will only find out the hard way that their brains are frailer than they'd thought. So, when there are few reminders that normal people can get sucked into an addiction once they get started, the average person grows complacent and sets themselves up for a rude awakening, whether it strikes them personally or someone they know.

Those reminders thankfully do not have to come from first-hand experience -- believable stories from the broader culture will do as well. So let's have a look at the typical attitude toward drug use a listener of popular music would have heard during the four phases of the zeitgeist cycle, the first and second halves of falling-crime and rising-crime periods. I'm going mostly by Wikipedia's category of drug-related songs, which looks pretty comprehensive.

During the second half of falling-crime times, roughly the early-mid-2000s through today, the mood is complacency, so songs about drugs will not find much of an audience. "Well yeah we know not to smoke crack, but gosh it's not like we would anyway, so why do we need to hear about it?" Aside from Amy Winehouse's song "Rehab", where she throws a hissy fit about being told she needs to detox, and a reference here and there in a song by Pink, drugs have mostly disappeared from pop songs. This appears to be true for the previous second half of falling-crime times, namely the later 1940s through the '50s.

In the first half of falling-crime times, boredom and nihilism set in, and drugs are permitted, even if not celebrated, as an escapist way to shock some sensation back into the corpse of your social life. From 1992 through the late '90s or early 2000s, this attitude could easily be seen in the gangsta rap trend in black culture -- such as Snoop Doggy Dogg's 1993 solo hit "Gin and Juice" -- and in the dropout heroin chic trend in white culture -- for example "Ebeneezer Goode" by The Shamen in 1992, "Cigarettes & Alcohol" by Oasis in 1994, and "Beetlebum" by Blur in 1997. The only major song that showed the destruction of getting into drugs was Green Day's 1995 "Geek Stink Breath", about meth addiction. However, as I remember it, no one knew what it was about, unless maybe they bought the album and read the lyrics.

It looks like the same attitude prevailed during the previous first half of falling-crime times, 1933 through the mid-'40s, as shown by Cab Calloway's 1933 song "Reefer Man" and Ella Fitzgerald's song "Wacky Dust" from 1938.

When violence levels start soaring, people see that the old ways have not managed to keep them safe, so they switch to a more experimental mindset in order to figure out through trial-and-error how to adapt to the dangerous new world. In the first half, there has not been enough time to sort out the proven successes from the failures among the experiments. So there is a sort of anything-goes attitude about drug use -- it's just part of the overall expansion of consciousness.

There are too many songs in this vein from the 1960s and early '70s to review, but some of the best known are "Lucy in the Sky with Diamonds" by The Beatles, "Heroin" by The Velvet Underground, and "White Rabbit" by Jefferson Airplane, all from 1967. There was a major exception -- "Kicks" by Paul Revere and the Raiders, a strongly anti-drug song from 1966. It didn't help the counter-culture that it was a rockin' tune instead of some easily dismissed stuffy propaganda song. Already by 1972, a few are starting to see how high the costs can get, as detailed in "The Needle and the Damage Done" by Neil Young.

Once the society moves into the second half of rising-crime times, it's clearly not a passing trouble or two that they have to deal with, and there's been enough time to evaluate which experimental changes have proven better or worse at adapting people to the new world. By now most people begin to push back against libertine drug use. The most permissive it gets is "it's complicated, and probably for the worse," as shown in some songs from the earlier part of this period, such as "Cocaine" by Eric Clapton and "Hotel California" by Eagles, both from 1977.

Throughout the 1980s, drug songs supply their audience with vivid pictures of how free drug use will destroy the user, and perhaps even their larger community if there are enough addicts around in one place.

Strange as it is to believe, rap songs about drugs back then were the most intolerant, especially when compared to the later apathetic and indulgent attitude of gangsta rap. When cocaine and crack are tearing apart your neighborhood, though, it's hard to wave them on by as fun activities for sound-minded adults to take part in. "White Lines" by Grandmaster Flash (1983) spares no detail about the short-term effects on the user's own brain, as well as the downstream consequences for being able to live an autonomous life. "Night of the Living Baseheads" by Public Enemy (1987) paints a grim picture of what life looks like when crackheads start taking over the 'hood. Even N.W.A. wrote a song like that -- "Dopeman" from 1987 -- before they took off in separate gangsta rap directions and glorified and glamorized the drug culture. As late as 1988, Ice-T was telling his audience not to touch drugs and to get hooked on his music instead ("I'm Your Pusher").

In white people music, the only popular song of this period that romanticized the drug world was "Dr. Feelgood" by Motley Crue in 1989. Although it is ostensibly a love song, "I Want a New Drug" by Huey Lewis and the News (1983) details all sorts of horrible side-effects of drug use. "Welcome to the Jungle" by Guns N' Roses (1987) half-glamorizes the street pushers by making their world seem exotic, but the overall message is clear that they're just going to chew you up and spit you out -- "I'm gonna watch you bleed," "Welcome to the jungle baby, you're gonna die."

Also on that album was "Mr. Brownstone," a don't-let-it-happen-to-you-too song about needing more and more to feel the same rush from heroin, as it takes over your life. This image of the drug as a parasite manipulating the behavior of its host was made even clearer in Metallica's 1986 song "Master of Puppets".

It shouldn't be surprising that heavy metal took such an interest in stemming the free flow of drugs, since their fans were like those of the rap genre -- lower on the class ladder, more impulsive, less socially integrated into their communities, and in general having more of a slacker / stoner / dropout / burnout tendency. Maybe the "Just Say No" campaign would work on kids who were pretty well behaved to begin with, and only needed a stern reminder from the First Lady. But the burnouts needed to be warned by their idols who had been to hell and back that it's not all it's cracked up to be.

The last two major songs in the cautionary tale genre were "True Faith" in 1987 by New Order (used in the movie versions of American Psycho and Bright Lights, Big City), and the most well known of all, "Under the Bridge" by Red Hot Chili Peppers from 1991. If the people running the D.A.R.E. program had any brains, they would have made that their theme song and played it in our fifth grade class to drive home what happens to heroin addicts.



Some crimes are the province mostly or only of people with fairly sociopathic personalities -- cold-blooded murder, rape, grand theft, and so on. Normal people aren't going to wander carelessly into this area of human behavior, so they don't need to be reminded that it's wrong. But there are all sorts of lesser crimes and vices that we need vivid reminders about, lest we grow complacent and think we can get away scot-free -- adultery, other forms of betrayal, drug use, etc.

So there is such a thing as too low a prevalence of drug use, and too little focus on it in pop culture. If it's everywhere, society is just about to fall apart. Yet we've never seen that in Western history, so that extreme seems unlikely. But if it's too absent, we forget the dangers, grow complacent, and then the next time a wild phase comes around, instead of being savvy, we wade heedlessly toward the whirlpool because, hey, we've gone strolling along the shoreline before -- what could be so harmful about that vortex out there?

Our drug policy, then, shouldn't have the goal of totally eliminating drug usage. That would look nice and stable for a while, until the growing hubris would put everyone at risk for a real clobbering once they eventually thought about trying drugs. The same goes for popular culture -- the "family values" movement that wants to keep children and adolescents ignorant of the uglier parts of the real world is only giving them a false sense of protection. Thinking that only Amy Winehouse types could get hooked on drugs, they'll have no self-defense training for when they must confront the decision to start drugs or not, even recreationally. And they'll have the easiest time rationalizing away their indulgence, not having heard those lame and overly confident excuses a million times before.

Remember that the young people who launched the druggie liberation movement of the 1960s had grown up during the naive '40s and '50s under the watch of helicopter parents. It was those who grew up during the latter half of the '70s and the '80s who developed a sense of caution, whether they were doing drugs or not.

July 24, 2011

The search for social shopping in the internet and big-box age

Durkheim thought that one reason why people in modern economies set up such a specialized system of division of labor was to recover a sense of solidarity that had been lost after we left a primitive way of life, where we were compelled to rely on each other more directly. By distributing tasks so broadly across people, even though it's not necessary to subsist, we commit ourselves to a more social existence and don't feel so out-of-touch with our nature.

Whatever you think about that, something like this seems to have been going on since internet shopping has driven some sectors of the brick-and-mortar world out of business. Buying stuff online is asocial, impersonal, dull to the senses, and unable to deliver instant gratification. While shoppers may be fine with losing these joys of shopping for books, music, and other entertainment in real life, they would still like to shop for something in that more fulfilling way.

They didn't find it in the big box stores, which unlike record stores and book stores have only done better over the past 10 to 15 years. Although you can get your fix for the occasional impulse buy, most of what these stores sell is not up to giving shoppers the pleasure they used to get from browsing through books or flipping through albums and listening to music. And forget about a social experience and face-to-face interactions.

You never take notice of, let alone acknowledge, the existence of your fellow shoppers at Best Buy or Target. And to cut costs and pass along savings to the consumer, they've outsourced as much as possible to the consumers themselves, including the one relationship that you never would've thought could be mechanized -- the cashier. As with online shopping, the great majority who go along with these new norms aren't bothered by them, but they would still like some part of their buying activities to feel human, where there's something more than consumers and self-scan machines.

Where have people turned? To food shops. As Webvan discovered during the dot-com bubble, nobody is going to rely on the internet for their groceries. Unlike books that can sit in an Amazon warehouse forever, or mp3 files that will always be available to download from iTunes, most food that people buy is perishable. Along the physical-matter vs. immaterial-information spectrum, food is very far in the physical direction. So it cannot be so easily scaled up and mass-distributed, like mp3 files, ensuring that there will always be many food vendors in every neighborhood, while record stores and book stores vanish into thin air.

Unlike the big box stores, most of the for-fun food shops that people visit are intimate enough in size to feel like everyone there, customers and workers alike, are at least loosely part of a single community. Plus, no matter how trivial it may seem, there is still a more personal relationship between the workers and customers in a food place since they are making you something right then and there. And not just throwing a pizza in the microwave, but something with a more human touch like setting the foamed milk just right on top of a couple shots of espresso.

Certainly for routine food purchases most people go to supermarkets or even the Wal-Mart-esque ones like Costco. I'm talking about the food places that people go to in their leisure time. That's about all you see in shopping centers these days -- one food store next to another.

Before, there wasn't as much snobbery and polarization among the food stores and their patrons. In a shopping center, there used to be the sit-down restaurant, the fast food place, and the junk food place.

Now that food stores are almost all that remain, the typical shopping center has four sit-down restaurants, several fast food stops, a deli, three junk food places, and two Fifties diners. (Speaking of which, since the hoverboard looks like a no-go for 2015, we'd at least better see a chain of Cafe 80's.) They can only compete so much on price; usually it devolves into an annoying battle over, e.g., which of the three Mexican places is the most "authentic".

As tiresome as the foodie culture can get sometimes, we should remember that going crazy about food vendors is an understandable response to the death blow that internet and big-box retailers have dealt to a lot of formerly fun places to socialize.

July 19, 2011

Trust-building, rites of passage, and over-parenting

The glue that holds together a group of developing kids who are roughly the same age is a feeling of trust that they have actively created through shared rites of passage. They separate themselves from the ordinary world and its structures, level the outside distinctions, and cease competing against each other for a small while in order to lose themselves in the revelry of their crowd, returning to the world of more clearly defined ranks and roles only after having thus transformed their sense of who they are and how they fit in with others.

Throwing yourself into the structureless realm of the betwixt-and-between is a sign to the others in your group that you trust them enough to make yourself that vulnerable. If you believed that they'd take advantage of you, you wouldn't go in the first place. And they likewise signal their faith in your goodness by taking the plunge themselves.

Over the past 15 to 20 years, the trust-bonds holding groups of young people together have become so thinned out that those groups have mostly come apart. Remember those people who used to pray that there would be less of that "wild roaming pack" behavior among children and adolescents? Be careful what you wish for: that behavior was just an indication of how cohesive and trusting young people were.

In the real world, trying to dampen their hormone levels and restrict them from engaging in "risky behavior" -- therefore including all meaningful rites of passage -- will be like squeezing a balloon that is inflating to a larger size than you'd like. Cliques -- at least the intimate, face-to-face ones that form within a peer group -- are open, dynamic systems, so if they're getting too puffed-up, they'll sense that and respond by releasing some air. Treating them like inanimate and unresponsive things leads you to try to shrink the balloon yourself. Not being omniscient, you won't know exactly how much pressure to apply over time. And not being omnipotent, even if you did know that, you wouldn't be able to pull it off without error. You will instead clumsily pop the balloon and send dozens of scraps flying all over the room.

"Well it was getting too damn big!" Yeah, well you should've left it alone to heal itself back to a smaller size on its own instead of intervening only to blow it up. If the group is entirely unnatural, like a gigantic modern bank, then we can't rely on it to self-correct. But not a band of age-mates -- the fact that most parents today treat something so perduring and spontaneously developing in our species as a suspect, unwholesome novelty goes to show how truly decrepit the modern mind is growing.

Fortunately for mankind, there are cyclical recurrences of danger that over-ride the wishes of helicopter parent types. Unlike predictable rites of passage, such as turning a certain age, or voluntary ones like going through boot camp to join the military, the rites of passage that young people undergo in times of rising violence rates are not clearly marked out for them ahead of time by the elders who have gone through it themselves, nor are they chosen freely. It's either sink or swim. Grown-ups see that and begin to back off of their over-protective instincts, letting them develop fully and naturally.

This dynamic appears especially powerful during the second half of rising-crime times, when the world starts to look more apocalyptic, and so the need for solidarity feels even stronger. During the 20th century, this corresponds to the Jazz Age (the later 1910s through the early 1930s), and the Terminator Age (the later 1970s through the early 1990s). For the period in between, and its counterpart since the early '90s, we shouldn't confuse the taking-for-granted of physical security with a bond of trust that was cemented through shared rites of passage.

In surveying the practices and institutions that bring age-mates closer together in a rite-of-passage kind of way, it looks strange to see that the most long-lasting of them were born in times when it looked like the world was going to end soon anyway, and so what would have been the point? Most successful religions, or movements within a given religion, started out that way, too, so perhaps it shouldn't be so surprising.

The easy way out in such an environment is to just abandon yourself to hedonism, so the fact that most people felt like coming together for group preservation is a testament to how much stronger our moral sense builds itself in response to a moderate level of danger in the outside world. Traditionalists may not like the fact that in this atmosphere the rituals that bond members together are newly invented, but then in such a topsy-turvy world, it's perfectly natural to feel ambivalent about the efficacy of older rituals that evidently were not enough to prevent the present disorder. And anyway, this round of trial-and-error experimentation provides the variation needed for successful new traditions to emerge.

Would you believe that an institution as apple-pie as the youth scouting movement arose during the era of bootlegging gangsters, hot jazz music, a wave of new age cults, and an epidemic of sex criminals and serial killers? That wasn't the only rite of passage that the Jazz Age has given us: add to it the joyride, parked-car make-out sessions, and the cocktail party.

Who knows which of the countless ones introduced during the late '70s through the early '90s will prove as successful 50 or 100 years later. I have a good guess, but I'll save that for a separate post.

July 18, 2011

Visual song lyric analysis of every #1 in 1987 vs. 2010

Earlier I looked at how prevalent certain themes and images were in the titles of #1 songs on the Billboard Hot 100, from 1959 through 2010 (search this blog for Billboard to see them all). For the most part, the evocativeness of hit song titles reflected the violence rate -- rising throughout the '60s and '70s, and reaching a peak sometime in the later half of the '80s, then declining or disappearing altogether from the mid-'90s onward.

Now let's take a look at the lyrics instead of just the titles. I chose 1987 and 2010 to compare peak and trough. See the footnote for detail on the methods. [1] Here are the word clouds showing the most frequent words across all #1 songs for each year (click to enlarge):

[1987 word cloud]

[2010 word cloud]

Based on the earlier studies of song titles, these aren't too shocking. The hit songs of 1987 had more concrete images -- eyes, shake, dance, run, chasing, spinning, burns, voice, body. They focused more on matters above a mundane level or referred to the passage of time -- heaven, faith, prayer, soul, forever, life, died. Natural and organic pictures abound, too -- earth, light, brighter, rain, sun. And our need to socialize comes through, especially in ways that are egalitarian and based on trust -- home, help, care, find/found, understand, friend(s), need, somebody, hold(ing) -- and how unnatural and therefore uneasy we feel hiding in a tiny social bubble -- waiting, alone, lonely.

In contrast, by 2010 the hit songs had more bland images -- looking, sipping, talking, playing, something. They focused only on here-and-now materialism -- popping bottles (i.e. breaking open expensive booze), sexy, plastic, ride (as in, showing off your car), hot, party, style, club, clothes. Nature bores them, except for the beach. And far from wanting to socialize on equal terms and enjoy themselves, today's audiences want to hear songs about self-aggrandizing attention whores who you only, like, wish you were as cool as. That doesn't come across very well in the word cloud, but that is the context for all of that bling-bling imagery. And in line with the pervasive trying-too-hard vulgarity, they have mainstreamed curse words -- fuck(ing), hell, damn, shit.

The 1980s were the opposite of the materialistic, artificial, and selfish "greed is good" picture that most people have in their heads. There was a very small minority who could've starred in Less Than Zero, or who resembled the caricature that Alex P. Keaton had been made into by the later seasons of Family Ties. Like a small dose of a pathogen introduced into the social superorganism, that fringe had the effect of triggering the moral and behavioral immune system of the larger majority -- now we see very clearly how not to live, how we should treat each other, and why.

This process had been underway since the '60s and '70s, when rising violence rates caused this greater variation in behavior, through mechanisms I've detailed elsewhere. Once the change in violence rates reversed, so did the other social changes that were influenced by them. With no detectable major threats to their security and social relations, people's moral immune systems have mostly failed to switch on.

An amoral people, such as the Chinese today or Americans in the later 1930s through the '50s, can still do OK if they have strong traditions that make up for their underdeveloped moral sense. Still, relying too much on traditions is like relying too much on antibiotics while letting one's own immune system languish -- what if the recipes or productive technologies for the antibiotics are lost or blocked or otherwise inaccessible? What if there's a new germ that the existing range of antibiotics can't deal with? Then the health of the individual and society can really start to careen toward the cemetery, and it only takes one Black Swan to wipe out the group. (This is when we really reap the benefits of trial-and-error learning, especially through social learning -- what did someone else do that did or did not work?)

Indeed, by 2010 the pimp-my-ride mindset had become so epidemic that even when we were years into a major recession, the #1 hit songs were still bragging about how great a baller lifestyle the singers lead, and what a chump you are if you aren't poppin' bottles of Moët and Cristal. Back in the '80s the average listener would've recognized these sucker MC's for what they were -- a bunch of ghetto and trailer-park trash hootin' and hollerin' like they'd just won the lottery, rather than someone who'd achieved their fame and wealth through talent and excellence.

Such a reaction would have been due not so much to strong traditions -- though they would have played some part -- as to the recent boost to people's own internal moral immune systems that they got from witnessing a rise in disorder. Like exposure to a pathogen, too much will overwhelm the body, but again it's not as though sociopaths and materialistic people were trampling over the majority. We have to learn to accept and even appreciate a slight but not dramatic rise in the violence rate, or else we'd be doomed to wander off the path even farther in our existential drift.

[1] Each year's long stretch of text included the song title, and a pared down version of the lyrics. I only put in the chorus once, and trimmed away other repetitious parts. Otherwise, if a single song keeps repeating a line, the picture for the whole year will make it look like a whole bunch of songs mentioned some keyword.

I had the frequency-counting program cut out very common words ("and", "to", etc.), and I removed others that perhaps are not super-common but still ordinary enough ("around", "make"), and that did not bring up some key theme (e.g., "always" and "tonight", which in context almost never treat the passing of time). I also took out slang addresses like "baby" and "boy".

Most importantly, I cut out "love," which does actually say something about what topics people heard songs about, but which so dominates all the other words that the word cloud is distorted by its imposing presence. And it's the least surprising result, so it's no big deal to mention it here and keep it out of the pictures. It's worth noting, though, that "love" was less dominant in 2010 than in 1987.
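For anyone curious about redoing the counts, here's a minimal sketch of the kind of tally described above, assuming each year's pared-down lyrics are saved in a plain text file. The file names and the stop-word list below are illustrative stand-ins, not the exact ones I used.

```python
from collections import Counter
import re

# Words the counter ignores: very common function words, the ordinary-but-dull
# ones mentioned above ("around", "make", "always", "tonight"), slang addresses,
# and the all-dominating "love". This list is a placeholder, not the full one.
STOPWORDS = {
    "and", "to", "the", "a", "of", "in", "you", "i", "it", "on", "my", "me",
    "around", "make", "always", "tonight", "baby", "boy", "love",
}

def word_counts(path):
    """Tally word frequencies in one year's pared-down lyrics file."""
    text = open(path, encoding="utf-8").read().lower()
    words = re.findall(r"[a-z']+", text)   # crude tokenizer: runs of letters/apostrophes
    return Counter(w for w in words if w not in STOPWORDS)

# Hypothetical file names -- one pared-down lyrics file per year.
for year in ("1987", "2010"):
    counts = word_counts(f"lyrics_{year}.txt")
    print(year, counts.most_common(20))    # these top words feed the word cloud
```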

July 16, 2011

Funny contemporary ads

During the past 15 to 20 years, the trend in advertising has been toward the annoying and snarky. As long as they're going to try to be funny by ripping into others, it should be those who deserve it -- who are so assured of their greatness, despite being hollow receptacles for whatever their priestly caste feels is "it" at the moment.

Sadly the main targets are instead whoever The Other Guys are, like those Apple ads whose smugness is so chokingly disgusting that you find yourself rooting for a Dilbert dork for once.

One hilarious exception was the Geico caveman ads, which prompted viewers to ask, "Yeah, what kind of fucked up world do we live in where everyone is a self-important and thin-skinned faggot (especially the faggots)?"

Some of the Bud Light Real American Heroes ads are funny too, although the bulk of them are just part of the meta-ironic idiocy. (Like, huh-huh, wouldn't it be wacky to make an '80s rock anthem tribute to some poor brainless schlep like Mr. Movie Theater Ticket Ripper Upper?!??!!!?!) But a handful skewer some of the trends among the follow-the-lead-of-my-genius SWPL crowd, before the Stuff White People Like site was even up, and other non-partisan trends that still show how screwed up society is becoming.

Mr. Bumper Sticker Writer (probably the funniest one)

Mr. Tiny Dog Clothes Manufacturer

Mr. Stadium Scoreboard Marriage Proposal Guy

Mr. Cell Phone Holster Wearer

Mr. Really Loud Cell Phone Talker Guy

Mr. 80 SPF Sunblock Wearer

July 14, 2011

Heterosexuality declining among young women since 1992 or '93

Perhaps one reason that young females today are particularly loud cheerleaders of the movement to normalize gay deviance is that more and more of them have either said "what the hell?" to a same-sex adventure themselves, or know another girl who has. As far as I can tell, so far this possibility hasn't been explored very much, at least compared to the clearer trend of them having more gay male friends and acquaintances.

The General Social Survey asks three questions about the sex of the respondent's sex partners, allowing us to see what their objective behavior has been like, rather than how they subjectively perceive their sexual orientation. They ask what sex your partners over the last year have been (SEXSEX), what sex they've been over the past five years (SEXSEX5), and how many female partners you've had since turning 18 (NUMWOMEN). The heterosexual responses to these questions are "exclusively male" for the first two and "0" for the third. I restricted the respondents to 18-29 year-old females, since the shift in support for queers seems to be most pronounced among young people.
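For the curious, the tabulation amounts to something like the sketch below. The variable names come from the GSS codebook, but the CSV extract, its column layout, and the string response labels are my assumptions (the real GSS uses numeric codes), so treat this as a rough outline rather than the exact procedure.

```python
import pandas as pd

# Hypothetical CSV extract of the GSS, one row per respondent.
gss = pd.read_csv("gss_extract.csv")

# Young women only: the attitude shift is sharpest among 18-29 year-olds.
young_women = gss[(gss["SEX"] == "female") & (gss["AGE"].between(18, 29))]

# Share giving the exclusively heterosexual response to each question, by survey year.
hetero = young_women.groupby("YEAR").agg(
    sexsex=("SEXSEX", lambda s: (s == "exclusively male").mean()),
    sexsex5=("SEXSEX5", lambda s: (s == "exclusively male").mean()),
    numwomen=("NUMWOMEN", lambda s: (s == 0).mean()),
)
print(hetero.round(3))
```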

Here is the percent of young women who gave the heterosexual response for each of the three questions, from the late '80s / early '90s through 2010:


The only anomalous point is the unusually low value for NUMWOMEN in 1989, which doesn't agree with the high values in 1988 and '89 for SEXSEX. Aside from that, though, all three questions point in the same direction. They were still high through 1991, and no later than 1993 they started falling steadily (there was no survey in 1992, so that could have been the pivotal year). Against that trend, there was a brief detour away from bisexuality and lesbianism from 2002 through 2006. Ironically, that was the peak of the girl-on-girl attention-whoring culture, which as I pointed out earlier was just a goofy stunt to get maximal attention from guys while giving them nothing in return.

It's not as though the average girl is no longer straight, but still the percent has fallen from nearly 100% to around 90%, depending on the question. That may not sound huge, but to make it easier to grasp on a gut level, let's treat how heterosexual a young woman is as a kind of "height" -- just as some people are taller or shorter, some are more or less heterosexual. Female sexuality is more fluid, especially during the more experimental stages of life. (This is not true for men, who are either gay, straight, or lying. And indeed none of the three questions show a decline in heterosexuality among young males.)

Taking the young women of the late '80s / early '90s to be the standard, it's as though their counterparts in 2008 were 2, 2.8, or even 3.2 inches "shorter" on average, depending on the question. Think about that -- the height of the average young woman falling 3 inches (or 1 standard deviation) in just 18 years, not even a full human generation. That is a change big enough to affect large swaths of the culture.
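If you want to see where those "inches" come from, treat heterosexuality as a threshold on a normal distribution: turn each heterosexual percentage into a z-score, take the difference, and scale by roughly 3 inches per standard deviation of height. A sketch of that arithmetic, using round-number stand-ins for the survey percentages:

```python
from scipy.stats import norm

SD_HEIGHT_INCHES = 3.0   # roughly one standard deviation of adult height

def drop_in_inches(p_before, p_after):
    """How far the heterosexuality 'threshold' moves, expressed as height."""
    z_before = norm.ppf(p_before)   # z-score of the earlier heterosexual share
    z_after = norm.ppf(p_after)     # z-score of the later heterosexual share
    return (z_before - z_after) * SD_HEIGHT_INCHES

# Illustrative values: ~98-99% heterosexual around 1990 vs. ~90% by 2008.
for p_before, p_after in [(0.98, 0.90), (0.99, 0.90)]:
    print(p_before, p_after, round(drop_in_inches(p_before, p_after), 1))
```

With those stand-in numbers the drop works out to roughly 2 to 3 inches, which is in the same ballpark as the figures above.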

For one thing, even straight girls now are about 10 times more likely than 20 years ago to know a girl who's fooled around with bisexuality or homosexuality. So, suddenly the topic of queer rights strikes a lot closer to their narrow social circles, making it a political topic that they will actually invest in.

There is a separate question of what caused this decline of female heterosexuality in the first place, remembering that there is no similar upward trend for gayness in young males. I'll write that up more in another post that will include a look at more sources of data, but briefly I think it's just one effect of the falling-crime times pattern of male and female social worlds separating more from each other.

When boys and girls don't interact much with each other, the males will not resort to homosexuality because the instinct against it is so great that it takes an environment like prison, English boarding schools, the ancient Greek army, etc., to nudge them in a gay-for-a-day direction. Since female sexuality is more fluid and facultative, though, I can see how they'd respond to their self-imposed exile from boyland by experimenting more with their fellow girls.

The previous era of falling crime, 1934 to 1958, witnessed Kinsey's report Sexual Behavior in the Human Female, which had estimates of around 10% of 20-35 year-old women being bi or homosexual. That's a rather high estimate, but it was published in 1953, a time when girls had stopped being so boy-crazy and mostly hung around other girls. Girls during periods of rising crime, like the Jazz Age or the age of rock 'n' roll, are too busy chasing after boys for dates to bother even thinking about girls, let alone acting on it. So many cute boys to flirt with at the mall, and so little time. But again, I'll look at this sometime later.

July 13, 2011

Beer Pong... uh, and like, Society

In the New York Times there are no references to beer pong as a common game until 1999, and Google Ngrams shows that the prevalence of the term in its digital book collection was flat and nearly invisible until the second half of the '90s, when it began exploding in popularity. The first time I remember it first-hand was freshman year of college, 1999-2000, and I had never seen it at any party in high school or middle school, or when my babysitter had friends over for a drink when I was little, or in any movies or TV shows where there were young people drinking.

Such a widely practiced social event that rose so high in popularity so quickly calls for an explanation. In particular, how does what looks like a glorification of drunkenness fit into the larger context of plummeting wildness among young people during the past 20 years?

First, it is highly formally structured -- it requires a surface of a certain size, cups set up in a certain pattern, and a set of rules for who goes when, what to do when the ball goes into an empty cup, etc. Highly structured activities inhibit the mind from diffusing its attention lens, getting lost, and going with the flow, wherever that may lead. It may not be like signing in at the doctor's office, but given that it is supposed to be a party, you might as well have the kids' mommies there to enforce a minimum distance between bodies that otherwise might get too close.

Second, it has become a spectator event, where everyone else present looks on at the game, rather than only the players involved paying it any mind. They stand in a circle around the table, form a broken circle along the wall if it's a small room, sit side-by-side on a nearby couch, or something else. All of the formations show each individual in the audience more or less disconnected from one another and turned toward the game, only occasionally turning to someone nearby to comment on the game -- "Oh damn, that's three in a row, hope he can handle his liquor!" -- rather than socialize about whatever is on their mind.

That's different even from when I first saw it just over 10 years ago -- it is a horrifically boring sport to watch, so back then no one else gave a shit; they just kept drinking and talking to whoever they were with. There were only two players, one on each side, instead of the now usual two-on-two, because they couldn't convince anyone else there that it was worth playing (or watching). It was also called Beirut, an exotic and somewhat evocative name that made no logical sense to the Aspie youth of today, and so it was given the more transparent name beer pong.

At any rate, the spectator nature of the game keeps most of the people at the party from being active in any way, and instead passively observing some stupid drinking game. Nor does the fact that they are all huddled around the table mean that there's a strong sense of group cohesion. Since the action is not very exciting, it's like a bunch of people spread among a number of couches all watching the same boring TV show.

Third, it prevents people from wandering off to escape public attention. Most kids aren't comfortable enough to boldly flirt with, make out with, let alone have sex with, someone else in the presence of the rest of the group, which is intensely fixated on ping-pong balls bouncing into cups of beer. They need to go somewhere more private and intimate. So then this also relates to the spectator aspect: if the norm is that everyone has to stay around and watch, it'll be poor form to pair off with someone you like and go talk, touch, or whatever, in some room away from everyone else. You'll be ostracized for trying to have a life.

Finally, the game has become so commodified and co-opted by now that it has lost whatever iota of carnivalesque potential it may have had 10 years ago. You can now buy specially designed tables for beer pong, and there is even a World Series of Beer Pong. I know that only a minority of players use one of these special tables, or would consider entering a professional tournament. But even that is too much -- there were no special-purpose toilet paper brands for wrapping someone's house, and no championship to see who could T.P. a tree the fastest. Same goes for spin the bottle. There were no bottles marketed just for that purpose because it never ossified into an empty ritual. Only at that point do people stop caring about the social interaction itself and turn to consuming products aimed at the in-crowd, in a vain hope to recover the earlier feeling of community that couldn't be bought or sold.

Formal, structured, commodified, ritualistic activities are all well and good in certain times and places, and among certain groups. But a party is supposed to level ranks somewhat, bring community members ("peers") close together, build trust between them by putting them in a slightly vulnerable position from which they come out OK, and ease up on the authority that groups usually exercise over their members' behavior, for example by not following the couple who pairs off -- back in ye olden days, even if a girl belonged to a stable clique, they would not circle the wagons or send the domineering fat one to cockblock the boy she was flirting with.

But cockblocking as a sign of how authoritarian young people's social groups have become is a story for another time...

July 12, 2011

Fashion in baby girl names predicts the murder rate, may reflect trust level

Over the past two years I've detailed lots of not-coincidental pairs of changes in the violence level on the one hand and some other social or cultural variable on the other. So far the link between them has always been easy to understand.

For example, rising-crime times cause people to believe in, or at least focus on, the supernatural, one clear signal of which is the proliferation of religious cults in those times. In falling-crime times, this interest evaporates and cults fade out. The link is simple: when people perceive a more dangerous world, the order of the universe appears to be breaking down, so they lift their eyes up to see what might be going on in places above the mundane level that could explain what the hell seems to be going wrong here on the ground. Then when they perceive a safer and safer world, they figure that order is being restored somehow or other, so we can return our attention back to more sublunary matters.

This time, though, I don't know quite what to make of link between the two. The hunch came from thinking about how different girls' names sound today compared to when I was little, and not just my female peers but also the older high school girls who babysat me or my friends, plus female characters in TV shows or movies. It seemed like back then they were more likely to have a name ending in the long "ee" vowel. Shelly, a girl my age who my mom made me play with when I was 3. Heidi, my babysitter in kindergarten (my best friend's was named Susie). Nancy, the lead character in A Nightmare on Elm Street. Kelly, the sweet babe from Saved by the Bell. Shirley, my first girlfriend in 6th grade. And that sound seemed more absent from the names of today's younger girls, aside from a handful of trendy names all rhyming with "--aley" or "--iley" such as Haley, Bailey, Caylee, etc., and Miley, Riley, Kylie, etc.

By now, I've seen so many changes fit with changes in the violence rate that I'll pursue any hunch if there's even a semi-plausible link. Here it is that names with the long "ee" vowel sound cute. Cross-linguistically, vowels that are produced high and front in the mouth, such as "ee," tend to occur in words relating to tiny size, light force, brief duration, and diminutive status in general. In English, we have a long "ee" diminutive suffix: kitty, doggie, blankie, tummy, and so on. Having looked over the cycles in how much make-up girls wear during rising or falling-crime times, I bought the idea that cuter names would be given to baby girls in those times too.

The Social Security Administration's website has data on what the most popular boys' and girls' names were in some year, including what percent of newborns had a certain name. From this I constructed an index of how prevalent it was for newborn girls to have a name ending in the long "ee" vowel, which I explain in a little more detail in a footnote for those interested. [1]

Here are the name index (in blue) and the homicide rate (in red) from 1900 to 2009 (click to enlarge):


Given how noisy the estimates of these two variables are in any given year, let alone for 110 years, that is an amazingly tight fit. Think of how much else you might have thought would influence naming fashions -- and of course a lot of people say that fashions don't reflect social changes at all. The correlation between the two variables across years is +0.504 (2-tailed p = 1.94 x 10^-8). The only time where they don't move together is roughly 1900-1915.

By mid-century, it looks like the cuteness of baby girl names actually predicts the homicide rate later on -- talk about a cheap way to forecast the near-term crime rate! Sticking just to the most recent crime wave, from 1960 to 2009, the homicide rate in a given year could be predicted from the name index 4 years earlier. That correlation is +0.577 (2-tailed p = 1.00 x 10^-5). Look how even the brief down-tick in the homicide rate in the early 1980s was foreshadowed by a drop in the name index during the mid-'70s, and that both rose afterwards for roughly 10 years before dropping very fast during the '90s, and then more shallowly during the 2000s. Both also show a sharp rise at first, and then a plateau.
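Checking the same-year and four-year-lead correlations takes only a few lines, assuming the two series have been lined up by year in one table. The file name and column names below are placeholders.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical yearly table with columns: year, name_index, homicide_rate.
df = pd.read_csv("names_vs_homicide.csv").sort_values("year")

# Same-year correlation across 1900-2009.
r, p = pearsonr(df["name_index"], df["homicide_rate"])
print("same year:", round(r, 3), p)

# Name index leading the homicide rate by 4 years, restricted to 1960-2009.
recent = df[df["year"].between(1960, 2009)]
r4, p4 = pearsonr(recent["name_index"].to_numpy()[:-4],
                  recent["homicide_rate"].to_numpy()[4:])
print("4-year lead:", round(r4, 3), p4)
```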

The only other variable I know that is a leading indicator of the crime rate is the level of trust that people have in one another. That rose through the '70s and '80s, peaking around 1989 -- 3 years before the peak in the homicide rate in 1992. So the cuteness of baby girls' names more likely reflects current trust levels, which are closely related to violence levels but precede them by roughly 5 years. This name index, then, could serve as a proxy for trust levels even for times when we don't have good survey data.

Perhaps in more trusting times, parents expect their daughters to grow up in a more chummy peer environment, so they worry less about giving them a cutesy name that would be more of a liability in a more adversarial world. Imagine a girl introducing herself with a sweet-sounding name when most people around her look on sweetness with irony, sarcasm, or disgust.

If I were her parents, I'd probably give her a more suit-of-armor kind of name too. Not necessarily a tough-sounding name -- although there has been a rise in androgynous girl names in the past 20 years (Taylor, Jordan, MacKenzie, etc.) -- but at least one that sounded better-than-you or more-special-than-you, like Sophiabella. That would keep other kids from trying to make fun of her name.

So add that to the list of ways that higher-trust times are more enjoyable -- girls wearing prettier names. And although there is a cycle here, the long "ee" names in one wave generally aren't dug up from the previous wave. They still sound fresh. Betty, Peggy, Dorothy, Beverly, and Ruby did not make it from the Jazz Age peak into the next peak during the age of rock 'n' roll. Likewise, in the next wave we probably won't meet too many girls named Amy, Becky, Katie, Stacy, or Kimberly (my favorite -- it also has a pair of warm, maternal bilabial consonants, "mb"). The next wave will bring with it a totally new form of wild popular music, distinct from jazz and rock, and another crop of girls with pleasantly novel "ee"-ending names.

[1] I went through the top 200 names, found those that ended in the long "ee" vowel, and added up the fraction of newborn girls with such names. However, there are differences over time in what fraction of girls are represented by top 200 names -- sometimes girls' names are concentrated among fewer choices, and sometimes they're more spread out among a variety of choices. So by looking at the top whatever-number, you're not necessarily looking at the same fraction of girls overall.

To correct this, I standardized by dividing by the fraction of girls represented in the top 1000 names. Why not stick with the top 200? I had already calculated the top 1000 figure, and they take a while to do. Its only purpose is to give a hint of how concentrated vs. diversified individual names are among the choices, so it'll work as well as if I did it for the top 200.
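Here's a sketch of how that index could be computed, assuming the SSA tables have been saved as one file per year with a column for the name and a column for the percent of newborn girls given it. The file layout and the crude test for an "ee" ending are my own simplifications.

```python
import pandas as pd

def long_ee(name):
    """Rough check for names ending in the long 'ee' vowel (Shelly, Susie, Heidi, Caylee).
    It will miss or miscount some spellings, so the cutoffs are only approximate."""
    n = name.lower()
    return n.endswith(("y", "ie", "i", "ee")) and not n.endswith(("ay", "oy"))

def name_index(year_table):
    """year_table: one year's girls' names with columns 'name' and 'pct' (percent of
    newborn girls). Returns the ee-ending share within the top 200 names, standardized
    by the total share of girls covered by the top 1000 names."""
    top1000 = year_table.nlargest(1000, "pct")
    top200 = year_table.nlargest(200, "pct")
    ee_share = top200.loc[top200["name"].apply(long_ee), "pct"].sum()
    return ee_share / top1000["pct"].sum()

# Hypothetical per-year files exported from the SSA site.
index = {year: name_index(pd.read_csv(f"girls_{year}.csv")) for year in range(1900, 2010)}
```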

July 10, 2011

Even easy listening music was better in turbulent times

After listening a lot to a mostly '80s internet radio station for several weeks now, I'm reminded of how good even the filler music used to be. Well, that's what we used to call it anyway. I think the technical name is adult contemporary -- lighter, play-in-the-background tunes that are soothing but still upbeat, a popular music version of Mozart.

Just as nobody listens to Beethoven all day long, or reads Shakespeare all day long, or watches David Lynch all day long, nobody who isn't pretentious would listen to the more powerful, body-moving popular music all day long either. Maybe if you just listen to music for a short time, you could concentrate only or mostly on those more body-possessing songs. Otherwise, though, you'll need some good filler songs that will heighten your appreciation of the more gripping songs, while still being catchy enough to keep your mind switched on to music-absorbing mode in the meantime.

If easy listening were an alternative to great rock or R&B or dance-pop music -- mere "dentist's office music," to dig up another phrase from memory -- then the quality of easy listening and of those other styles should show opposite trends over time. As rock, R&B, and dance-pop fall from greatness, that leaves a niche to be filled by less ambitious styles like adult contemporary.

But looking through the list of AC #1s from 1961 through 2011, the best easy listening songs were made when the best songs in other genres were made, roughly the '60s through the '80s, but mostly the later '70s and the '80s, with the '60s and earlier '70s serving more to pave the way for better creators. This makes sense in light of the view above that the lighter and the more powerful songs are complementary pieces that interlock within a richer flow of music, like male and female, and are not competitors against one another. So, easy listening songs will be best when they are meant to complement great pop music in more toe-tapping genres.

I won't bore anyone with a reminder of how bad it's gotten in the past 20 years -- Celine Dion, Backstreet Boys, John Mayer, Jason Mraz, etc etc etc. Like rock and R&B, the level of passion just isn't there -- pretty pathetic given how low-key easy listening music is supposed to be. This more sedated mood fails to turn the brain on at all, so it can't even function as a place-holder in between more energetic songs.

It looks like the peak years of adult contemporary were 1983 to 1989. Like rock, R&B, and dance-pop, there wasn't a whole lot going on in the very early 1980s. Unlike the other styles, though, it seems to have died for good a couple of years earlier. At any rate, here are 15 classics worth using to pad out your music-listening hours, taken just from the AC #1s:

"All Night Long (All Night)" by Lionel Richie

"Time After Time" by Cyndi Lauper

"Drive" by The Cars



"Careless Whisper" by Wham!

"Rhythm of the Night" by DeBarge



"Everytime You Go Away" by Paul Young

"These Dreams" by Heart



"Live to Tell" by Madonna

"In Too Deep" by Genesis

"I Wanna Dance with Somebody (Who Loves Me)" by Whitney Houston

"Got My Mind Set on You" by George Harrison

"Shattered Dreams" by Johnny Hates Jazz

"1-2-3" by Miami Sound Machine



"Waiting for a Star to Fall" by Boy Meets Girl



"Eternal Flame" by The Bangles

July 9, 2011

The effect of AIDS realism on containment of gay deviance

By now most straight people have either heard, read, or just noticed with their lying eyes that AIDS never was a problem for heterosexuals, and continues to be a non-issue today.

That is a totally different view from the mid-'80s through the early-mid-'90s, when it was new and frightening. Could you catch it from a public toilet seat? What about if some kid with AIDS went to your school and you shook his hand? And what if...? I still recall my schoolmates in fourth grade singing "Let's Talk About Sex" by Salt N Pepa, which tried to shock the straight listeners into taking safe sex seriously -- you don't want to be claimed by the AIDS epidemic, do you?

Over this same time, straight people have come to view AIDS as not a real threat to gays, and have therefore abandoned any duty to contain gay deviance ("oh puh-leeeease, what's the worst that could happen?"). This is just one of the many ways in which people have lost their former awareness of how messed up gay life is.

That seems strange -- shouldn't they have come to see AIDS as a uniquely gay threat? I think what's going on is that straight people are using themselves as a reference point or baseline to which they compare the danger to gays, and react accordingly to contain gay sexual behavior, give it the go-ahead, or shrug in puzzlement.

Here is a table of how straights react based on their beliefs along two key dimensions:

                            AIDS a danger to straights          AIDS not a danger to straights
Gays just like us           contain gay behavior (same alarm    give gays the green light
                            as for straight behavior)
Gays more deviant           contain it even more urgently       squabble over whether the extra
                                                                danger warrants containment

If straights perceive AIDS as a danger to themselves, it doesn't matter whether they think gays are just like us or are more deviant -- AIDS is at least as much of a danger to them, perhaps more. In either case, at least the same level of alarm will be sounded about gay behavior as about straight behavior (and recall that was quite an alarm in the '80s). Obviously the impulse to contain gays will be even greater if they are seen as even more deviant than straights, but that's more a difference of degree.

However, if straights perceive AIDS as not a problem for themselves, and also believe that gays are just like us, then they give the green light to gays to act however they want. If it's not wrecking us, why would it wreck them?

Even if they believe gays are more deviant, they don't have a good intuitive feel for just how much more deviant they are -- a little, somewhat, a lot, a whole big lot? All they sense is that AIDS is a bigger threat to gays than to us. But is that threat enough to warrant a policy of containment? Some will say it is and others that it isn't, squabbling over how dangerous is sufficiently dangerous.

Over the past 15-20 years, our society has moved farther down and to the right in that table. Still, it seems like the main cause of our turning a blind eye to gay self-destruction, and waving it on eagerly from the sidelines, was our more sanguine view of the threat that AIDS posed to us. When we ourselves were freaked out about it, that provided a powerful mental "anchor" that automatically made us see gay behavior as worrisome (if they're just like us) or downright frightening (if they're more deviant).

Unfortunately this suggests that the surest way to get back on track of keeping gays from destroying themselves and polluting the broader society is to send a good scare into straights about AIDS or some other STD, and that arguing over whether gays are just like us or are deviants might not prove very effective. Note that the panic among straights would not have to result from lying about the prevalence of AIDS among straights -- you can imagine two fairly similar groups reading over the same prevalence estimates, and one freaking out while the other shrugged its shoulders. It's the perception of the objective data that matters.

Since straight people have turned off the alarm as they figured out what the prevalence truly was among us, I doubt that we could be scared back to where we were 20-25 years ago. Sadly, I think it will take something like the successor to AIDS to break out among gays, just as the original was new and terrifying even to straights in the 1980s and early '90s. And who knows, we may have more than an unfounded fear the next time around -- no two pathogens are alike, so maybe AIDS v.2.0 really can be caught from public toilet seats or bathroom door handles.

July 8, 2011

Fewer rites of passage, dead field trip edition


July 7, 2011

Footloose trailer and music video (the re-make)

There's now both a trailer out for the re-make of Footloose and a music video for the song "Fake ID."

God, are they really thinking of re-doing not just the movie but the soundtrack too? Watching the trailer and the video, I heard a cover of the original "Footloose" song, but the rest was entirely new songs -- rap, emo, and fucking country. Probably the least do-you-wanna-dance? genres around. I understand that they want to make it more contemporary, so put in some mid-2000s pop music -- that was OK to dance to: Franz Ferdinand's "Take Me Out," Madonna's "Hung Up," OutKast's "Hey Ya!"...

It looks like I guessed correctly before about how there would be almost no sustained, close-in dancing or slow-dancing. It's either solo or standing lapdance -- hey-look-at-me stuff all around. Do the makers of this movie even know what "cut footloose" means? It means losing your self-consciousness and melting into a group-vibe, not individual contestants posing and whoring for attention.

Yeah, we know that this is how the dorky and anti-social young people today dance, but this movie would have been a good opportunity to show the audience what they're missing, by depicting only or mostly face-to-face dancing between people within each other's personal space, just letting go.

I'll probably end up seeing it for sociological research purposes, but for those who can already smell a bad movie approaching, here's a better and condensed version of the action and music (yes, that is totally the chick from Halloween):

July 6, 2011

Why, despite higher average IQs, aren't Jews and Asians world-class social scientists?

Compared to the European average IQ of 100, Ashkenazi Jews are about 1 standard deviation above, at 115, and Northeast Asians are at about 110. If we imagine IQ as a type of "height," and let's say that the average European male was 5'10, it would be as if Jews were on average 6'1 and Asians 6'0. This IQ advantage makes these two groups over-represented in the hard sciences, where braininess makes a big difference in who succeeds vs. who doesn't.
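The conversion is just a rescaling of standard deviations: IQ has a standard deviation of 15 points and adult male height one of roughly 3 inches, so a 15-point edge maps onto about 3 inches of "height." A toy version of that arithmetic, using the round numbers above:

```python
IQ_SD = 15.0            # IQ points per standard deviation
HEIGHT_SD_IN = 3.0      # rough SD of adult male height, in inches
BASE_HEIGHT_IN = 70.0   # 5'10" baseline for the European average of IQ 100

def iq_as_height(iq):
    """Map an IQ score onto the 'height' scale used in the analogy."""
    return BASE_HEIGHT_IN + (iq - 100.0) / IQ_SD * HEIGHT_SD_IN

for group, iq in [("Europeans", 100), ("Northeast Asians", 110), ("Ashkenazi Jews", 115)]:
    inches = iq_as_height(iq)
    print(group, f"{int(inches // 12)}'{round(inches % 12)}")
```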

But what about the social sciences? They're not as "hard," but they still require an ability to look at a bunch of stuff and draw patterns out of it, to lock several of these basic patterns into a single more abstract pattern, and so on. So, groups with higher IQ should be over-represented in these fields as well, all else being equal.

And yet there's virtually no major thinker in the social sciences who was insightful, original, broadly visionary and who was also of Northeast Asian background.

Jews are certainly over-represented among major thinkers in the social sciences (see some of the lists of influential Jews in various fields at JINFO). But just about all of the big deals were either nutcases or frauds -- Marx and Freud being the most egregious examples. Slightly lower in overall influence but still joining those two are Boas in anthropology, Derrida in philosophy, and the leaders of neoliberal economic physics-envy.

The only major one who strikes me as insightful, original, and whose vision touched an incredibly broad array of topics was Durkheim in sociology and anthropology. I'm not counting those who did good original work in a limited field, but whose results weren't like a revelation, for example Asch and Milgram in social psychology. Chomsky hits all the marks too, but I don't see that work as part of social science, as he studies properties of an individual's brain or mind without attention to a larger social context. And his linguistics work is more like that of a molecular biologist than a zoologist. I can see how others would include him, though, so that makes one "for-sure" and one "possible" exception to the trend for major Jewish social science thinkers to have come straight from the loony bin.

A major confusion when talking about the disproportionate number of Jews in crazy circles is to think that they're just over-represented everywhere requiring high IQ -- including the sane and brilliant circles. Again I just don't see that when going through the lists at JINFO or thinking off the top of my head. I know there are exceptions at lower levels of accomplishment, but I'm talking about real breakthrough figures.

Why do these brainy groups not dominate the social sciences like they do the harder sciences (intellectually, not numerically)? Clearly high IQ is not enough -- you also need a good intuition for how human beings work inside and among each other. I trace the underwhelming performance of these two groups to the ecological niches that they are adapted to, which have relaxed the selection pressures for keen social intelligence and thereby redirected selection to improve other traits.

Asians just seem not to get people at all -- it's not that they hold very clear but crazy views of people. For instance, they are poorer than Caucasians at reading faces for basic emotional expressions. They tend to blur "fear" and "disgust," as a result of focusing too much on the eyes and not taking in information from more of the face. This seems like a dialed-down social intelligence, one that sees more noise around the signal, not one that mixes up, say, "fear" and "joy." Since they don't perceive as much about people, they just don't get very interested in the social sciences to begin with -- or in acting, stand-up comedy, rock music, and so on.

What relaxed selection for social intelligence in Asians, and redirected the body's resources into other traits, is the intensive agriculture that they have practiced for nearly 10,000 years. Really intensive agriculture brings hierarchical social structures along with it, to coordinate a highly complex system. For example, you might need to irrigate the land. Small-scale gardeners (horticulturalists), such as those in Sub-Saharan Africa, can just send a woman out with a spade to dig up the ground, plant stuff, and have it grow without needing to divert a river. To thrive in such a top-down society, you shouldn't try to figure out how people work -- you just need to do what you're told so that the whole big machine runs smoothly.

And in your hour-by-hour work, you're engaged mostly with a bunch of stuff that does not have a mind of its own -- your tools, the ground, seeds, plants, and so on. Smaller-scale gardeners, doing less intense work, have plenty of time to socialize. Hunter-gatherers ditto, plus they have to get inside the mind of the animals they're hunting. Pastoralists (animal herders) also have to be able to sympathize with the needs and feelings of their livestock in order to be the best shepherd. As for interactions with people, herders live in far less authoritarian groups, so there's a good deal of buddy-buddy among the in-group members. And even when they encounter someone from the out-group, the interaction takes the form of a showdown in a lawless borderland, where it pays to be able to get inside his head.

All of that is a long way of saying that groups adapted to intensive agriculture will have an impaired social intelligence, such as among Central Americans and East Asians. It's not that their social sense is haywire or biased -- just out-of-focus. Hunter-gatherers would probably make good social scientists if they were smarter, but they tend to have the lowest average IQs, and small-scale gardeners are only somewhat smarter. The key group to have in your population in order to make good social science is pastoralists -- as a rule-of-thumb, you find them wherever at least a good-sized minority can drink milk, from the Northwestern Indian subcontinent through the Middle East and Levant, around the Mediterranean, and into Europe. Although animal herding is a more recent development in East Africa, I'll bet they'd make good social scientists too, compared to other Sub-Saharan Africans.

That's Asians. But what's the deal with Jewish social science? They were restricted to a white-collar occupational niche for close to 1000 years in Europe, serving as tax farmers and money lenders and filling other financial and commercial roles. That removes them even further from normal chummy social interaction than intensive agriculture does, but unlike Asians they have produced very clearly articulated, very influential, and very nutty pictures of human nature. Where does this vision that is not fuzzy but clear and almost hallucinatory come from?

Well, if you were a tax farmer or money lender in Medieval Europe, you did not enjoy all the protection of the modern state and market economy. If you made a bad decision, you could have gotten wiped out -- no welfare state to bail you out (in fact, the locals would have cheered a banker going bankrupt). Some of these bad decisions could be asocial -- like forgetting to carry the 1. But a good deal would have been social, for example if you gave the other party the benefit of the doubt and they ended up screwing you. Unlike Asians, then, Jews had fairly frequent interactions with other people, but they were mostly adversarial in nature.

Since even a small miscalculation in that case could prove disastrous, it is more advantageous to set a low threshold for the "nah, that person stinks" detector. In those occupations in those times and places, better to assume that everyone else was stupid, selfish, and treacherous. That way, you'll make sure to do the calculations yourself and to not get taken advantage of by deadbeats.

Imagine if a group of mostly endogamous people were only allowed to work as guards and wardens in the prison system -- would it pay in Darwinian terms for them to have a rosier or darker view of human nature? I would not expect accurate and insightful social science to come from such a group. When your social interactions with your fellow man are mostly adversarial and in the context of a power imbalance that favors you, you aren't going to see human sociality very clearly -- indeed, you will behold a distorted reality, seeing phantom selfishness where there is none, and having a blind spot to the common sense that makes your charges smarter than you think.

Only when you frequently relate to other people on equal terms, experiencing both their good and bad tendencies, will you have a clear basic picture to work with when kicking ideas for social science around your head.

July 5, 2011

The waning interest in extreme types: height and looks

The post below on the disappearance of midgets from pop culture walked through the basic links from rising violence rates to a greater interest in the unusual. The other side of the no-more-midgets pattern is of course the disappearance of giants. Chewbacca, Jaws (the James Bond character), the Predator, Andre the Giant's character in The Princess Bride, Sloth from The Goonies, Bigfoot (where did he run off to anyway?), the giant in Twin Peaks ... we haven't seen much of a fascination with freakishly large people during the past 20 years -- CGI, etc., do not count because only a real-life human that big is a cause for wonder.

What about attractiveness? Here too you don't see the same attention to the farther-away-from-average folks. At the far ugly end, there were Screech from Saved by the Bell and Martha Dumptruck from Heathers. And at least until she gets made over, Sissy Spacek in Carrie is one of the few characters whose mere sight turned my stomach during a movie. She's not fat and doesn't have a horribly asymmetrical face, but there's something about her skin that just gave me the willies. The one who takes the cake, though, is that ranting crone from The Princess Bride. ("Rubbish, filth, slime, muck -- BOO! BOO! BOOOO!")

These are characters whose physical repulsiveness is necessary for the role, not who just happened to be played by ugly people. Aside from that girl in Precious, no one seems interested in studying what life's like at that extreme. It's not a real exception since it was never very popular, but the indie Welcome to the Dollhouse from 1995 also had an ugly girl in the lead role of awkward teenage misfit Dawn Wiener, unlike the cute chick who played Juno.

At the opposite end, the drop-dead gorgeous babes have all gone extinct as culturally visible types. Here is a somewhat recent Joel Stein article on the fall of the supermodel from the early-mid '90s through today. Women chosen only for their beauty no longer appear in ad campaigns, on the covers of magazines, and all the other places that people used to assume models would dominate forever. They've been replaced by actresses, TV show hosts, singers, etc. Even their most attractive members won't be the best-looking because they must also be able to act or sing decently, whereas models (used loosely) are free of that constraint, although they do have to be able to express a range of emotions.

Just as some characters' ugliness is integral to their role, other parts require a mega-babe. In both Vacation and Christmas Vacation, how are they supposed to tempt a gung-ho family man away from his wife, who is already attractive, unless they introduce a woman so stunning that we understand what made him give in? He's so committed to his wife that nothing less than a rock video vixen type could make him seriously consider stepping out. A woman like Cameron Diaz, Katy Perry, or Megan Fox does not have looks overpowering enough to fry the rational circuits of Clark Griswold's brain when she asks, "Well, are you gonna go for it?" ("This is crazy, this is crazy....")

In the past 15 to 20 years, we have gone back to the pattern of the previous era of falling crime, the mid-'30s through the late '50s, when the popular sex symbols had already made a name for themselves in acting, singing, or something else. Bettie Page was an exception; most of those pin-ups were of established actresses, including the most iconic pin-up of all, that of Betty Grable.

Similarly, the last peak period of model visibility, the '60s through the '80s, was like the earlier era of rising crime, circa 1900 to 1933. It's true that the sex symbols of the Jazz Age like Clara Bow and Louise Brooks were actresses, but they didn't really act, being silent film stars. I'm sure that required more beauty-irrelevant skills than a model's job, but they were still seen and not heard. They were not known primarily for their competent or excellent acting skills -- meaning those that involve verbal communication -- and only incidentally for their good looks. If a girl was a knockout and could express a range of emotions, that's what counted for her to become a sex symbol or style icon, not unlike the requirements for a supermodel.

And it's not as though the Jazz Age wanted for a supply of real actresses and singers from whom to draw sex symbols. Mass-market popular music had taken off, and so had radio dramas and comedies, not to mention live acting. It's just that, when the goal is to produce a sex symbol, why require them to be skilled at acting, singing, etc.?

Hopefully by the next time the crime rate starts climbing again, the whole country won't be bloated with obesity, and we'll get to enjoy another Jean Shrimpton, Cheryl Tiegs, or Cindy Crawford.

July 4, 2011

Swimsuits that gave girls legs for days

Sometime later I'll write up a proper analysis of how underwear and swimsuit shapes have responded to changes in the violence rate, as part of the larger project on how appearances respond to societal changes. For now I'll just note that progress in how attractively girls make themselves up can easily be lost -- and thankfully recovered -- because of the cyclical nature of fashion. However, that cycle is not driven by some "wheel of fashion" that drags along even unwilling individuals and changes only in order to change. Rather, fashion changes in response to broader changes in society and shifts in people's mindsets, worldviews, behavioral strategies, and so on, the primary driver of which is the trend up or down in the level of violence.

One unfortunate casualty of the past 20 years of falling crime has been super-high-rise underwear, which, as the model from Christmas Vacation is pointing out, shows the full line of the leg from the thigh through the hips and all the way up to the waist. An artificially straight-across line breaks up the natural flow of the curves in the hip area, especially the diagonal lines of the pelvic bones, which get interrupted by horizontal waistlines but are there in all their natural beauty when the waistline is cut very high.

Of course this change is much more visible in swimsuits, which follow underwear in fashion but are more public. Whether you were at the beach or just browsing the Sports Illustrated swimsuit issue, you couldn't help but notice how dead the leggy look has been. In fact, the only famous image of a recent celebrity posing in a hip-baring swimsuit -- and a neon green one, no less -- is of Borat. (Here he is with several girls, all of whose swimsuits are cut straight across.)

It seems only fitting on Independence Day to honor the multicultural and immigrant origins of those who have improved our native culture as sex symbols. So here to revive the "she's got legs" look are a Czech, an American with Greek and Turkish ancestry, and an Australian. God Bless America. (Click to enlarge -- oh just go ahead!)

July 2, 2011

And now for something more refreshing

I can't leave a post about attention whores at the top of the page for long -- it'll just feed them. So how about a musical interlude from a low point in the attention whoring cycle.



July 1, 2011

How does the girls-gone-wild culture reflect the prudification of society?

From Time Magazine:

Says a Minneapolis priest: "The young American male is increasingly bewildered and confused by the aggressive, coarse, dominant attitudes and behavior of his women. I believe it is one of the most serious social traits of our time -- and one that is certain to have most serious social consequences."

The shrieking blonde ripped the big tackle's shirt from his shoulder and Charlestoned off through the crowded room, fan-dancing with a ragged sleeve. In her wake, shirts fell in shreds on the floor, until half the male guests roared around bare to the waist. Shouts and laughs rose above the full-volume records from Gentlemen Prefer Blondes. The party, celebrating the departure of a University of Texas coed who had flunked out, had begun in midafternoon some three hours earlier. In one corner, four tipsily serious coeds tried to revive a passed-out couple with more salty dog (a mixture of gin, grapefruit juice and salt). About 10 p.m., a brunette bounded on to the coffee table, in a limited striptease. At 2 a.m., when the party broke up, one carload of youngsters decided to take off on a two-day drive into Mexico (they got there all right, and sent back picture postcards to the folks).

That's from the original 1951 article on the Silent Generation, although it sounds remarkably familiar. Unlike the femme fatale of film noir, or the "Whatever Lola Wants" type, young girls from the '60s through the '80s were definitely not "aggressive, coarse, dominant" -- that was the era of the high school sweetheart type, from Marcia Brady to Mallory Keaton to Kelly Kapowski.

Also familiar are the semi-public striptease and the general atmosphere of girls gone wild. Sex is a private affair, so when young people are truly more sexual, they play Seven Minutes in Heaven, where the maker-outers enjoy the privacy of a dark closet, or they pair off into unoccupied rooms in the house, the back seat of someone's car, or some other place where they can slip out of public view. Spin the Bottle involves public sexuality, but hardly -- just a kiss.

Aside from the streaking fad -- and I think a good deal of the streakers were males anyway -- there was not a whole lot of attention whoring based on sex appeal during more sexual times. Here we have to distinguish sluts, teases, and attention whores. Sluts make themselves up to signal an easy sexuality, and they follow through on that promise. Teases also sexualize their appearance, and although they won't just give everything away up front like a slut will, they will still give something back to the guys who approach them, even if only a little. They are also flirty and social, and they have a boy-crazy sort of mind.

Attention whores, though, use their appearance and behavior to draw in the looks of a large male audience, but they don't give anything back to the clueless dorks who follow them around like it's the first girl they've ever seen. They don't get a rush from the playful back-and-forth like teases do, as they are not very interested in interacting with boys in the first place. They're only out to get lots of easy attention and validation without having to put out for the boys, flirt with them, or even be physically near them -- just within eyesight. The majority of girls dancing around on YouTube or in night clubs (these days anyway) are attention whores, for example.

This also explains the whole girl-on-girl fad at parties during the housing bubble euphoria. Most of us, me included, looked at that and thought, "How slutty is our society becoming?" But now that we know that young people have become steadily less sexual since a peak sometime in the late '80s or early '90s, it makes better sense -- what easier way to get maximal male attention while not having to interact with the boys at all? Just kiss your bff, or maybe have her grind her ass in your lap, in front of a large crowd.

Back when kids were still sexual, a girl at a party wanted to talk to, make out with, or sleep with a boy -- and what would making out with another girl in public do to achieve that goal? So instead, they sought out boys to flirt with or sent them a signal to come over and make the first move.

The tendency for boys and girls to move apart from each other during times of falling crime therefore shifts the distribution of female sexuality away from the slut-and-tease direction and toward the attention whore. The tease is the ideal for society -- young girls are going to crave attention no matter what, so it's best that they be somewhat choosy and cautious while still being flirtatious and social. When the distribution has more teases, its extreme will include more sluts, but a handful of sluts is far preferable to a mass of attention whores. Sluts more or less keep their business to themselves, whereas attention whores give off public pollution by the ton. And that handful of sluts isn't going to destroy society, whereas the off-putting behavior of attention whores just grates away further at the falling trust levels during safer times.

And while a girl who sleeps around is somewhat of a sexual deviant, she still seems mostly recognizable as a human being, just one with poor impulse control or something. The complete anti-sociality of the attention whore, though, not to mention the public displays of omg look at those two chicks in thongs making out!!!! that she uses to get her attention, makes her an even more bizarre specimen who is harder to sympathize with.

When falling violence levels push the sexes apart, probably because females feel less of a need for males then, the civilizing effect that we have on each other degrades. Look at men who have withdrawn from interacting with girls -- their whole lives just break down, as they hide away in their man-cave all day. The counterpart to the extreme male tendency of "just leave me alone with my stuff" is the extreme female tendency to pose and prance around just to soak up attention without giving anything back, like some parasite.

If you're not already, it's time to get comfortable with a higher level of violence than we currently have, unless you prefer a world peopled with coarse and pushy attention whores.