TV Sex Study Sexed Up
November 18, 2005
Kaiser Family Foundation findings ignore their earlier study to make a more dramatic point

It seems like sex is everywhere on TV these days, and a new study just made headlines by proving it scientifically. In a study released Nov 10 at a Washington DC press conference, the Kaiser Family Foundation (KFF) reported that the amount of sexual material on television has nearly doubled since 1998. Predictably, this led to a slew of newspaper headlines from “TV’s Steamy Stuff” to “What's on Tonight? Sex, Sex, Sex, Sex, Sex, Sex.” But on closer inspection it turns out that the study itself was sexed up.

It’s not that the numbers are false. But the study's newsworthiness depends not only on whether a doubling of sexual material occurred, but on when it occurred. If the latest headlines stirred a sense of deja vu, it’s because KFF reported essentially the same findings in 2002. “Erotica Runs Rampant” was the Christian Science Monitor headline reporting Kaiser’s 2002 study. Yet instead of comparing the 2005 findings to these results, KFF again based its comparisons on the 1998 totals, even though most of the increase in sexual material had already occurred before 2002.

The result was to double dip the media’s attention to the old findings. The real story is far less dramatic -- TV sex is up somewhat since 2002, but the rate of increase is declining. For example, KFF’s press release is headlined “Number of Sexual Scenes on TV Nearly Double [sic] Since 1998.” It states that in a composite week of TV, “scenes with sexual content went from 1930 to 3780 scenes... totaling a 96% increase since 1998.” But it fails to mention that the heavily publicized 2002 study found 2992 sexual scenes, a 55% increase over 1998. The increase since then was less than half as great -- just 26%.
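The competing framings come down to which baseline year you divide by. A minimal sketch of the arithmetic, using the composite-week scene counts reported above (the year-keyed dictionary and helper function are illustrative, not from the study):

```python
# Scene counts from the article: composite-week totals of sexual scenes.
scenes = {1998: 1930, 2002: 2992, 2005: 3780}

def pct_increase(old, new):
    """Percentage increase from old to new, rounded to the nearest percent."""
    return round(100 * (new - old) / old)

since_1998 = pct_increase(scenes[1998], scenes[2005])  # KFF's headline figure
early_rise = pct_increase(scenes[1998], scenes[2002])  # already reported in 2002
since_2002 = pct_increase(scenes[2002], scenes[2005])  # the quieter recent figure

print(since_1998, early_rise, since_2002)  # 96 55 26
```

Same data, two stories: measured from 1998 the increase is 96%, but most of that (55 points' worth) had already been publicized in 2002, leaving only a 26% rise in the most recent period.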

Of course, the trend was upward both times, and it may seem like quibbling to fault KFF for reporting an increase of 96% instead of 26%. But it’s troubling that the same statistical inflation did not take place when the findings went against the story line of a sexed-up TV environment. Below the lead headline on the press release is a second headline, which reads, “Rate of ‘Safer Sex’ Messages, Up from ‘98, Has Now Leveled Off.” The release notes that 15% of shows with sexual content include reference to sexual risks or responsibilities – up from 9% in 1998, but approximately the same rate as in 2002 (15%). Similarly, in shows dealing with sexual intercourse, the rate is described as “nearly double” that of 1998, but “approximately the same as in 2002.”

Thus, in discussing messages of restraint, the release (and the report) emphasizes the very distinction it ignores for findings of increased sexual material. The problem with treating these comparisons differently is that they combine to strengthen the message that TV is heating up. Yet you could reverse the procedure to produce the opposite message: “Rate of ‘Safer Sex’ Messages Nearly Double Since 1998.” “Increase in Sexual Scenes on TV Up from ‘98, Has Now Leveled Off.” Both versions are true, and both are misleading.

The version that KFF did choose isn’t only in the press release; it leads the Website description of the study as well as the executive summary. And one other ambiguity of the study is also hidden in plain view. The press release states that not three but “four studies [have been] conducted since 1998.” But the entire analysis proceeds by comparing only three points in time.

The report states that another study was done in 2000, but that it was left out of the analysis to reduce the number of statistical tests, which could lead to spurious findings of statistical significance. Yet that risk would seem a small price to pay to learn whether the picture of a consistent increase in sexual material over seven years is reinforced by the 2000 findings. Wouldn’t it be better to exercise caution in interpreting the findings than to throw out one quarter of the data? Of course, the researchers know the answer to this question; why not share it with the public?

All this is not to say that there was any intention to mislead. This is a carefully done study by respected researchers; the problems are not in the research itself so much as in its public presentation. It is all too easy for anyone to interpret data in ways that emphasize the results they expect or seek to find, just as it is easy for reporters on deadline to accept the conclusions of a scientific study without taking time to analyze them carefully.