

If a tree falls in the woods, and no one is around to report it...
May 24, 2012 3:59 PM   Subscribe

How do studies measure unreported phenomena? For example, this says 95% of campus rapes are unreported. This says that nearly half of all HIV-positive men don't know they are infected. So.. how do the researchers know?
posted by desjardins to Grab Bag (10 answers total) 8 users marked this as a favorite
 
I don't know how these studies were designed, but it's not hard to imagine how they could have been.

For example, in the rape study, presumably unreported means "unreported to the police." So the researchers could survey a bunch of people, ask them if they had been raped, and then ask if they reported the rape to the police.

Similarly, in the HIV study, the researchers could ask a bunch of men about their HIV status prior to testing them for HIV. If half of the HIV-positive men had said they did not know their status or believed that they were not infected, then there's your result.
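The survey logic described here is just counting within subgroups. A toy sketch in Python (all responses invented for illustration; the field names are made up, not from any actual study instrument):

```python
# Each respondent answers two questions: were they assaulted, and if so,
# did they report it to the police? (All records here are invented.)
responses = [
    {"assaulted": True,  "reported": False},
    {"assaulted": True,  "reported": True},
    {"assaulted": False, "reported": None},
    {"assaulted": True,  "reported": False},
    {"assaulted": False, "reported": None},
]

# Restrict to respondents who said they were assaulted, then count
# how many of those never reported it to the police.
victims = [r for r in responses if r["assaulted"]]
unreported = [r for r in victims if not r["reported"]]
pct_unreported = 100 * len(unreported) / len(victims)
print(f"{pct_unreported:.0f}% of assaults in this sample went unreported")
```

A real study would add weighting, a careful sampling frame, and confidence intervals, but the core calculation is this simple.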
posted by jedicus at 4:05 PM on May 24, 2012 [5 favorites]


The first statistic, I'd assume, comes from anonymous surveys asking whether a woman has been raped and whether she reported it to the police (ie, "unreported" means it was never reported to the authorities).
posted by changeling at 4:07 PM on May 24, 2012


There are a lot of ways to get at this, and no single official or standard method.

So, for example, say you want to know how many hours people work illegally. You can't really ask them directly. So instead you have half of your sample tell you how many hours a week they watch TV, and the other half tell you how many hours they watch TV and work illegally - SUMMED.
Then you subtract the first half's mean from the second half's mean; since random assignment means both halves watch about the same amount of TV on average, the difference estimates the hours worked illegally.

Or you make estimates based on what you do know.
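This split-sample trick (sometimes called an unmatched count or list experiment) can be sketched in a few lines of Python, with all the numbers invented:

```python
# Group A reports hours of TV per week; Group B reports TV hours PLUS
# hours worked illegally, as a single sum. No individual respondent
# ever reveals the sensitive number directly. (Data invented.)
group_a = [10, 14, 8, 12, 11, 9]    # TV only
group_b = [18, 22, 15, 20, 17, 16]  # TV + illegal work, summed

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)

# With random assignment, both groups watch similar amounts of TV on
# average, so the difference of means estimates illegal work hours.
estimated_illegal_hours = mean_b - mean_a
print(f"Estimated mean hours worked illegally: {estimated_illegal_hours:.1f}")
```

The privacy comes from the aggregation: any one respondent's answer is uninformative on its own, but the difference between group means recovers the sensitive quantity.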
posted by k8t at 4:09 PM on May 24, 2012


You can dig a little in the links from the ACLU report and learn a bit about their methodology. I started by reading this article about the Clery Act:
The Clery Act requires some 7,500 colleges and universities — nearly 4,000 of which are four-year public and private institutions — to disclose statistics about crime on or near their campuses in annual security reports.
Then they explain the issue:
critics point out that the huge percentage of schools reporting no incidents whatsoever indicates a serious problem with Clery data collection. In 2006, in fact, 3,068 two- and four-year colleges and universities — 77 percent — reported zero sexual offenses. Another 501 reported just one or two.
and while this article doesn't specifically explain the numbers they gathered, this page on the same website explains their methodology:
The Center for Public Integrity conducted a survey of on-campus and off-campus crisis clinics and programs that service students, faculty, and staff at four-year public universities ... Of the 260 clinics and programs in the sample, 152 completed the survey for a 58 percent response rate. Respondents were asked, among many questions, how many student sexual assault cases they serviced in the past year.

To compare their answers to official numbers, the Center analyzed Education Department university crime data, which campuses are required to report under the Clery Act. The Center acquired its copy of the data from the National Institute for Computer-Assisted Reporting.

For the analysis, the Center computed a five-year average of sexual assaults for universities whose on-campus and nearby off-campus clinics and programs responded to the survey. The years 2002-2006 were used as these represented the most recent final numbers. Universities can change their initial reports as they learn more about crime on their campuses, meaning the 2007-2008 numbers were still subject to change when the analysis was done. A five-year average was used to smooth out any years with exceptionally high or low incidents. Finally, those averages were compared to the most recent numbers reported by the clinics and programs that service those campuses.
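The arithmetic of that comparison is straightforward; here's a sketch for a single hypothetical campus (all figures invented, not from the Center's data):

```python
# Clery Act figures can be revised after initial filing, so the Center
# averaged five years of "final" numbers to smooth out unusual years.
clery_reports = {2002: 1, 2003: 0, 2004: 2, 2005: 0, 2006: 1}  # invented
clinic_cases = 24  # assault cases serviced by clinics/programs (invented)

five_year_avg = sum(clery_reports.values()) / len(clery_reports)
ratio = five_year_avg / clinic_cases
print(f"Clery average: {five_year_avg:.1f}/year vs "
      f"{clinic_cases} clinic cases ({ratio:.0%} officially reported)")
```

The gap between the two numbers is what headlines then compress into "X% unreported."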
posted by jessamyn at 4:10 PM on May 24, 2012 [5 favorites]


For the campus rape stats, you survey people anonymously. You attempt to conduct a good survey with a high-quality random sample, and you promise complete anonymity. Ask people if they have been raped on campus, perhaps including a definition of rape, and ask if they reported it to law enforcement. You could get anecdotal feedback from the campus counseling service/women's center/local crisis center/emergency rooms/etc... to see that your results are in the right ballpark. Universities survey their own students and communities all the darn time, so much so that they tend to have an Office of Institutional Research that maintains a master calendar of surveys and coordinates all the groups that want to conduct surveys.

The HIV one could be done a couple ways. The exact methodology would be discussed in the study, but is usually glossed over in news articles in the popular press. One approach is that we ultimately know exactly how prevalent HIV is because patients will eventually get sick and can be diagnosed. In the worst case, we'll find out about people's HIV status from the medical examiner postmortem. HIV and AIDS are reportable diseases in most if not all of the US, so we have a very good idea how many men have been reported to health officials as HIV-positive. We can then backtrack and estimate how long it's likely been since infection based on typical progressions of HIV, which would let us calculate the percentage of HIV-positive men who don't know they are infected.

There are other ways to do the HIV study too. You could go out and grab a good random sample of men and force them to give you some blood. Test it and ask the men whether they knew they were seropositive. There are obviously ethical problems with this approach, but you could conduct a decent study that provided for informed consent and appropriate counseling. Many health care providers encourage HIV testing on a routine basis for sexually active patients, so you could get some data from there (keeping in mind that not everyone sees a doctor or agrees to be tested). South Africa tested 80,000 students in schools before cancelling the program; the aggregate results of those tests would help determine the answer to this question, at least for that population.

We also have decent data on HIV transmission rates through various means. You could survey people to ask about their sexual practices, use of shared needles, etc... and plug that information into an epidemiological model of the disease. Add in our knowledge about how many people actually have HIV and testing behavior, and you can see the number of HIV+ untested men at any given time. In other words, we can figure out the answer from the growth rate of HIV+ individuals, which we do know, even though we don't know precisely who has HIV but doesn't know it yet.
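That modeling approach can be caricatured in a few lines; every parameter below is invented, and a real epidemiological model would be far more elaborate:

```python
# Toy model: total infections grow at a rate implied by behavior
# surveys, while surveillance tells us who has been diagnosed. The gap
# between the two is the estimated undiagnosed population.
years = 10
infected = 10_000           # assumed starting prevalence (invented)
diagnosed = 6_000           # known from reportable-disease surveillance
annual_growth = 0.05        # incidence rate implied by the model (invented)
annual_testing_rate = 0.30  # share of undiagnosed people tested each year

for _ in range(years):
    infected += infected * annual_growth            # new (undiagnosed) cases
    diagnosed += (infected - diagnosed) * annual_testing_rate

undiagnosed_share = 1 - diagnosed / infected
print(f"Model estimate: {undiagnosed_share:.0%} of infected men undiagnosed")
```

The point isn't these particular numbers; it's that observed growth in diagnoses plus assumptions about transmission and testing behavior pin down the unobserved pool.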
posted by zachlipton at 4:43 PM on May 24, 2012 [1 favorite]


Asking people whether they were assaulted, then asking those that were if they reported it is flawed, because respondents will err on the side of “of course I did!” when it's a bit ambiguous and the report might have just been some oral testimony that ended up discarded. They'll err in that direction because “police report and a conviction or it didn't happen” is a popular reaction to allegations that these things are taking place in a community.

The Center for Public Integrity's methodology quoted above doesn't have that problem, because it measures reports at both ends and correlates the two measures (you still need a survey sample representative of the population that feeds into the police reports, but that's par for the course).
posted by Tobu at 6:14 PM on May 24, 2012


I am familiar with the HIV study cited there. It was conducted in 21 cities in the US. They sampled men who have sex with men, asked them if they knew their HIV status, and gave them an HIV test (along with other questions). They were able to compare knowledge about status with actual status, and found that of the people who tested positive, 44% were unaware that they were positive (with variation by demographic subgroup). They extrapolate from that to all men who have sex with men.

Please note, there is no "forcing" of people to give blood here. People are recruited into the study, complete an informed consent, and are usually reimbursed for their time. Also, this is not a new methodology, and is part of the Centers for Disease Control and Prevention's National Behavioral Surveillance system.
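The comparison described here is a two-way tabulation of believed status against tested status. A sketch with invented records, chosen so the arithmetic mirrors the 44% figure:

```python
# Each tuple: (believes they are HIV-positive, actual test result).
# All records invented for illustration.
participants = [
    (False, True), (False, True), (False, True), (False, True),       # unaware positives
    (True, True), (True, True), (True, True), (True, True), (True, True),  # aware positives
    (False, False), (False, False), (False, False),                   # negatives
]

positives = [p for p in participants if p[1]]
unaware = [p for p in positives if not p[0]]
pct_unaware = 100 * len(unaware) / len(positives)
print(f"{pct_unaware:.0f}% of positive participants were unaware of their status")
```

The real study additionally weights the sample and breaks the result down by demographic subgroup, but the headline number is this ratio.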

Another article.
posted by gingerbeer at 8:04 PM on May 24, 2012 [2 favorites]


Asking people whether they were assaulted, then asking those that were if they reported it is flawed, because respondents will err on the side of “of course I did!” when it's a bit ambiguous and the report might have just been some oral testimony that ended up discarded. They'll err in that direction because “police report and a conviction or it didn't happen” is a popular reaction to allegations that these things are taking place in a community.

Well, it depends on what exactly you want to know. The Center for Public Integrity study wouldn't include cases in which students felt that they were raped or sexually assaulted but didn't seek direct and immediate services at a health care facility or crisis center. A survey could ask for more details on what kind of report was made if that is useful information. Ultimately, all of these methods produce interesting results, but the trick is figuring out what questions you precisely want to answer and then explaining your results in a way that clearly articulates exactly what you found.

In this case, the Center for Public Integrity was looking at the comprehensiveness of Clery Act reports. For their purposes, they wanted to compare the number of publicly reported police investigations to a broader metric that includes many (but not all) such crimes. Their methodology makes a lot of sense for such an investigation. It doesn't really say that 95% of campus rapes go unreported; it says that 95% of the rapes and sexual assaults of students who sought services from on- or off-campus clinics and programs don't appear in campuses' Clery Act reports.

This methodology doesn't particularly help us answer other questions though, because it neither examines the total number of student rapes/sexual assaults nor looks into whether and how students attempted to make a report. Maybe 100% of victims made reports to the police and 95% were found to be unfounded (unlikely story). Or maybe 100% of victims made reports to the police, many were validated, but the police/university cooked the stats for the public report. Maybe victims made initial inquiries with the police but never filed a formal police report. Maybe they filed reports with other law enforcement agencies and the data wasn't included in the campus report. There's also a difference between a student who receives counseling at a campus mental health center for a rape or sexual assault and one who meets the legal definition of a victim of a campus rape/sexual assault. The crime might have occurred in the past or in another city, or a student might want counseling about something they are confused or upset about that doesn't rise to the level of a crime.

Ultimately, you'd need a different study to fully analyze the crime->treatment/counseling->law enforcement contact->formal police report->police investigation->arrest->Clery Act report pipeline if that's what you're interested in. The danger comes when you or the media report a study that says one very specific thing as one that says something much broader, even though the distinction may seem subtle. Real researchers tend to be extremely careful about their conclusions, but breathless news reports aren't so subtle.

Indeed, in light of your “police report and a conviction or it didn't happen” comment, perhaps the most interesting data would be to compare the results of a survey with the number of publicly reported crimes in Clery Act reports.
posted by zachlipton at 9:41 PM on May 24, 2012


In addition to the kinds of studies mentioned above, HIV stats also draw on the fact that roughly 90% of HIV-positive people will, if untreated, eventually develop symptoms. So we can look at how many people each year find out their HIV status when they are hospitalized with their first opportunistic infection. It typically takes 5-10 years after contracting HIV to develop an opportunistic infection. Thus, we can use the "discovers HIV at hospitalization" number to estimate how many people are living with HIV but do not know their HIV status.
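A back-of-the-envelope version of this estimate (all numbers invented): if undiagnosed people surface, on average, about 7.5 years after infection, then by a Little's-law-style argument the steady-state undiagnosed pool is roughly the yearly discovery count times the mean years undiagnosed.

```python
# Rough steady-state estimate: pool ≈ arrivals/year × mean years in pool.
# Both inputs are hypothetical, not real surveillance figures.
discovered_at_hospitalization_per_year = 2_000  # invented count
mean_years_undiagnosed = 7.5                    # midpoint of the 5-10 year range

estimated_undiagnosed_pool = (discovered_at_hospitalization_per_year
                              * mean_years_undiagnosed)
print(f"≈{estimated_undiagnosed_pool:,.0f} people living with undiagnosed HIV")
```

A real estimate would correct for people diagnosed by routine testing before symptoms appear, but the multiplication captures the logic.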
posted by hworth at 7:59 AM on May 25, 2012


In other words, estimates and projections.
posted by tacodave at 3:54 PM on May 25, 2012


This thread is closed to new comments.