Original Research
June 25, 2018 5:59 PM
I have created two pieces of new research: (1) a survey and (2) a database. The survey was 30 questions on Google Survey, answered by about 30 respondents in and around public radio. That was all who would respond. The database was created by coding terms, counting and categorizing their occurrences, and then running cross-tabulations to get simple percentages and ratios. In that case, the pledge drive language of 45 stations was analyzed and yielded about 1,980 datapoints.
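For anyone curious what "coding terms, counting, and cross-tabulating" looks like in practice, here is a minimal sketch using only Python's standard library. The station names, categories, and terms are entirely made up for illustration; this is not the poster's actual data or tooling (they used PSPP).

```python
from collections import Counter

# Hypothetical sample: each record is one coded occurrence of a term,
# as a (station, category, term) tuple. All values are invented.
records = [
    ("WABC", "urgency", "deadline"),
    ("WABC", "gratitude", "thank you"),
    ("WXYZ", "urgency", "deadline"),
    ("WXYZ", "urgency", "last chance"),
    ("WXYZ", "gratitude", "thank you"),
]

# The "counting and categorizing" step: occurrences per category.
category_counts = Counter(cat for _, cat, _ in records)

# The "cross-tabulation" step: station x category counts,
# converted to simple percentages of all coded occurrences.
crosstab = Counter((station, cat) for station, cat, _ in records)
total = len(records)
percentages = {key: 100 * n / total for key, n in crosstab.items()}

print(category_counts)
print(percentages)
```

With real data you would load the records from a spreadsheet or CSV export instead of a literal list; the counting and percentage logic stays the same.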
Neither of these was rigorously "scientific," and both would likely be considered "for entertainment purposes" by professional surveyors. I used PSPP on the database. Nonetheless, they do give me a very rough look at what I was after: (a) people associated with an industry giving me their feelings about aspects of that industry, and (b) the frequency of use of terms that customers (listeners) of that industry recognize.
So when including this information in a bibliography, how do I explain how I went about producing these results? Do I even have to? And if I do, does calling them "unscientific" delegitimize them, or does sharing their limitations upfront make readers more aware of those limitations? If it helps, I can include the Google Survey, unless the industrious among you already know how to get it.
Rather than saying they were unscientific, I would simply describe the sample size and the number of questions asked (maybe also include the questions themselves as an appendix to the main research?), and explain what you did much as you have done here. Doing this will give the reader a much clearer picture of what you did than simply labeling it 'unscientific'. IMHO it's more helpful to give the reader the option of finding out how you conducted your research than not.
Also, and I'm not sure if you are aware of this, and apologies if so, but what you have done with the data is not a million miles away from Grounded Theory Data Analysis, which is a common social sciences methodology, and one which several of my research students have used successfully for similar data sets in PhD projects.
posted by Chairboy at 1:06 AM on June 26, 2018
More detail would depend on what context you're citing it in and what format you'd use for published works in that context.
posted by gideonfrog at 6:39 PM on June 25, 2018 [2 favorites]
Response by poster: Thank you both. Very helpful. The work is in Metafilter Projects - PLEDGE: The Public Radio Fund Drive.
posted by CollectiveMind at 10:12 AM on June 26, 2018 [1 favorite]