Every year that I teach “Quantitative Analysis in Political Science,” I work with the students to design and field a poll of Bowdoin students. The exercise is meant to give students exposure to the challenges of question wording and survey analysis. We usually ask standard “feeling thermometer” questions about particular Bowdoin administrators and institutions, such as Clayton Rose. Analyzing the data is always fun. We field the poll through Bowdoin email (using a random sample of 500 Bowdoin student email accounts), which allows us to get pretty good response rates. (I should note, though, that even this approach is getting challenging, as many offices on campus now survey students with emailed survey links. As a result, response rates are declining a bit over time, not unlike the long-standing decline in response rates for telephone polls of voters.)
One interesting analysis is to look at the mean approval of Rose (on a scale of 0 to 100, with 100 being the highest approval rating) across the dates of completed interviews. That plot is below, with circle sizes scaled by the number of students who completed the survey on each day. You can easily see that responses are most frequent just after the initial email solicitation, and they spike again after each of my email reminders.
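For readers curious how a plot like this is built, here is a minimal sketch in Python. The data are entirely hypothetical (made-up dates and approval scores standing in for the real survey responses); the point is just the mechanics: aggregate a mean and a count per interview date, then scale each circle by that day’s response count.

```python
# Sketch of the plot described above: mean daily approval (0-100),
# with marker size proportional to the number of completed surveys
# that day. All data below are hypothetical, for illustration only.
from collections import defaultdict
from datetime import date

# Hypothetical (interview_date, approval_score) pairs.
responses = [
    (date(2017, 3, 1), 72), (date(2017, 3, 1), 65), (date(2017, 3, 1), 58),
    (date(2017, 3, 2), 80),
    (date(2017, 3, 8), 61), (date(2017, 3, 8), 70),
]

# Aggregate scores by interview date.
by_day = defaultdict(list)
for d, score in responses:
    by_day[d].append(score)

days = sorted(by_day)
means = [sum(by_day[d]) / len(by_day[d]) for d in days]
counts = [len(by_day[d]) for d in days]

# Plot: circle size scaled by the day's response count.
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.scatter(days, means, s=[40 * c for c in counts])
ax.set_xlabel("Interview date")
ax.set_ylabel("Mean approval (0-100)")
fig.savefig("polar_poll_approval.png")
```

A trend line could be layered on top with something like `numpy.polyfit` over the daily means, though weighting each day by its response count would be the more defensible choice, given the point about low-response days below.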
What does the trend line suggest? Not much, though perhaps slightly higher approval of Rose over the course of March 2017. Nothing happened last March that would obviously explain this, though. I would note, however, that on the days when only a few students completed the survey, those respondents appear to have slightly higher approval of Rose than the bulk of other respondents. That’s worth thinking about a bit.
The 2018 “Polar Poll,” as I call it, is in the field now. More to come on those results soon…