A group of three researchers with ties to FSU (David Hedlund, Ph.D., Joseph St. Germain, Ph.D., and Marlon McPhatter) released the results of their "Florida State University Seminole Logo Public Opinion Research" study. Simply put, they asked the fans if they liked the new logo.
The resounding answer? No. The fans, or at least the 6,872 who responded to the survey, overwhelmingly prefer the old Seminole logo. This is, of course, not surprising given the enormous backlash that occurred when the first images of the logo leaked.
What's most surprising to me about the study is that the older Seminole fans were the most approving of the new logo. That seems to go against the very logic given to us by the Florida State administration; they told us, oh so long ago, that the logo was being "modernized" to appeal to the younger fans. As it turns out, it's the older generation that likes the logo more than the younger generation. In discussions I've held with fans, I've found there's a reason for this: the older generation is less emotional and more rational. I'm sorry, but it's just a function of age. And with age comes the wisdom that all things change at one point or another. The older generation was also more receptive to the logo change because they trusted the administration more.
With that said, the study yielded more questions than answers, so I went straight to the source. Fortunately, one of the authors of the study, Joseph St. Germain, Ph.D., was kind enough to sit down and respond to my questions. The following is our exchange and some insight into the study.
First, who paid for the study?
This study was done pro bono by David Hedlund, Ph.D. (professor at St. John’s University and FSU alum), Marlon McPhatter (current doctoral candidate at FSU) and myself (Vice President of Kerr & Downs Research and FSU alum) on our own time.
Second, what's the goal of the study? Are you hoping to convince the Florida State administration that they should reconsider the logo change?
Other than a survey conducted on Warchant and people expressing themselves on social media (which is not always a great indicator of what everyone thinks), we weren’t sure what people, on the whole, thought about the new logo. Our goal was to better understand how FSU students, alumni, and the public at large felt about the original logo, the new logo, and Jodi Slade’s logo. We’re not trying to change anything. Our plan from the beginning was to conduct this study and then release the results to the public.
Third, is the opinion scale too vague? How can we meaningfully distinguish between, for instance, the "hate a lot" and "hate a good amount" or "hate some" and "hate a little" categories?
These questions were presented as a Likert-type 0-10 scale. For example, Q2 of the survey reads as follows: “How do you feel about Logo #1? Please rate Logo #1 on a scale of 0-10, where 10 is 'I love Logo #1' and 0 is 'I hate Logo #1.'” Using question types and scales similar to the preceding example is standard in the research industry.
[Note: this still seems odd to me. I personally think the questions should be something simpler, along the lines of "How willing are you to purchase merchandise with the new logo?" with potential responses along the lines of "less willing," "about the same," or "more willing." Granted, I don't do research for a living, but I think the survey would be more meaningful that way, especially if you're trying to prove a point to FSU's administration.]
Fourth, did anything about the study surprise you?
I didn’t have any preconceived notions about what the results would look like. It was good to see that individuals plan to attend the same or slightly more FSU events inside and outside of Tallahassee in the next 12 months, as attendance at sporting events is trending down in most markets.
Finally, do you think 6,872 responses represent a meaningful sample, and how do you explain the roughly 4,000 individuals who agreed to take the survey but did not begin to take it?
More than 6,800 responses is a meaningful sample size. I was very happy to see that many responses. Additionally, the demographic information (age, gender, income, etc.) aligns closely with what one would expect a sample of students and alumni to have.
A total of 11,271 people opened the survey, and we ended up analyzing 6,872 responses. That means more than 60% of all individuals who visited the survey page ended up completing the survey. That’s a great percentage in survey research. As far as the other 4,000 or so individuals go, I can’t know for sure why they didn’t take the survey. However, based on the communications we received from some people and based on our experiences conducting surveys, the following might be reasons they did not take the survey:
1) They wanted to see what the research was about and didn’t think the survey pertained to them
2) They thought the survey was about the use of Native American logos in general
3) They wanted to see what the research was about and the survey looked like it would take more time than they are willing to commit to taking a survey
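[Note: for readers who want to check the numbers, here's a quick sketch. The opened/completed figures are the ones quoted in the answer above; the margin-of-error line uses the textbook formula for a simple random sample, which a self-selected fan survey is not, so treat it as a rough illustration rather than a claim from the study itself.]

```python
# Quick sanity check of the figures quoted in the interview.
import math

opened = 11_271      # people who opened the survey page (from the interview)
completed = 6_872    # responses actually analyzed (from the interview)

completion_rate = completed / opened
print(f"Completion rate: {completion_rate:.1%}")   # ~61.0%, i.e. "more than 60%"

# Rough 95% margin of error for a proportion near 50% with n = 6,872,
# assuming (unrealistically, for a self-selected survey) a simple random sample.
moe = 1.96 * math.sqrt(0.5 * 0.5 / completed)
print(f"Approx. margin of error: +/-{moe:.1%}")    # ~+/-1.2 percentage points
```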
Comments? Questions? Kudos?