Posted by Laura Palucki Blake on February 1st, 2013 in News, News Homepage, Surveys
I was happy to see so much attention in the twitterverse regarding the 2012 Freshman Survey, in particular the finding that more than ever before, students are going to college to get good jobs and make more money. One of the tweets from #AACU13 that stuck with me over the past week came from Ellen Schendel (@LNSchen):
“Most CAO’s report using assessment tools (NSSE, CLA, etc) but only 20% believe they are making good use of the data #aacu13”
Now that you have your CIRP survey findings, both at the national level and for your institution, how will you use them? Are there things that help your campus make better use of survey data, and things that should be avoided? Here are five that come up often. If you have more, we would love to hear them. Please add them in the comments section below.
1) Too much demographic information. I'm in no way saying basic information about the survey is unimportant; it's critical context for the results. But there is no faster way to kill interest in what you have to say than by starting off a presentation with how many students the survey was sent to. Put basic demographics, response rates, information about the timing of the survey, incentives, etc. online or in an appendix, making certain to point out any important caveats that should influence discussion of the results. Tell people where to find that information, and keep it up to date.
2) Overly complex statistical explanations. It is your responsibility to conduct statistically robust analysis, and your campus is counting on you for accurate and reliable data. But the details of how you limited the dataset, weighted the responses, dealt with missing cases, etc. are not the story here; the results are. Your campus wants to know what the survey reveals about where the institution is doing well and where it can improve. Your job is to provide results and to facilitate discussion that helps shape plans for improvement. Lead with the results, and with what they reveal about your students.
3) A dissection of every data point, in survey order. I work in surveys, so I think every question is important and worthy of discussion. It took me way too long to realize other people on campus did not share this view. Presenting survey results is not just about sharing numbers, percentages and significance levels. Part of our responsibility is to turn survey results into information the campus can use, and that starts with making informed choices about what to present and what not to. If a percentage has not moved in years, it might be time to stop reporting on it and move on.
4) Failing to connect data to issues on campus. What issues do you know will capture interest on campus? If the entire campus is talking about campus climate and diversity, don't present findings on intended majors. Lead with what the campus is interested in talking about. Good presentations tell a story with data. The campus can use data to understand who their students are, what they do, and who they will become. Approach your survey results looking for the stories you can tell. If you think of reporting your results as connecting several items together to make meaning, you will not only draw the campus into a deeper discussion; you may also find people coming to you and asking what other stories can be told with the data you have.
5) Be a facilitator, not a gatekeeper. I visit a lot of campuses, and I frequently hear faculty and administrators say they did not know they had data on a topic, or talk about how difficult it is to get results from the office that "owns" the data. View the data you are providing as a foundation for talking about students—their experiences, their needs, their opportunities. It's really just the start of the process of improvement. Share the results across campus with anyone who might be connected to or interested in the issue or topic. Answer their follow-up questions, connect interested faculty, departments, or offices over email, and keep talking about what the results might mean, what other information they might want or need to better understand the issue, and what actions they might want to take as a result.