How We Do It, And Other CIRP Secrets
“How do you figure out what is important to report in the Freshman Monograph each year?”
This was a question posed by one of the reporters I talked with at the Education Writers Association conference held at UCLA this past weekend. I figured that if it was interesting to a reporter from one of the major media outlets in the country, maybe it would be worth a blog post. So, here it is.
It’s mid-September. We are watching schools return completed questionnaires, and watching the ones that have not yet done so even more closely. There are always a few schools where some event (a staff transition, a hurricane, an office move) has left the completed questionnaires lost in the shuffle. By the time we get all the responses in, it is the end of October.
We give ourselves 10 days to process the results, so it gets really busy toward the end of the month. At the same time, we have been asking schools to complete forms telling us, in part, how many first-time, full-time students were on campus this fall. Last of all, we get the data file with all the responses. Then the fun begins.
The first thing we do is look at what we’ve got. We take the number of completed questionnaires we have from a school and compare it to the number of first-time, full-time students who entered that institution this fall. This gives us each school’s participation rate: what percentage of its entering students do we have responses from? When we create what we affectionately call “the norms,” we want to be sure we have a representative sample of the students entering all types of four-year colleges, so we only include schools that return questionnaires from a large percentage of their first-time, full-time students: at least 65%.
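If you like to see things in code, the participation screen boils down to a few lines. Here’s a minimal sketch in Python, with made-up schools and counts (not our actual processing code):

```python
# Minimal sketch of the participation-rate screen; the schools and
# counts here are invented for illustration.

MIN_PARTICIPATION = 0.65  # schools at or above 65% make it into "the norms"

def participation_rate(completed: int, first_time_full_time: int) -> float:
    """Share of a school's first-time, full-time students who returned surveys."""
    return completed / first_time_full_time

schools = [
    # (school, completed questionnaires, first-time full-time enrollment)
    ("College A", 820, 1000),  # 82% participation -> included
    ("College B", 400, 1000),  # 40% participation -> excluded
]

norms_schools = [name for name, done, total in schools
                 if participation_rate(done, total) >= MIN_PARTICIPATION]
print(norms_schools)  # ['College A']
```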
Then we weight the data. For 2011, this means we take the 203,968 students we have from our “norms” institutions (remember, those with high enough participation rates) and statistically make them look like the 1.5 million students who entered college for the first time as full-time students at four-year schools this fall. Weighting is a two-stage process: first we take into account the sex of the student, and then the institution type. For instance, we might weight a woman at a medium-selectivity private university by 3.58. What does that weight mean? It means that for every one woman at that particular type of school who answered the survey, in our database of 203,968 respondents, we count her as 3.58 women in the “weighted” dataset of 1,448,051 students. Serge Tran, our associate director for data management and analysis, is hard at work at this point running numbers and checking figures.
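In code, a single weighting cell works out like this. The counts below are invented to reproduce the 3.58 example; the real two-stage procedure (sex, then institution type) involves many such cells:

```python
# Invented counts chosen to reproduce the 3.58 example above; the real
# weighting covers every combination of sex and institution type.

cell = ("woman", "medium-selectivity private university")

population = {cell: 35_800}  # entering students in this cell nationally
sample     = {cell: 10_000}  # survey respondents in this cell

# Each respondent counts as (population / sample) students after weighting.
weights = {c: population[c] / sample[c] for c in sample}
print(weights[cell])  # 3.58

# So 10,000 respondents stand in for 35,800 entering students in this cell.
```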
Now we have our data ready to go!
Next Monday the CIRP staff will meet and look at two documents. One will provide the responses for each question on the 2011 CIRP Freshman Survey. The other will summarize results for those questions going all the way back to 1966, the first year of the survey. This second document helps us see the changes that took place over the past year (Serge will have added a column at the end showing the change from last year to this year). Any response that has changed by roughly two or three percentage points or more from the previous year gets flagged, especially if it has been trending in that direction for a few years. Then we look for patterns. Here’s an example from last year.
One of the first things that stood out in the 2010 results was that self-rated emotional health had taken a dive to the tune of 3.4 percentage points! Not only that, but when we looked at the larger trend, this was the lowest that particular rating had been in the 25 years we have been asking about it.
Then we noted that reports of feeling frequently “overwhelmed by all I had to do” had increased two percentage points. When we saw several changes related to stress and emotional health, we knew we had some interesting results to discuss. True enough, when the report was released in January 2011, it was front-page news.
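The flagging step itself is nothing fancy. Here’s a minimal sketch using those two 2010 items, with baseline percentages invented to match the changes we reported:

```python
# Baseline percentages are invented for illustration; only the year-over-year
# changes (-3.4 and +2.0 points) match what we reported for 2010.

FLAG_THRESHOLD = 2.0  # percentage points

trends = {
    "self-rated emotional health": {2009: 55.3, 2010: 51.9},
    "frequently overwhelmed by all I had to do": {2009: 27.1, 2010: 29.1},
}

for item, by_year in trends.items():
    change = by_year[2010] - by_year[2009]
    if abs(change) >= FLAG_THRESHOLD:
        print(f"FLAG: {item} changed {change:+.1f} points since last year")
```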
So, to make the long story a little less long: it starts with people at colleges and universities all across the country who understand the importance of finding out about their first-year students. Then comes the cooperation of hundreds of thousands of college students who share their hopes and dreams about the next four years with us. Finally, a dedicated team pores over every result with the goal of improving our understanding of the college experience.
And that’s how we do it.