August Poll of the Month: Using Incentives to Increase Response Rates

Posted by Kevin Eagan on September 11th, 2014 in News, News Homepage | Comments Off

Our August poll of the month asked about ways in which researchers incentivize survey participation. Survey research has demonstrated that some incentive is better than no incentive, and other studies have investigated how well certain types of incentives work.

The August poll had 83 respondents, and 60% of those who answered the poll indicated that they utilized a guaranteed incentive to increase response rates. These guaranteed incentives might be small gift cards ($5, $10), cash, or even candy.

We had one campus this past spring that used candy bars as an incentive to encourage participation in the College Senior Survey and the Your First College Year survey, and response rates for both administrations exceeded 67%! It just goes to show that providing incentives does not have to be expensive if the right marketing approach is implemented. Other campuses promise additional tickets to graduation for their senior surveys, which can be a particularly effective approach for venues with quite limited capacity.

Raffles represented the second most popular incentive used by respondents, with 40% indicating that promising entry to a raffle or drawing was one strategy they used to increase response rates. With raffles, the trick is to find the right prize that fits within the survey administration budget and piques participants’ interest. Popular raffles we see our campuses using include parking passes, iPads, and movie tickets. One strategy for survey administrators considering a raffle as an incentive is to raffle a prize every week – the earlier potential participants complete the survey, the better their odds are of winning the prize.
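The weekly-raffle arithmetic is easy to sketch. In the hypothetical snippet below, the four-week administration window and the carry-forward rule (an entry stays in every remaining weekly drawing) are assumptions for illustration, not a HERI policy:

```python
def drawings_entered(completion_week, total_weeks=4):
    """Count the weekly drawings a respondent is entered in, assuming an
    entry carries forward into every remaining weekly drawing."""
    if not 1 <= completion_week <= total_weeks:
        raise ValueError("completion must fall within the administration window")
    # A week-1 respondent is in all drawings; a final-week respondent in one.
    return total_weeks - completion_week + 1
```

Under these assumptions, a student who responds in week 1 of a four-week window has four chances to win, while a week-4 respondent has only one – which is exactly the early-completion nudge described above.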

More than a third (36%) of respondents to the poll reported using coupons/discounts for campus or local vendors, and this approach represents an opportunity to create partnerships on and off campus. Local businesses are often looking for ways to attract students, faculty, and staff to their establishments, and this could be an effective win-win strategy for these businesses and survey administrators. Likewise, surveys often have multiple stakeholders on campus, and leveraging other units’ interests in the data into tangible incentives can help create buy-in while increasing response rates.

One in ten respondents use food as an incentive, and we often see this approach used in residence halls. Resident Assistants (RAs) might host a survey pizza party, or perhaps survey researchers will host a lunch for faculty and ask them to complete the survey. Such an approach can both increase response rates and create a sense of community on campus.

We like to have fun with our monthly polls, but, to be clear, researchers have spent a great deal of time and money to understand the effectiveness of incentives at increasing response rates. For those interested, Sanchez-Fernandez et al. (2010) provide a nice overview of studies that have focused on guaranteed incentives (both pre- and post-survey), and in their own analyses they find that providing a guaranteed incentive significantly increased response rates. Porter (2004) outlines a more holistic approach to conducting an accurate, efficient, and high-quality survey.

We hope you’ll take a couple of minutes to complete September’s poll of the month, which asks about survey marketing strategies. Check it out on our main page: www.heri.ucla.edu.

Putting Today's USNWR Rankings into Context

Posted by Kevin Eagan on September 9th, 2014 in News, News Homepage | Comments Off

Undoubtedly a number of presidents, provosts, directors of admissions, and others at elite colleges and universities are either breathing sighs of relief or wringing their hands this morning with the release of the 30th edition of the U.S. News & World Report Best Colleges rankings. As has been the case since the 2011-2012 academic year, Princeton University held the top spot among national universities, with Williams atop the list for national liberal arts colleges.

For all of the hype that the media (e.g., here, here, or here) give to the rankings, a little perspective is in order. The top 20 institutions on the university list collectively enroll slightly more than 150,000 students, which is less than 1.5% of the more than 10.5 million undergraduate students enrolled in four-year degree-granting institutions nationwide. And the lists published today make no mention of the more than 1,500 community colleges currently serving more than 7 million undergraduates nationwide. Indeed, these two-year institutions are educating the most racially and economically diverse cross-sections of college students.

Data from the CIRP Freshman Survey would suggest that the attention given to the rankings each year (and, yes, it’s clear that perhaps even writing this blog is contributing to the hype) is unwarranted, as national rankings appear to matter little in students’ college choice process. When we break out the data by selectivity, we find that rankings in national magazines only matter for those students at the most selective campuses. Just under a quarter (24%) of students at the most selective campuses indicated that rankings in national magazines were a “very important” factor in their decision to enroll at their current institution. By contrast, only 10% and 11% of students attending institutions categorized as “low selectivity” or “medium selectivity” rated rankings as a “very important” factor in their decision process.

Instead, cost and financial aid are increasingly being considered as top factors in students’ college choice process, as we reported in the 2013 Freshman Survey monograph.

Given the lack of importance students place upon rankings in deciding where to attend college and the amount of time institutional research officers put into filling out the annual USNWR survey, what’s the point? Perhaps the rankings help those at the top attract greater numbers of donors and larger gifts, which in turn facilitate their ability to rest atop these lists. But they also encourage poor behavior (that is, cheating and lying) among some colleges and universities – what an example to set for students!

A perennial critique of the USNWR rankings is that the lists focus less on outcomes, although the methodology does include measures of first-year retention and six-year graduation rates. And this year, the magazine added student loan default rates and campus crime, though the editor of the magazine downplayed the latter statistic and suggested that readers should not place much stock in campus crime rates.

The Obama administration is proposing a rating system of its own that focuses more on outcomes, and the details of this proposal are expected to be released later this fall. HERI Director Sylvia Hurtado joined a panel, hosted by UCLA’s Civil Rights Project, on Capitol Hill last week to advocate for fairness in the measures used. Sylvia presented CIRP data highlighting the importance of using input-adjusted measures to account for differentials in student preparation and campus resources when measuring outcomes like retention and degree completion. When we look at the distribution of campuses that do better than expected given the students they serve and the resources at their disposal, we tend to see a very different ranked list.

CIRP Surveys and Accreditation: SACS Guide Updated for 2014

Posted by Ellen Stolzenberg on September 2nd, 2014 in News, News Homepage | No Comments »

Accreditation remains a driving force behind survey use on college campuses. Because CIRP surveys are comprehensive and designed to be longitudinal, they cover a wide variety of outcomes related to student growth and development. CIRP survey data can be used throughout the accreditation process—to engage in institutional self-study, to inform a visit by evaluators, and to respond to a decision handed down by a regional accreditor. The CIRP Accreditation Guides are designed to facilitate using the surveys in the accreditation process.

Each institution approaches accreditation differently—taking into account its mission, goals, programs, policies, and the composition of the faculty and student bodies. One shared element, however, is to understand how the practices and data already available on campus align with accreditation standards. The CIRP Accreditation Guides demonstrate how items from all 5 CIRP surveys (TFS, YFCY, DLE, CSS, and FAC) connect to standards for each of the regional accrediting bodies. For those that are considering future survey participation, sample timelines help institutions decide when and how often to gather evidence for use in accreditation.

We are in the process of updating all of the guides to correspond to the 2014 CIRP Survey instruments and the updated regional accreditation standards. The Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) has updated its accreditation standards, and the 2014 CIRP Accreditation Guide for SACS reflects those changes in relation to the 2014 survey instruments.

The WSCUC (formerly WASC) guide was updated in May 2014 and the guides for the other accrediting bodies will be updated in the coming months. Previous versions of these guides are available now.

Higher Learning Commission-North Central Association (HLC-NCA)
Middle States Commission on Higher Education (MSCHE)
New England Association of Schools and Colleges (NEASC)
Northwest Commission on Colleges and Universities (NWCCU)
WASC Senior College and University Commission (WSCUC) (Updated May 2014)

Please visit our accreditation webpage, where we highlight examples of how institutions have used CIRP surveys in their accreditation efforts.

The Road from Serres: A Feminist Odyssey by Helen Stavridou Astin

Posted by Ellen Stolzenberg on August 27th, 2014 in News, News Homepage | Comments Off

“You have to go there. The Astins are there!”

This was how I first became aware of Lena and Sandy Astin. I volunteered as an orientation counselor for incoming students at the start of my senior year at Tulane University in 1996. When I expressed an interest in working with college students, the director of the orientation program suggested I look into the emerging field of student affairs. When I told her of a new MA program at UCLA, she said “You have to go there. The Astins are there.”

She was right. I had the opportunity to take a student development class with Lena during the 1997-98 school year. I was immediately captivated, simultaneously drawn in by her warmth and intimidated by her intellect and presence. The intimidation immediately became admiration and respect. I learned so much, both in and out of the classroom. To be perfectly honest, it was the lowest grade I received in graduate school, but the class filled me with confidence and the desire to go beyond the master’s degree.

In The Road from Serres: A Feminist Odyssey, Lena (to those who know her) tells her incredible story: from the tumultuous nature of growing up in German-occupied Greece during World War II and the leap of faith she took in pursuing a scholarship to complete her undergraduate degree in the United States to her pioneering doctoral studies at the University of Maryland and groundbreaking career.

It was at the University of Maryland that she met fellow graduate student Alexander (Sandy) Astin, her husband of 58 years. Lena shares the challenges and triumphs of both her career and her family, as a proud mom of two boys and grandmother to three beautiful granddaughters. Moving the family and their research to Los Angeles (and UCLA) in the 1970s marks a definite turning point. As the co-founder of the Cooperative Institutional Research Program (CIRP) and now Senior Scholar and Distinguished Professor Emerita of Education, her legacy and connection to CIRP and UCLA continue.

If you know Lena, this memoir reads like an audio book. You can hear her telling all of these stories, some of which she readily shared with her students. Even if you haven’t met her, her spirit is evident in this memoir. Her family is first and foremost in her life. The common thread throughout the book and the essence of Lena as a mentor, professor, and scholar is love. She is the epitome of strength, determination and caring, qualities that weren’t (and often still aren’t) necessarily viewed as assets in academia.

When I returned for the PhD in Higher Education and Organizational Change in 2001, Lena and Sandy were beginning to retire. (Though, knowing them, they will never completely retire.) During the 2 ½ years I worked at HERI during my doctoral program, Lena was always willing to give advice or lend an ear, whether it was about CIRP, my dissertation, or most importantly, my family.

I’ve never shared this with her, but the way she speaks of, and interacts with, her family reminds me very much of my grandmother, who passed away the first week of my doctoral program. One of my favorite Lena moments occurred at the CIRP 40th Anniversary, at which Lena and Sandy were honored by world-renowned scholars in higher education from UCLA and elsewhere. I happened to be sitting behind Lena when her young granddaughters ran in. Her face lit up and the fact that dozens of higher education leaders were there to honor her was irrelevant because her little girls were there. I will never forget that look of pure joy.

The Road from Serres: A Feminist Odyssey provides great insight into Lena’s past and present. She is honest and unfiltered. My admiration continues.

I returned to HERI as the Assistant Director for CIRP nearly six months ago. Lena was one of the first to offer her congratulations.

Results from the July Poll of the Month: Mode of Survey Administration

Posted by Kevin Eagan on August 4th, 2014 in News, News Homepage, Surveys | Comments Off

In HERI’s first-ever Poll of the Month in July, we asked folks about their preferred method of survey administration. We provided three options: paper only, web only, or a combination of both web and paper. In total, we had 59 respondents to our little poll, and the distribution of responses was as follows:

15% reported only using paper surveys
36% rely only on web administrations
Nearly half (49%) incorporate a combination of web and paper administrations into their cycle

For most of our surveys at the Cooperative Institutional Research Program (CIRP), we offer both web and paper options (this past year was an obvious exception for YFCY and CSS). Paper surveys, when administered to a captive audience (i.e., in a proctored setting), represent one of the best methods for ensuring a high response rate. We find that roughly 75% of our campuses that participate in the CIRP Freshman Survey utilize a paper administration, and the vast majority of these schools make the Freshman Norms every year (surpassing a 65% participation rate)! By contrast, just under one-third of “web-only” schools reach this same threshold for the CIRP Freshman Survey.

Administering surveys on paper to freshmen during orientation sessions or to seniors during graduation rehearsal or cap and gown distribution can certainly help to create a norming culture where participation is not only encouraged but also expected. Not all campuses, however, have the personnel resources or the time during orientation or the academic year to allow for an in-person paper administration. Web surveys provide a flexible alternative to a paper survey administration and also represent a “greener” administration option, which we’re finding many students appreciate. Two of CIRP’s surveys – the Diverse Learning Environments survey and the HERI Faculty Survey – are web-only instruments, as those surveys provide opportunities for campuses to customize surveys with elective modules.

Campuses using web surveys, however, need to give strong consideration to a number of factors. How good are the email addresses students have on file with the institution? Is the site from which the emails are being sent whitelisted with the campus? Is there an incentive to encourage participation? What additional marketing tools are available to help get the word out about the survey beyond email?

With all of our web surveys, we allow campuses to upload two sets of email addresses to increase the likelihood that the email invitation for the survey reaches students. We also outline specific anti-spam instructions for reps to pass along to the information technology departments to ensure that survey invitations coming from Qualtrics (the vendor we now use for all web surveys except TFS) or DRC (our TFS vendor) do not get tied up in university spam filters. Incentives are certainly key in any survey administration but especially so in web surveys, as students (and faculty!) are bombarded with survey invitations from different units on campus as well as from marketing agencies. The poll of the month for August asks about specific kinds of incentives you use for your surveys, so take a few minutes to respond.

Additional marketing tools for web surveys are also incredibly important to consider. In the event that email invitations are getting tied up in potential respondents’ personal spam filters, having alternative strategies in place to let your sample know about the survey is key. We have seen folks rely on CIRP Infographics, post advertisements on buses, and run scrolling announcements in the dining hall as effective advertising methods. Don’t overlook the importance of marketing if choosing to do a web-only survey administration.

The combination of web and paper administration, particularly in the same survey, provides campuses with incredible flexibility. If you start with web surveys, you can send a more targeted set of paper surveys to non-respondents. Obviously the reverse could also work effectively – following up with students who did not submit a paper survey with a targeted set of emails requesting they complete the online version of the instrument. We offer this hybrid option for the CIRP Freshman Survey, College Senior Survey, and Your First College Year survey.
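The targeting step in a hybrid administration boils down to a set difference: everyone in the sample minus everyone who has already responded in the first mode. A minimal sketch (the ID lists and function name below are hypothetical, not part of any CIRP tooling):

```python
def followup_targets(sampled_ids, first_mode_respondent_ids):
    """Students who were sampled but have not yet responded in the first
    mode (web or paper); these receive the targeted second-mode follow-up."""
    # Set difference drops respondents; sorting gives a stable mailing list.
    return sorted(set(sampled_ids) - set(first_mode_respondent_ids))
```

For example, `followup_targets(["s01", "s02", "s03"], ["s02"])` returns `["s01", "s03"]` – the two students who would receive the paper (or email) follow-up.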

Regardless of the method of administration, it is important to ensure proper planning has been done and that personnel are available to assist in the administration process. Hopefully, with the right combination of paper/web administration and early, frequent marketing, you will see those response rates inch higher! Just a reminder: you can go ahead and register for a paper, web, or hybrid administration of the 2014 CIRP Freshman Survey right now, and registration for the YFCY, CSS, and DLE will open in mid-September.

Why Are They Leaving? Understanding Retention with CIRP Data

Posted by Kevin Eagan on July 15th, 2014 in News, News Homepage, Surveys | Comments Off

We’ve been thinking quite a bit about retention at the Higher Education Research Institute this summer. We have the Retention and Persistence Institute just two weeks away (July 29-30), and we unveiled a couple of new tools in beta stage at the Association for Institutional Research (AIR) annual forum in Orlando earlier this summer. Additionally, last week’s article in Inside Higher Ed on the decline in first-year retention rates nationwide caught our attention.

The article reviewed a report released by the National Student Clearinghouse showing that first-year persistence rates (reenrollment at any institution for the second fall term) had declined 1.2 percentage points across all sectors since 2009, while the first-year retention rate (reenrollment in the second fall term at the same institution) had remained virtually unchanged since 2009.

Notably, the report provides descriptive information about persistence and retention; however, the report and its associated article offer little insight about why students are not reenrolling for their second fall term – either at their native institution or at any higher education institution. Similarly, the report leaves open the question about the kinds of students who are not persisting in higher education or are not being retained at their home campuses.

These open questions prompted us at HERI to look at a few of the tools and data provided by the Cooperative Institutional Research Program to understand better the kinds of students who are leaving higher education and the reasons for their departure. At AIR this past May, Adriana Ruiz and I presented a new first-year retention calculator tool that we will be including in the 2014 CIRP Freshman Survey reporting package this year. The calculator follows a similar format and line of research to the graduation rate calculator we introduced to the CIRP TFS reporting package a few years ago.

The first-year retention calculator enables campuses to estimate their expected first-year retention rate based on a set of incoming student characteristics, collected from the CIRP Freshman Survey, that are known to predict retention. Campuses can then compare their expected rates to their actual rates to benchmark whether they are performing better (or worse) than anticipated. The most salient predictors of first-year retention in the model we built for this calculator included the extent to which students felt depressed (negative predictor), self-rated emotional health (positive predictor), having an expectation to transfer (negative predictor), and entering college with major concerns about their ability to finance their college education (negative predictor). Notably, students who express an inclination toward transfer when taking the CIRP TFS during orientation or the first few weeks of their fall term are the ones most likely not to reenroll for the fall of their second year.

Financial aid measures (grants and parental resources positively predicted retention, while relying more heavily on loans negatively predicted retention), pre-college preparation measures (higher high school GPAs and SAT scores were both positive predictors), and having chosen the particular institution based upon its cost of attendance (positive predictor) also significantly and substantively predicted whether students returned to their home institution for the fall of their sophomore year. This model was built from more than 210,000 respondents to the 2004 CIRP Freshman Survey across 356 colleges and universities. We matched students’ TFS data with enrollment and completion data from the National Student Clearinghouse.
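The expected-versus-actual comparison described above can be sketched as a logistic model whose per-student probabilities are averaged into a campus-level expected rate. The coefficients below are invented for illustration (HERI's actual model weights are not published in this post); only the sign pattern mirrors the predictors described above:

```python
import math

# Illustrative coefficients only -- the signs match the predictors
# described in the post, but the magnitudes are made up for this sketch.
COEFS = {
    "felt_depressed": -0.30,         # negative predictor
    "emotional_health": 0.25,        # positive predictor
    "expects_transfer": -0.80,       # negative predictor
    "finance_concern_major": -0.45,  # negative predictor
}
INTERCEPT = 1.2

def retention_prob(student):
    """Logistic-model probability that one student returns for a second fall."""
    z = INTERCEPT + sum(w * student.get(k, 0.0) for k, w in COEFS.items())
    return 1.0 / (1.0 + math.exp(-z))

def expected_retention_rate(students):
    """Campus-level expected rate: the mean of per-student probabilities,
    which a campus can then compare against its actual retention rate."""
    return sum(retention_prob(s) for s in students) / len(students)
```

A campus whose actual retention rate exceeds `expected_retention_rate` for its entering cohort is performing better than the model anticipates given the students it serves; a shortfall flags the opposite.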

The Cooperative Institutional Research Program’s (CIRP) Your First College Year survey has, for the last several years, included a retention module with the online version of the instrument. Although the sample size for this module tends to be small, given that most campuses opt for our paper instrument or choose to survey only those students still enrolled at the institution during the time of administration, some of the top reasons for leaving are instructive.

Among the 246 students who completed the retention module in 2011, 40.3% indicated a lack of community as being “very important” in their decision to leave. More than one-third (36.2%) noted that their financial aid package’s inadequacy was a “very important” factor in their departure decision, and 41.7% said their inability to afford college was a “very important” reason for leaving. More than 40% (40.2%) said they left in part due to their preferred major not being offered while more than one-third (35.2%) noted academic difficulties related to academic probation, suspension, or expulsion as very important reasons for leaving.

It’s clear that the reasons for leaving college are as diverse as the students who leave, but we consistently see academic and financial difficulties as key factors in students’ departure decisions. Coupled with the findings from the first-year retention calculator, it’s clear that prior preparation and financial aid/college cost will continue to have outsized roles in determining how successful campuses are in retaining students – both into the second year and through to degree completion.

HERI Poll of the Month

Posted by Kevin Eagan on July 2nd, 2014 in News, News Homepage | Comments Off

You may have noticed a new feature that we added to the web site this week: the HERI Poll of the Month. We will change the question each month and post a short blog about the previous month’s results. This is one way we hope not only to connect with our CIRP representatives, affiliated scholars, fellow researchers, and other members of the community but also to learn more about the ever-growing CIRP family.

This month’s poll asks about how you prefer to survey students (web, paper, or a combination of web and paper). Please don’t be alarmed – we have no plans to do away with paper surveys. In fact, the 2015 Your First College Year survey and the 2014-15 College Senior Survey will once again have a paper option. We are in the final negotiation phase with a vendor for these surveys, and we continue to offer a paper option for the CIRP Freshman Survey as we have for the past 49 years. (Next year is our big 50th…stay tuned for more details!)

So take a few seconds to participate in the poll (it’s good for survey karma), and be sure to check back in early August for the results.

CIRP Surveys and Accreditation: WSCUC (formerly WASC) Guide Updated for 2014

Posted by Ellen Stolzenberg on May 22nd, 2014 in News, News Homepage | Comments Off

Accreditation remains a driving force behind survey use on college campuses. Because CIRP surveys are comprehensive and designed to be longitudinal, they cover a wide variety of outcomes related to student growth and development. CIRP survey data can be used throughout the accreditation process—to engage in institutional self-study, to inform a visit by evaluators, and to respond to a decision handed down by a regional accreditor. The CIRP Accreditation Guides are designed to facilitate using the surveys in the accreditation process.

Each institution approaches accreditation differently—taking into account its mission, goals, programs, policies, and the composition of the faculty and student bodies. One shared element, however, is to understand how the practices and data already available on campus align with accreditation standards. The CIRP Accreditation Guides demonstrate how items from all 5 CIRP surveys (TFS, YFCY, DLE, CSS, and FAC) connect to standards for each of the regional accrediting bodies. For those that are considering future survey participation, sample timelines help institutions decide when and how often to gather evidence for use in accreditation.

We are in the process of updating all of the guides to correspond to the 2014 CIRP Survey instruments and the updated regional accreditation standards. The Western Association of Schools and Colleges (WASC), now known as the WASC Senior College and University Commission (WSCUC), updated its accreditation standards in 2013, and the 2014 CIRP Accreditation Guide for WSCUC reflects those changes in relation to the 2014 survey instruments.

The guides for the other accrediting bodies will be updated in the coming months. Previous versions of these guides are available now.

Higher Learning Commission-North Central Association (HLC-NCA)
Middle States Commission on Higher Education (MSCHE)
New England Association of Schools and Colleges (NEASC)
Northwest Commission on Colleges and Universities (NWCCU)
Southern Association of Colleges and Schools Commission on Colleges (SACSCOC)

Please visit our accreditation webpage, where we highlight examples of how institutions have used CIRP surveys in their accreditation efforts.

Come Learn in LA This Summer at One of Our Institutes

Posted by Kevin Eagan on May 6th, 2014 in News, News Homepage | Comments Off

In case you needed a reason to enjoy warm, sunny Southern California this summer, HERI will again be offering four two-day institutes throughout July and August. These workshops provide opportunities for researchers and practitioners to connect on issues related to diversity research, retention and persistence, innovation in undergraduate STEM education, and faculty teaching and learning. All four institutes will be held at UCLA in Westwood.

The Diversity Research Institute (DRI) will be held July 15-16, and we are excited that guest speakers Adriana Kezar and Rona Halualani will again join HERI Director Sylvia Hurtado to discuss issues of campus climate, diversity program mapping, and institutional transformation. The DRI is perfect for teams of administrators and faculty seeking best practices to assess and improve the climate for diversity on their campuses.

The Retention and Persistence Institute (RPI) will occur July 29-30. HERI Director Sylvia Hurtado and CIRP Interim Director Kevin Eagan will facilitate this workshop focused on the latest in retention and persistence research. Participants will learn how to use HERI’s popular retention calculator while also being among the first to see two new retention calculators that HERI will unveil this summer. Participants will engage in an interactive series of discussions about best practices on campus related to retaining students from diverse backgrounds, and everyone will leave with a campus-specific plan to improve retention and persistence at their institution.

The STEM Summer Institute (SSI) is our newest offering and will feature research from a 10-year study at HERI focused on undergraduate student pathways into and through STEM programs. The SSI will be held August 5-6. We encourage teams of faculty and administrators interested in transforming undergraduate STEM education on their campuses to attend this intensive two-day workshop. In addition to presentations by Drs. Hurtado and Eagan, participants will hear from experts in assessment and institutional research to highlight what can be learned from survey and administrative data. Additionally, we will have representatives from funding agencies provide their perspectives with regard to where funding for STEM educational research is heading.

Finally, we are excited to bring back the HERI Institute on Faculty Work/Life Issues, which will be offered August 12-13. After a successful launch of this workshop last summer, we have decided to extend it to be two full days. Drs. Hurtado and Eagan will again lead the workshop and offer insights from research on faculty pedagogy, stress, retention, and retirement. This year’s workshop will also feature hands-on analysis of the soon-to-be-completed 2014 HERI Faculty Survey. Participants will learn from one another regarding best practices to engage faculty in development opportunities and receive instruction on analyzing HERI Faculty Survey data. We invite researchers interested in faculty issues and administrators seeking strategies to increase faculty engagement on campus to attend.

In addition to these exciting programs, participants will have access to all that the beautiful, diverse city of Los Angeles has to offer. We hope you will consider joining us this summer to learn and network.

First Year Students at a Glance Infographic for TFS 2013

Posted by Kevin Eagan on April 9th, 2014 in News, News Homepage, Surveys | Comments Off

This entry is posted on behalf of Melissa Aragon.

We are excited to highlight a new way we are helping CIRP Freshman Survey participating institutions show off their results. We are in the second year of our CIRP Freshman Survey infographic, and this time we are releasing an infographic highlighting some of the national data and providing a customizable version of the infographic for campuses to add their local findings. This customizable version of the infographic poster enables institutions to report their own results next to the national statistics and easily compare the two figures. To make this process a bit smoother, we also created a codebook to aid institutions in identifying the survey items reflected in the infographic. Please visit our infographic webpage.

Explore our “First Year Students at a Glance” infographic highlighting some of the findings from the 2013 CIRP Freshman Survey in an interesting display of icons and data.  From a longitudinal look at the increasing number of college applications students are submitting to their use of technology and insight on their political viewpoints, the infographic poster provides a visually engaging perspective of the 2013 freshman class.

We have received great feedback about our infographic posters, and we plan to continue to provide this informative tool to share our survey results. Look for more infographics in the near future featuring our HERI Faculty Survey and our student follow-up surveys, including the Your First College Year survey, the College Senior Survey and the Diverse Learning Environments survey. All five of our 2014 surveys are currently open for administration and registration.

We hope you find the infographic useful in stimulating discussions on campus about your TFS results.