It is travel season! Meet staff at conferences around the country and learn about CIRP and other current HERI research.

Posted by Ellen Stolzenberg on October 15th, 2014 in News, News Homepage, Uncategorized | No Comments »

While football season is well under way, it is time for conference season to begin!

Part of our mission is to produce and disseminate original research. We also help colleges and universities use both our research findings and their own institutional data to foster institutional understanding and improvement. We will be presenting and exhibiting at several national and regional conferences over the next few months. If you are attending any of these conferences, please come by the booth or our presentation(s) to say hi. Please contact me if you would like to set up a specific time to meet (stolzenberg@gseis.ucla.edu).

IUPUI Assessment Institute
Indianapolis, IN
October 19-21, 2014
Calculating Persistence: Using Student Demographic and Experiential Backgrounds to Estimate First-Year Retention (Monday, October 20th, 4:30-5:30 PM)

AACRAO Strategic Enrollment Management Conference
Los Angeles, CA
October 26-28, 2014
Estimating First-Year Retention: Tools for Enrollment Management (Tuesday, October 28th, 2:15 PM)

California Association for Institutional Research (CAIR)
San Diego, CA
November 19-21, 2014
Conference Sponsor (Come meet Dominique and Ellen at the booth.)
From Administration to Z-Scores: An Overview of Survey Research (Friday, November 21st, 10:00-10:45 AM)

Association for the Study of Higher Education (ASHE)
Washington, DC
November 20-22, 2014
Stay tuned for presentation information!

Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) Annual Meeting
Nashville, TN
December 6-9, 2014
Exhibitor (Ellen will be at the booth)

Assessing the Pervasiveness of Sexual Assault on College Campuses: The Importance of Data

Posted by Kevin Eagan on October 13th, 2014 in News, News Homepage, Research, Surveys | No Comments »

The past year has brought intense scrutiny of colleges and universities over the issue of sexual assault on campus, particularly since the White House issued a report last January that included an alarming statistic: one in five women are sexually assaulted while in college. As of August, the U.S. Department of Education is investigating more than 75 colleges and universities for their handling of sexual assault allegations.

This attention has prompted draft legislation and newly enacted laws to combat the reported prevalence of sexual assault on college campuses. On September 28, California Governor Jerry Brown signed a bill requiring public colleges and universities in the state to revise their rape-prevention policies to include an “affirmative consent” standard, which requires both parties engaging in sexual activity to affirmatively consent – passive (silent) assent is unacceptable under the new statute. In September, the Obama administration, supported by the NCAA and a number of media companies, unveiled the “It’s On Us” campaign aimed at combating sexual violence on college campuses.

New York Senator Kirsten Gillibrand and Missouri Senator Claire McCaskill continue to lead efforts on Capitol Hill to enact legislation designed to promote greater institutional accountability for handling of sexual assault allegations on campus. The draft bill would require annual anonymous surveys of all college students to provide a more complete picture of the prevalence of sexual violence on campus. The Campus SaVE (Sexual Violence Elimination) Act, passed in 2013, went into effect last week; one of the provisions of this new law requires campuses to maintain greater transparency and accountability about the reporting and handling of sexual assault allegations. Additionally, the new law includes language pertaining to same-sex sexual assault – an important step forward from previous versions of this law.

The increased media attention and new regulations raining down on institutions from local, state, and federal policymakers necessitate that colleges and universities better understand the prevalence of the issue on their campuses. Many instances of sexual assault go unreported, and it is imperative that institutional leaders improve their understanding of the broader climate on their campuses.

Surveys are a key tool to explore a number of aspects of climate on college campuses. For the past five years, the Higher Education Research Institute has provided such a tool in the Diverse Learning Environments (DLE) survey. In response to the growing national conversation about sexual assault on college campuses, we have added a set of questions to the DLE that ask respondents about their experiences with unwanted sexual contact since entering their current institution. Students who report having had unwanted sexual contact see a short set of questions pertaining to the incident(s) – whether the perpetrator used physical force, whether the survivor was incapacitated when the incident happened, and whether or to whom the survivor has reported the incident.

Our mission at the Higher Education Research Institute has always consisted of two parts: “to inform educational policy and promote institutional improvement through an increased understanding of higher education and its impact on college students.” These important changes to the 2014-15 DLE survey accomplish both tenets of this mission. First, we aim to provide colleges and universities with actionable information about the prevalence of sexual assault on campus. Some institutions may not be quite ready to see this information, but they would be better served to squarely address this issue before additional regulations (or sanctions) are imposed on them by policymakers.

Second, we aim to contribute to the important conversation about sexual assault on college campuses. The widely cited statistic that 20% of women experience sexual assault while in college is based on a study of two public universities; thus, although the government and media have latched on to this figure, we really do not know how representative this statistic is across institutional types, geographic regions, or student characteristics (indeed, some media have been skeptical of these numbers). The point of the Cooperative Institutional Research Program, a national longitudinal study of the American higher education system, is to aggregate data across institutions to provide, if not national, at least multi-institutional perspectives about the college experience. We hope this important addition to a survey that already focuses on other campus climate issues can further advance our objective.

"As Reauthorization Turns": Data and the Reauthorization of the Higher Education Act

Posted by Lesley McBain on September 23rd, 2014 in News, News Homepage | Comments Off

The process of reauthorizing the Higher Education Act (HEA) has once again begun in Washington, DC. While this may seem remote from survey administration and analysis, current draft bills circulating to reauthorize HEA pertain to campus-level student data in highly specific ways.

The Republican and Democratic approaches to reauthorizing HEA differ: the Republicans have thus far offered a white paper on their priorities and three separate bills on different aspects of HEA. The white paper focuses on accountability and informed decision-making by “the consumer” and calls for reforming federal education data collection to achieve this goal: “For example, information collected by IPEDS must be improved and the delivery of information streamlined to reduce confusion…. The IPEDS data collection must be updated so it captures more than first-time, full-time students” (pp. 2-3).

The bill most pertinent to institutional and survey researchers is H.R. 4983, introduced by Representative Virginia Foxx (R-NC). As of July 24, the bill has passed the House of Representatives and been referred to the Senate. H.R. 4983 eliminates certain data elements from the federal College Navigator website that were signatures of the last HEA reauthorization—the college affordability and transparency lists (aka “watch list”), the state higher education spending charts, and the multiyear tuition calculator—but also revamps College Navigator in many ways. These include adding links to Bureau of Labor Statistics (BLS) data on both regional and national starting salaries “in all major occupations” and links to “more detailed and disaggregated information” on institutions’ “faculty and student enrollment, completion rates, costs, and use and repayment of financial aid” (Congressional Research Service [CRS] bill summary). Detailed minimum requirements would also be imposed on the net-price calculator currently in operation, including distinguishing veterans education benefits from other financial aid and providing links to information on those benefits if the calculator does not estimate their eligibility (CRS bill summary).

The contrasting Democratic approach has been Senator Tom Harkin (D-IA)’s release of a draft omnibus bill reauthorizing all of HEA. Specific sections readers may find most interesting—though reading the entire 785-page bill is useful for contextualizing data issues—are Title I, Secs. 108, 109, and 113, as well as Title IX, Sec. 902. These sections relate to College Navigator and the College Scorecard (Title I, Secs. 108 & 109), a new complaint resolution and tracking system (Title I, Sec. 113), and a new data center tracking data on students with disabilities (Title IX, Sec. 902). In addition, a joint letter from the American Council on Education and 20 other higher education associations suggests that a unit-record database provision will be reinserted into this draft.

The Democratic draft bill adds data to College Navigator such as “the number of returning faculty (by full-time and part-time status, tenure status, and contract length)” (Sec. 108). In addition, it substantially expands the College Scorecard to include items such as average net price broken out by enrolled students’ family income, completion percentages for all undergraduate certificate- or degree-seeking students, both term-to-term and year-to-year persistence percentages for all undergraduate certificate- or degree-seeking students, and the percentage of students who transfer from two-year institutions to four-year institutions within 100% and 150% of normal time; comparisons to other institutions are required for all of these items. Institutions would be required to make their most recent College Scorecard publicly available on their websites and to distribute it to prospective and accepted students, regardless of whether the information was requested, “in a manner that allows for the student or the family of the student to take such information into account before applying or enrolling” (Sec. 133(i)(2)(B), p. 51).

The new complaint resolution and tracking system (Title I, Sec. 113) would create a new federal complaint system—and office within the Department of Education—to track and respond to complaints from students, family members of students, third parties acting on behalf of students, and staff members or employees of institutions against higher education institutions receiving funds authorized under HEA. Complaints are defined as “complaints or inquiries regarding the educational practices and services, and recruiting and marketing practices, of all postsecondary educational institutions” (Sec. 161(b)(1), p. 60). Institutions would be required to respond to the Secretary of Education within 60 days, including what steps they have taken to respond, all responses received by the complainant, and any additional actions taken or planned in response (Sec. 161(c)(2), pp. 60-62). The Department of Education would publish the number and type of complaints and inquiries received as well as information about their resolution; it would also be required to submit a report to the authorizing committees on the type and number of complaints, the extent to which they have been resolved, patterns of complaints in any given sector of postsecondary institutions, legislative recommendations from the Secretary to “better assist students and families,” and the schools with the highest complaint volume as determined by the Secretary (Sec. 161(4)(d)(A-E), pp. 65-66).

The creation of the National Data Center on Higher Education and Disability (Title IX, Sec. 902) would require institutions participating in Title IV financial aid programs to submit information on their programs and services for students with disabilities, including “individual-level, de-identified data describing services and accommodations provided to students with disabilities, as well as the retention and graduation rates of students with disabilities who sought disability services and accommodations from the institution of higher education” (Sec. 903(b)(6), p. 611) and “shall collect, organize, and submit such data in a way that supports disaggregation” (Sec. 903(c), p. 611) by 13 specified disability categories (Sec. 903(a), p. 610). These data would be made available to the public.

As always, the road to HEA reauthorization is long and winding. While Senator Harkin recently mentioned trying to move forward on HEA during the lame duck session of Congress, it is by no means certain. The issues of data collection raised by draft provisions on both the Republican and Democratic sides, however, are crucial both in research and practice. What data are most useful to parents, students, and other higher education stakeholders who need data to better serve their student and institutional populations? Are those even the same data points? What purpose—or how many purposes—should federal postsecondary education data collection serve? How much data collection is too much data collection? What privacy issues are raised by proposed HEA provisions? Is federal legislation even the place to answer these questions?

The best advice we can offer is to stay tuned and stay informed. Researchers and survey administrators/analysts not only have a critical stake in the outcome but can also provide an informed perspective to the various parties involved in reauthorization.

August Poll of the Month: Using Incentives to Increase Response Rates

Posted by Kevin Eagan on September 11th, 2014 in News, News Homepage | Comments Off

Our August poll of the month asked about ways in which researchers incentivize survey participation. Survey research has demonstrated that some incentive is better than no incentive, and other studies have investigated how well certain types of incentives work.

The August poll had 83 respondents, and 60% of those who answered the poll indicated that they utilized a guaranteed incentive to increase response rates. These guaranteed incentives might be small gift cards ($5, $10), cash, or even candy.

We had one campus this past spring that used candy bars as an incentive to encourage participation in the College Senior Survey and the Your First College Year survey, and response rates for both administrations exceeded 67%! It just goes to show that providing incentives does not have to be expensive if the right marketing approach is implemented. Other campuses promise additional tickets to graduation for their senior surveys, which can be a particularly effective approach for venues with quite limited capacity.

Raffles represented the second most popular incentive used by respondents, with 40% indicating that promising entry to a raffle or drawing was one strategy they used to increase response rates. With raffles, the trick is to find the right prize that fits within the survey administration budget and piques participants’ interest. Popular raffles we see our campuses using include parking passes, iPads, and movie tickets. One strategy for survey administrators considering a raffle as an incentive is to raffle a prize every week – the earlier potential participants complete the survey, the better their odds of winning a prize.

More than a third (36%) of respondents to the poll reported using coupons/discounts for campus or local vendors, and this approach represents an opportunity to create partnerships on and off campus. Local businesses are often looking for ways to attract students, faculty, and staff to their establishments, and this could be an effective win-win strategy for these businesses and survey administrators. Likewise, surveys often have multiple stakeholders on campus, and leveraging other units’ interests in the data into tangible incentives can help create buy-in while increasing response rates.

One in ten respondents use food as an incentive, and we often see this approach used in residence halls. Resident Assistants (RAs) might host a survey pizza party, or perhaps survey researchers will host a lunch for faculty and ask them to complete the survey. Such an approach can both increase response rates and create a sense of community on campus.

We like to have fun with our monthly polls, but, to be clear, researchers have spent a great deal of time and money to understand the effectiveness of incentives at increasing response rates. For those interested, Sanchez-Fernandez et al. (2010) provide a nice overview of studies that have focused on guaranteed incentives (both pre- and post-survey), and in their own analyses they find that providing a guaranteed incentive significantly increased response rates. Porter (2004) outlines a more holistic approach to conducting an accurate, efficient, and high-quality survey.

We hope you’ll take a couple of minutes to complete September’s poll of the month, which asks about survey marketing strategies. Check it out on our main page: www.heri.ucla.edu.

Putting Today's USNWR Rankings into Context

Posted by Kevin Eagan on September 9th, 2014 in News, News Homepage | Comments Off

Undoubtedly a number of presidents, provosts, directors of admissions, and others at elite colleges and universities are either breathing sighs of relief or wringing their hands this morning with the release of the 30th edition of the U.S. News and World Report Best Colleges rankings. As has been the case since the 2011-2012 academic year, Princeton University held the top spot among national universities, with Williams atop the list for national liberal arts colleges.

For all of the hype that the media give to the rankings, a little perspective is in order. The top 20 institutions on the university list collectively enroll slightly more than 150,000 students, which is less than 1.5% of the more than 10.5 million undergraduate students enrolled in four-year degree-granting institutions nationwide. And the lists published today make no mention of the more than 1,500 community colleges currently serving more than 7 million undergraduates nationwide. Indeed, these two-year institutions are educating the most racially and economically diverse cross-sections of college students.

Data from the CIRP Freshman Survey suggest that the attention given to the rankings each year (and, yes, it’s clear that perhaps even writing this blog is contributing to the hype) is unwarranted, as national rankings appear to matter little in students’ college choice process. When we break out the data by selectivity, we find that rankings in national magazines only matter for students at the most selective campuses. Just under a quarter (24%) of students at the most selective campuses indicated that rankings in national magazines were a “very important” factor in their decision to enroll at their current institution. By contrast, 10% and 11% of students attending institutions categorized as “low selectivity” or “medium selectivity” rated rankings as a “very important” factor in their decision process.

Instead, cost and financial aid are increasingly being considered as top factors in students’ college choice process, as we reported in the 2013 Freshman Survey monograph.

Given the lack of importance students place upon rankings in deciding where to attend college and the amount of time institutional research officers put into filling out the annual USNWR survey, what’s the point? Perhaps the rankings help those at the top attract greater numbers of donors and larger gifts, which in turn facilitate their ability to rest atop these lists. But they also encourage poor behavior (a.k.a., cheating, lying) among some colleges and universities – what an example to set for students!

A perennial critique of the USNWR rankings is that the lists focus too little on outcomes, although the methodology does include measures of first-year retention and six-year graduation rates. And this year, the magazine added student loan default rates and campus crime, though the editor of the magazine downplayed the latter statistic and suggested that readers should not place much stock in campus crime rates.

The Obama administration is proposing a rating system of its own that focuses more on outcomes, and the details of this proposal are expected to be released later this fall. HERI Director Sylvia Hurtado joined a panel, hosted by UCLA’s Civil Rights Project, on Capitol Hill last week to advocate for fairness in the measures used. Sylvia presented CIRP data highlighting the importance of using input-adjusted measures to account for differentials in student preparation and campus resources when measuring outcomes like retention and degree completion. When we look at the distribution of campuses that do better than expected given the students they serve and the resources at their disposal, we tend to see a very different ranked list.
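To make the idea of input-adjusted comparison concrete, here is a minimal sketch of ranking campuses by how far their actual graduation rates exceed the rates predicted from their incoming-student and resource profiles. The regression specification and column names are illustrative assumptions, not HERI's published methodology.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical campus-level input measures; the column names are invented for illustration.
INPUTS = ["mean_hs_gpa", "mean_sat", "pct_pell", "spending_per_student"]

def input_adjusted_ranking(campuses: pd.DataFrame) -> pd.DataFrame:
    """Rank campuses by how much their actual graduation rates exceed the rates
    predicted from their incoming-student characteristics and resources."""
    X = sm.add_constant(campuses[INPUTS])
    fit = sm.OLS(campuses["grad_rate"], X).fit()
    expected = fit.predict(X)
    ranked = campuses.assign(expected_rate=expected,
                             residual=campuses["grad_rate"] - expected)
    # Campuses performing better than expected, given their inputs, rank highest.
    return ranked.sort_values("residual", ascending=False)
```

Ranking on the residual rather than the raw rate is what produces the "very different ranked list" described above.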

CIRP Surveys and Accreditation: SACS Guide Updated for 2014

Posted by Ellen Stolzenberg on September 2nd, 2014 in News, News Homepage | No Comments »

Accreditation remains a driving force behind survey use on college campuses. Because CIRP surveys are comprehensive and designed to be longitudinal, they cover a wide variety of outcomes related to student growth and development. CIRP survey data can be used throughout the accreditation process—to engage in institutional self-study, to inform a visit by evaluators, and to respond to a decision handed down by a regional accreditor. The CIRP Accreditation Guides are designed to facilitate using the surveys in the accreditation process.

Each institution approaches accreditation differently—taking into account its mission, goals, programs, policies, and the composition of the faculty and student bodies. One shared element, however, is to understand how the practices and data already available on campus align with accreditation standards. The CIRP Accreditation Guides demonstrate how items from all five CIRP surveys (TFS, YFCY, DLE, CSS, and FAC) connect to standards for each of the regional accrediting bodies. For those that are considering future survey participation, sample timelines help institutions decide when and how often to gather evidence for use in accreditation.

We are in the process of updating all of the guides to correspond to the 2014 CIRP Survey instruments and the updated regional accreditation standards. The Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) has updated its accreditation standards, and the 2014 CIRP Accreditation Guide for SACS reflects those changes in relation to the 2014 survey instruments.

The WSCUC (formerly WASC) guide was updated in May 2014 and the guides for the other accrediting bodies will be updated in the coming months. Previous versions of these guides are available now.

Higher Learning Commission-North Central Association (HLC-NCA)
Middle States Commission on Higher Education (MSCHE)
New England Association of Schools and Colleges (NEASC)
Northwest Commission on Colleges and Universities (NWCCU)
WASC Senior College and University Commission (WSCUC) (Updated May 2014)

Please visit our accreditation webpage, where we highlight examples of how institutions have used CIRP surveys in their accreditation efforts.

The Road from Serres: A Feminist Odyssey by Helen Stavridou Astin

Posted by Ellen Stolzenberg on August 27th, 2014 in News, News Homepage | Comments Off

“You have to go there. The Astins are there!”

This was how I first became aware of Lena and Sandy Astin. I volunteered as an orientation counselor for incoming students at the start of my senior year at Tulane University in 1996. When I expressed an interest in working with college students, the director of the orientation program suggested I look into the emerging field of student affairs. When I told her of a new MA program at UCLA, she said “You have to go there. The Astins are there.”

She was right. I had the opportunity to take a student development class with Lena during the 97-98 school year. I was immediately captivated, simultaneously drawn in by her warmth and intimidated by her intellect and presence. The intimidation immediately became admiration and respect. I learned so much, both in and out of the classroom. To be perfectly honest, it was the lowest grade I received in graduate school, but it filled me with confidence and the desire to go beyond the master’s degree.

In The Road from Serres: A Feminist Odyssey, Lena (to those who know her) tells her incredible story: from the tumultuous nature of growing up in German-occupied Greece during World War II and the leap of faith she took in pursuing a scholarship to complete her undergraduate degree in the United States to her pioneering doctoral studies at the University of Maryland and groundbreaking career.

It was at the University of Maryland that she met fellow graduate student Alexander (Sandy) Astin, her husband of 58 years. Lena shares the challenges and triumphs of both her career and her family, as a proud mom of two boys and grandmother to three beautiful granddaughters. Moving the family and their research to Los Angeles (and UCLA) in the 1970s is a definite turning point. As co-founder of the Cooperative Institutional Research Program (CIRP) and now Senior Scholar and Distinguished Professor Emerita of Education, she maintains an enduring legacy and connection to CIRP and UCLA.

If you know Lena, this memoir reads like an audio book. You can hear her telling all of these stories, some of which she readily shared with her students. Even if you haven’t met her, her spirit is evident in this memoir. Her family is first and foremost in her life. The common thread throughout the book and the essence of Lena as a mentor, professor, and scholar is love. She is the epitome of strength, determination and caring, qualities that weren’t (and often still aren’t) necessarily viewed as assets in academia.

When I returned for the PhD in Higher Education and Organizational Change in 2001, Lena and Sandy were beginning to retire. (Though, knowing them, they will never completely retire.) During the 2 ½ years I worked at HERI during my doctoral program, Lena was always willing to give advice or lend an ear, whether it was about CIRP, my dissertation, or most importantly, my family.

I’ve never shared this with her, but the way she speaks of, and interacts with, her family reminds me very much of my grandmother, who passed away the first week of my doctoral program. One of my favorite Lena moments occurred at the CIRP 40th Anniversary, at which Lena and Sandy were honored by world-renowned scholars in higher education from UCLA and elsewhere. I happened to be sitting behind Lena when her young granddaughters ran in. Her face lit up and the fact that dozens of higher education leaders were there to honor her was irrelevant because her little girls were there. I will never forget that look of pure joy.

The Road from Serres: A Feminist Odyssey provides great insight into Lena’s past and present. She is honest and unfiltered. My admiration continues.

I returned to HERI as the Assistant Director for CIRP nearly six months ago. Lena was one of the first to offer her congratulations.

Results from the July Poll of the Month: Mode of Survey Administration

Posted by Kevin Eagan on August 4th, 2014 in News, News Homepage, Surveys | Comments Off

In HERI’s first-ever Poll of the Month in July, we asked folks about their preferred method of survey administration. We provided three options: paper only, web only, or a combination of both web and paper. In total, we had 59 respondents to our little poll, and the distribution of responses was as follows:

15% reported only using paper surveys
36% rely only on web administrations
Nearly half (49%) incorporate a combination of web and paper administrations into their cycle

For most of our surveys at the Cooperative Institutional Research Program (CIRP), we offer both web and paper options (this past year was an obvious exception for YFCY and CSS). Paper surveys, when administered to a captive audience (i.e., in a proctored setting), represent one of the best methods for ensuring a high response rate. We find that roughly 75% of our campuses that participate in the CIRP Freshman Survey utilize a paper administration, and the vast majority of these schools make the Freshman Norms every year (surpassing a 65% participation rate)! By contrast, just under one-third of “web-only” schools reach this same threshold for the CIRP Freshman Survey.

Administering surveys on paper to freshmen during orientation sessions or to seniors during graduation rehearsal or cap and gown distribution can certainly help to create a norming culture where participation is not only encouraged but also expected. Not all campuses, however, have the personnel resources or the time during orientation or the academic year to allow for an in-person paper administration. Web surveys provide a flexible alternative to a paper survey administration and also represent a “greener” administration option, which we’re finding many students appreciate. Two of CIRP’s surveys – the Diverse Learning Environments survey and the HERI Faculty Survey – are web only instruments, as those surveys provide opportunities for campuses to customize surveys with elective modules.

Campuses using web surveys, however, need to give strong consideration to a number of factors. How good are the email addresses students have on file with the institution? Is the site from which the emails are being sent whitelisted with the campus? Is there an incentive to encourage participation? What additional marketing tools are available to help get the word out about the survey beyond email?

With all of our web surveys, we allow campuses to upload two sets of email addresses to increase the likelihood that the email invitation for the survey reaches students. We also outline specific anti-spam instructions for reps to pass along to the information technology departments to ensure that survey invitations coming from Qualtrics (the vendor we now use for all web surveys except TFS) or DRC (our TFS vendor) do not get tied up in university spam filters. Incentives are certainly key in any survey administration but especially so in web surveys, as students (and faculty!) are bombarded with survey invitations from different units on campus as well as from marketing agencies. The poll of the month for August asks about specific kinds of incentives you use for your surveys, so take a few minutes to respond.

Additional marketing tools for web surveys are also incredibly important to consider. In the event that email invitations are getting tied up in potential respondents’ personal spam filters, having alternative strategies in place to let your sample know about the survey is key. We have seen folks rely on CIRP Infographics, post billboards on buses, and have scrolls in the dining hall as effective advertising methods. Don’t overlook the importance of marketing if choosing to do a web-only survey administration.

The combination of web and paper administration, particularly in the same survey, provides campuses with incredible flexibility. If you start with web surveys, you can send a more targeted set of paper surveys to non-respondents. Obviously the reverse could also work effectively – following up with students who did not submit a paper survey with a targeted set of emails requesting they complete the online version of the instrument. We offer this hybrid option for the CIRP Freshman Survey, College Senior Survey, and Your First College Year survey.
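As a simple illustration of the targeted follow-up step in a hybrid administration, the sketch below selects non-respondents from a web administration so they can receive a paper (or email) follow-up; the file layout and field names are hypothetical.

```python
import pandas as pd

def paper_followup_list(roster: pd.DataFrame, web_respondents: pd.DataFrame) -> pd.DataFrame:
    """Return roster rows for students who have not yet completed the web survey,
    i.e., the targeted group for the follow-up paper administration."""
    completed = set(web_respondents["student_id"])
    return roster[~roster["student_id"].isin(completed)]
```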

Regardless of the method of administration, it is important to ensure proper planning has been done and that personnel are available to assist in the administration process. Hopefully, with the right combination of paper/web administration and marketing that starts early and happens often, you will see those response rates inch higher! Just a reminder, you can go ahead and register for a paper, web, or hybrid administration of the 2014 CIRP Freshman Survey right now, and registration for the YFCY, CSS, and DLE will open in mid-September.

Why Are They Leaving? Understanding Retention with CIRP Data

Posted by Kevin Eagan on July 15th, 2014 in News, News Homepage, Surveys | Comments Off

We’ve been thinking quite a bit about retention at the Higher Education Research Institute this summer. We have the Retention and Persistence Institute just two weeks away (July 29-30), and we unveiled a couple of new tools in beta stage at the Association for Institutional Research (AIR) annual forum in Orlando earlier this summer. Additionally, last week’s article in Inside Higher Ed on the decline in first-year retention rates nationwide caught our attention.

The article reviewed a report released by the National Student Clearinghouse showing that first-year persistence rates (reenrollment at any institution for the second fall term) had declined 1.2 percentage points across all sectors since 2009, while the first-year retention rate (reenrollment in the second fall term at the same institution) had remained virtually unchanged since 2009.
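As a concrete illustration of the distinction the Clearinghouse report draws, the sketch below shows one way to compute the two rates from a small, invented enrollment file; the column names and data are hypothetical and not the Clearinghouse's actual format.

```python
import pandas as pd

# Hypothetical first-year cohort records (columns invented for illustration).
cohort = pd.DataFrame({
    "student_id":        [1, 2, 3, 4, 5],
    "entry_institution": ["A", "A", "B", "B", "B"],
    "fall2_institution": ["A", "C", None, "B", "B"],  # None = not enrolled anywhere
})

# Persistence: reenrolled at ANY institution for the second fall term.
persistence_rate = cohort["fall2_institution"].notna().mean()

# Retention: reenrolled at the SAME institution for the second fall term.
retention_rate = (cohort["fall2_institution"] == cohort["entry_institution"]).mean()

print(f"Persistence rate: {persistence_rate:.1%}")  # 80.0% in this toy example
print(f"Retention rate:   {retention_rate:.1%}")    # 60.0% in this toy example
```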

Notably, the report provides descriptive information about persistence and retention; however, the report and its associated article offer little insight about why students are not reenrolling for their second fall term – either at their native institution or at any higher education institution. Similarly, the report leaves open the question about the kinds of students who are not persisting in higher education or are not being retained at their home campuses.

These open questions prompted us at HERI to look at a few of the tools and data provided by the Cooperative Institutional Research Program to better understand the kinds of students who are leaving higher education and the reasons for their departure. At AIR this past May, Adriana Ruiz and I presented on a new first-year retention calculator tool that we will be including in the 2014 CIRP Freshman Survey reporting package this year. The calculator follows a similar format and line of research to the graduation rate calculator we introduced to the CIRP TFS reporting package a few years ago.

The first-year retention calculator enables campuses to estimate their expected first-year retention rate based on a set of incoming student characteristics, collected from the CIRP Freshman Survey, that are known to predict retention. Campuses can then compare their expected rates to their actual rates to benchmark whether they are performing better (or worse) than anticipated. The most salient predictors of first-year retention in the model we built for this calculator included the extent to which students felt depressed (negative predictor), self-rated emotional health (positive predictor), having an expectation to transfer (negative predictor), and entering college with major concerns about their ability to finance their college education (negative predictor). Notably, students who express an inclination toward transfer when taking the CIRP TFS during orientation or the first few weeks of their fall term are the ones most likely not to reenroll for the fall of their second year.

Financial aid measures (grants and parental resources positively predicted retention, while relying more heavily on loans negatively predicted it), pre-college preparation measures (higher high school GPAs and SAT scores were both positive predictors), and having chosen the particular institution based upon its cost of attendance (positive predictor) also significantly and substantively predicted whether students returned to their home institution for the fall of their sophomore year. This model was built from more than 210,000 respondents to the 2004 CIRP Freshman Survey across 356 colleges and universities. We matched students’ TFS data with enrollment and completion data from the National Student Clearinghouse.
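For readers curious about the mechanics, here is a minimal sketch of the expected-versus-actual benchmarking the calculator performs, assuming a student-level logistic regression on TFS-style predictors. The variable names and the scikit-learn workflow are illustrative assumptions, not the actual CIRP model or code.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Illustrative predictor names only; the real calculator draws on CIRP TFS items.
PREDICTORS = ["felt_depressed", "emotional_health", "expect_transfer",
              "finance_concern", "hs_gpa", "sat_score", "loan_reliance"]

def fit_retention_model(students: pd.DataFrame) -> LogisticRegression:
    """Fit a logistic regression predicting `retained` (1 = reenrolled at the
    same institution the following fall) from entering-student characteristics."""
    model = LogisticRegression(max_iter=1000)
    model.fit(students[PREDICTORS], students["retained"])
    return model

def benchmark_campus(model: LogisticRegression, campus: pd.DataFrame) -> dict:
    """Compare a campus's actual retention rate with the rate the model expects
    given the characteristics of its entering students."""
    expected = model.predict_proba(campus[PREDICTORS])[:, 1].mean()
    actual = campus["retained"].mean()
    return {"expected": expected, "actual": actual, "difference": actual - expected}
```

A positive difference would indicate a campus retaining more students than expected given its entering class, which is the comparison the calculator is designed to support.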

The Cooperative Institutional Research Program’s (CIRP) Your First College Year survey has, for the last several years, included a retention module with the online version of the instrument. Although the sample size for this module tends to be small, given that most campuses opt for our paper instrument or choose to survey only those students still enrolled at the institution during the time of administration, some of the top reasons for leaving are instructive.

Among the 246 students who completed the retention module in 2011, 40.3% indicated a lack of community as being “very important” in their decision to leave. More than one-third (36.2%) noted that their financial aid package’s inadequacy was a “very important” factor in their departure decision, and 41.7% said their inability to afford college was a “very important” reason for leaving. More than 40% (40.2%) said they left in part due to their preferred major not being offered while more than one-third (35.2%) noted academic difficulties related to academic probation, suspension, or expulsion as very important reasons for leaving.

It’s clear that the reasons for leaving college are as diverse as the students who leave, but we consistently see academic and financial difficulties as key factors in students’ departure decisions. Coupled with the findings from the first-year retention calculator, it’s clear that prior preparation and financial aid/college cost will continue to have outsized roles in determining how successful campuses are in retaining students – both into the second year and through to degree completion.

HERI Poll of the Month

Posted by Kevin Eagan on July 2nd, 2014 in News, News Homepage | Comments Off

You may have noticed a new feature that we added to the web site this week: the HERI Poll of the Month. We will change the question each month and post a short blog about the previous month’s results. This is one way we hope not only to connect with our CIRP representatives, affiliated scholars, fellow researchers, and other members of the community but also to learn more about the ever-growing CIRP family.

This month’s poll asks about how you prefer to survey students (web, paper, or a combination of web and paper). Please don’t be alarmed – we have no plans to do away with paper surveys. In fact, the 2015 Your First College Year survey and the 2014-15 College Senior Survey will once again have a paper option. We are in the final negotiation phase with a vendor for these surveys, and we continue to offer a paper option for the CIRP Freshman Survey, as we have for the past 49 years. (Next year is our big 50th…stay tuned for more details!)

So take a few seconds to participate in the poll (it’s good for survey karma), and be sure to check back in early August for the results.