Same-Sex Marriage Support Nearly Universal Among Entering College Students

Posted by Kevin Eagan on October 27th, 2014 in Conferences, News, News Homepage

This post is cross-posted at The Huffington Post.

The national landscape for marriage equality has changed considerably in the past month. On Oct. 6, the U.S. Supreme Court declined to hear appeals on five different cases challenging lower courts’ rulings that found same-sex marriage bans to be unconstitutional. The decision paved the way for same-sex marriage in five states immediately (Oklahoma, Virginia, Utah, Wisconsin, and Indiana). Just a few days later, Idaho and Nevada joined the growing number of states allowing same-sex marriage. On Oct. 17, same-sex marriage bans in Alaska and Arizona fell with Wyoming following suit just days later.

Ted Olson, one of the lawyers in the landmark “Proposition 8” Supreme Court case (Hollingsworth v. Perry), declared today that the “point of no return” on gay marriage has now passed. Indeed, it seems clear that the U.S. Supreme Court is signaling to the lower courts that it will not take up the issue of same-sex marriage any time soon, particularly if the lower courts continue striking down state marriage bans for same-sex couples.

As these state bans continue to fall, the federal government has announced that it will immediately begin recognizing same-sex marriages in all 33 states. This decision follows the U.S. Supreme Court decision on the Defense of Marriage Act (DOMA) in 2013 (United States v. Windsor), which held that denying benefits to married same-sex couples was unconstitutional.

It is hard to believe that Congress enacted DOMA less than two decades ago. Right after that law went into effect, the Cooperative Institutional Research Program (CIRP) Freshman Survey at UCLA’s Higher Education Research Institute began asking incoming freshmen their views on same-sex marriage. Since CIRP first started asking the question in 1997, a majority of incoming college students have agreed that same-sex couples should have the legal right to marry; what is remarkable is how strongly incoming students now endorse this position. When the CIRP Freshman Survey last asked this question, in 2012, three-quarters of first-time, full-time students (75.1 percent) agreed that same-sex couples should have the legal right to marry, and nearly all (91.1 percent) of students who identify as “liberal” or “far left” held this view.

[Figure: Support for same-sex marriage among entering college students, by political orientation]

Support for same-sex marriage among “conservative” and “far right” students has increased by more than 20 percentage points since the question first appeared on the CIRP Freshman Survey. A near majority (46.4 percent) of students who identify their political ideology as “conservative” or “far right” now agree that same-sex couples should be allowed to legally marry.

The largest gains in support of same-sex marriage have been among incoming students who identify their political ideology as “middle-of-the-road.” In 1997, a bare majority (51.5 percent) believed same-sex couples should be permitted to marry. By 2008, more than two-thirds (67.7 percent) felt similarly, and that figure jumped more than 10 percentage points by 2012, with 78.9 percent of “middle-of-the-road” students supporting same-sex marriage.

Today’s college students do not just support same-sex marriage; they also support allowing gay and lesbian couples to adopt. In 2013, 83.3 percent of all first-time, full-time college students agreed that gays and lesbians should have the legal right to adopt children.

Most individuals are more than single-issue voters, but given these numbers, it is interesting that some politicians continue to focus so heavily on social issues like same-sex marriage. The spate of court decisions in favor of same-sex marriage in the past two years, and particularly in the past four weeks, has caught up with public opinion. The political views of today’s college students increasingly suggest a growing divide with the “culture wars” being waged by social conservatives. Candidates who continue to emphasize social questions while doing everything in their power to impede progress on an issue such as gay marriage risk alienating this large bloc of potential voters.

The question regarding support of same-sex marriage appeared again on the 2014 CIRP Freshman Survey, and we expect to see even greater support for the issue. The 2015 Freshman Survey likely will be the last time the item appears, as the data make clear that support for same-sex marriage is nearly universal among entering college students.

2014-15 College Senior Survey Registration Is Open, and Paper Is Back!

Posted by Kevin Eagan on October 24th, 2014 in News, News Homepage, Uncategorized

Earlier this week we opened registration for the 2014-15 College Senior Survey (CSS), and we are thrilled to have re-introduced the option for a paper administration of the survey. Campuses that wish to learn more about their December graduates can register now and begin administering the survey on November 14. If you click the registration link, you will also notice the first phase of our new registration portal for the follow-up surveys. Campus representatives can use this portal to register for the survey, track survey responses, download preliminary data, and retrieve final reports.

I wanted to highlight a few important changes to the 2014-15 CSS. In our 2013-14 CSS Administration Report Form, we asked campus representatives about items they’d like to see on future CSS instruments, and we listened. For the first time in its history, the CIRP CSS includes items related to sexual orientation and gender identity – using the same items that have appeared on the Diverse Learning Environments (DLE) survey for the past several administrations. Additionally, we have taken a more granular approach to asking about students’ race/ethnicity; we now provide four categories for Asian students: East Asian (e.g., Chinese, Japanese, Korean, Taiwanese); Southeast Asian (e.g., Cambodian, Vietnamese, Hmong, Filipino); South Asian (e.g., Indian, Pakistani, Nepalese, Sri Lankan); and Other Asian. We also added a question asking respondents whether they identify as multiracial, similar to the DLE.

In looking at the past several years of data, we determined it was time to update the response options for the financial aid question. The past few administrations showed that a substantial share of students used more than $10,000 (formerly the highest option) in loans or in financial aid that did not need to be repaid (e.g., grants, scholarships) to cover educational expenses. The new response options let campuses continue to trend the data (by combining categories) while updating the dollar amounts to the following: None; $1-$2,999; $3,000-$5,999; $6,000-$9,999; $10,000-$14,999; and $15,000+.
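To illustrate how combining categories preserves the trend, here is a minimal sketch in Python – not HERI’s code – that collapses the new response options into a coarser scheme whose top band matches the former “more than $10,000” option. It assumes the lower bands line up across instruments; the function name is hypothetical.

```python
# Minimal sketch: collapse the 2014-15 CSS financial aid categories
# into a coarser scheme whose top band matches the old highest option,
# so the item can still be trended across administrations.
# (Hypothetical helper, not HERI's code.)

NEW_TO_TREND = {
    "None": "None",
    "$1-$2,999": "$1-$2,999",
    "$3,000-$5,999": "$3,000-$5,999",
    "$6,000-$9,999": "$6,000-$9,999",
    # The two new top bands collapse into the old "more than $10,000" band.
    "$10,000-$14,999": "$10,000+",
    "$15,000+": "$10,000+",
}

def to_trend_category(response: str) -> str:
    """Map a 2014-15 response option onto a trendable category."""
    return NEW_TO_TREND[response]

print(to_trend_category("$15,000+"))  # -> "$10,000+"
```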

Another important change focuses on the self-rated ability items – those pertaining to general knowledge, critical thinking skills, knowledge of a particular field or discipline, etc. Rather than asking students to rate their abilities in these areas (nationally, we saw 85% of students rating themselves as strong/a major strength across the board), the survey now asks respondents whether they believe their undergraduate experience contributed to these abilities. Specifically, the question reads: “Please indicate your agreement with each of the following statements. This institution has contributed to my: Knowledge of a particular field or discipline; Interpersonal skills; Foreign language ability; Critical thinking skills, etc.” The response options are on an Agree Strongly to Disagree Strongly scale.

At a time when greater focus is being given to students’ experiences with research during their undergraduate careers, we have added a question about the amount of time students spent doing research in college. The question reads: “How many months since entering college (including summer) did you work on a professor’s research project?” Response options range from 0 months to 25+ months. Ongoing research at HERI continues to point to the value of undergraduate research in facilitating students’ transition to graduate school – particularly among students in science, technology, engineering, and mathematics (STEM) majors.

We also learned that students found our new way of asking about future plans confusing, and we have streamlined this question. Rather than asking about primary and secondary activities, we now ask students (yes/no) about their intentions to work full-time, work part-time, attend graduate school, volunteer, etc., in the fall following completion of their undergraduate degree.

One last note that we hope everyone can appreciate: The survey is 10% shorter. We eliminated redundancies and took to heart feedback from campus representatives regarding items they found to be least helpful in their assessment efforts. We hope these changes make the CSS more inclusive, more useful, and more timely for our campuses. Registration is now open; administration begins November 14 and continues through June 26, 2015.

Marketing Campus Surveys: Do More than Email!

Posted by Kevin Eagan on October 22nd, 2014 in News, News Homepage, Surveys

When conducting surveys on campus, researchers must give careful consideration to how they market and brand the survey. Given the trend in the past decade (or more!) of over-surveying students, faculty, and staff, finding a way to make your study stand out can be a challenge.

Our September Poll of the Month asked respondents how they go about marketing surveys on campus. In all, we had 57 respondents to the poll. Personalized emails were by far the most popular way of reaching out to students, faculty, and staff to alert them to a survey: 75% of respondents used this method. Personalized emails at minimum include a unique survey link to ‘track’ respondents as well as an introduction that addresses the respondent by name. These types of communication send the message that the individual will not continue to be bombarded with requests for this particular survey if s/he responds now or opts out of the panel.

Personalized emails also offer an opportunity for the person overseeing the administration to appeal in specific ways to the potential respondent by connecting the instrument to his or her major/department, club/group, or course. All of the CIRP surveys offer campuses the option of contacting potential respondents through HERI-managed emails – this service tracks respondents and those who opt out and removes them from the panel.
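For administrators rolling their own outreach rather than using the HERI-managed service, the sketch below shows the basic personalization-plus-tracking idea: address each recipient by name and embed a unique token in the survey link so responses can be matched to invitations and later reminders suppressed. The URL, file, and column names are all hypothetical.

```python
# Minimal sketch of personalized survey invitations with unique,
# trackable links -- not HERI's managed-email service.
import csv
import uuid

SURVEY_URL = "https://survey.example.edu/css"  # hypothetical base link

def build_invitation(first_name: str, token: str) -> str:
    """Compose a personalized invitation carrying a unique survey link."""
    return (
        f"Dear {first_name},\n\n"
        "Please tell us about your experiences in this year's survey:\n"
        f"{SURVEY_URL}?token={token}\n"
    )

with open("panel.csv", newline="") as f:
    for row in csv.DictReader(f):  # expects columns: first_name, email
        token = uuid.uuid4().hex   # unique per respondent for tracking
        print(build_invitation(row["first_name"], token))
        # Send via your campus mailer (e.g., smtplib) and log the token
        # so respondents and opt-outs can be removed from reminder lists.
```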

Nearly as many respondents (70%) reported using emails sent to listservs. This form of outreach may be an easier and broader strategy to connect with the targeted audience, but it comes across as less personal and can limit researchers’ ability to track individuals who have responded. Thus, sending email blasts to listservs increases the risk of survey (and email) fatigue, particularly among individuals who have already responded.

Nearly half of respondents to the September poll (46%) use announcements in class to advertise surveys. This kind of outreach can add a personal touch to requests to participate in surveys that a faceless email cannot provide. Additionally, class announcements can be particularly effective when targeting a specific group of students. For example, using first-year seminar courses or introductory English courses might be a good strategy to use for a survey like CIRP’s Your First College Year survey, which focuses on experiences and outcomes of first-year college students. Relatedly, about a quarter (23%) of respondents to the poll reported making announcements at meetings, which help add that personal touch when surveying faculty or staff.

More than one-third of respondents (37%) rely upon flyers around campus, 19% post ads in the campus newspaper, and one in nine (11%) advertise surveys on billboards or marquees around campus. Through the annual administration report form (ARF) that we send at the close of every survey, we have learned of some campuses advertising their CIRP surveys on buses – that’s one surefire way to make sure word gets around!

Others reach out to potential respondents through residence hall staff (9%) or signal to students and faculty that the campus uses the results by highlighting findings through infographics (9%). We occasionally hear of campuses incentivizing resident assistants (RAs) with pizza parties awarded to the floor or area with the highest response rate. Additionally, showcasing the findings from previous surveys through a medium such as infographics signals to potential respondents that their input matters and is seen and used by the institution.

When trying to conduct a campus-wide survey, having an effective marketing strategy will go a long way in promoting greater interest and response among targeted participants. So, for your next survey administration, consider marketing the study beyond just email – post some flyers, partner with campus housing, or print out some infographics to show your target audience just how much their input matters.

It is travel season! Meet staff at conferences around the country and learn about CIRP and other current HERI research.

Posted by Ellen Stolzenberg on October 15th, 2014 in News, News Homepage, Uncategorized

While football season is well under way, it is time for conference season to begin!

Part of our mission is to produce and disseminate original research. We also help colleges and universities use both our research findings and their own institutional data to foster institutional understanding and improvement. We will be presenting and exhibiting at several national and regional conferences over the next few months. If you are attending any of these conferences, please come by the booth or our presentation(s) to say hi. Please contact me if you would like to set up a specific time to meet (stolzenberg@gseis.ucla.edu).

IUPUI Assessment Institute
Indianapolis, IN
October 19-21, 2014
Calculating Persistence: Using Student Demographic and Experiential Backgrounds to Estimate First-Year Retention (Monday, October 20th, 4:30-5:30 PM)

AACRAO Strategic Enrollment Management Conference
Los Angeles, CA
October 26-28, 2014
Estimating First-Year Retention: Tools for Enrollment Management (Tuesday, October 28th, 2:15 PM)

California Association for Institutional Research (CAIR)
San Diego, CA
November 19-21, 2014
Conference Sponsor (Come meet Dominique and Ellen at the booth.)
From Administration to Z-Scores: An Overview of Survey Research (Friday, November 21st, 10:00-10:45 AM)

Association for the Study of Higher Education (ASHE)
Washington, DC
November 20-22, 2014
Stay tuned for presentation information!

Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) Annual Meeting
Nashville, TN
December 6-9, 2014
Exhibitor (Ellen will be at the booth)

Assessing the Pervasiveness of Sexual Assault on College Campuses: The Importance of Data

Posted by Kevin Eagan on October 13th, 2014 in News, News Homepage, Research, Surveys

The past year has brought intense scrutiny of colleges and universities over the issue of sexual assault on campus, particularly since the White House issued a report last January that included an alarming statistic: one in five women are sexually assaulted while in college. As of August, the U.S. Department of Education is investigating more than 75 colleges and universities for their handling of sexual assault allegations.

This attention has prompted draft legislation and newly enacted laws to combat the reported prevalence of sexual assault on college campuses. On September 28, California Governor Jerry Brown signed a bill requiring public colleges and universities in the state to revise their rape-prevention policies to include an “affirmative consent” standard, which requires both parties engaging in sexual activity to verbally consent – passive (silent) assent is unacceptable under the new statute. In September, the Obama administration, supported by the NCAA and a number of media companies, unveiled the “It’s On Us” campaign aimed at combating sexual violence on college campuses.

New York Senator Kirsten Gillibrand and Missouri Senator Claire McCaskill continue to lead efforts on Capitol Hill to enact legislation designed to promote greater institutional accountability for the handling of sexual assault allegations on campus. Their draft bill would require annual anonymous surveys of all college students to provide a more complete picture of the prevalence of sexual violence on campus. The Campus SaVE (Sexual Violence Elimination) Act, passed in 2013, went into effect last week; one of its provisions requires campuses to maintain greater transparency and accountability in the reporting and handling of sexual assault allegations. Additionally, the new law includes language pertaining to same-sex sexual assault – an important step forward from previous versions of the law.

The increased media attention and new regulations raining down on institutions from local, state, and federal policymakers necessitate that colleges and universities better understand the prevalence of the issue on their campuses. Many instances of sexual assault go unreported, and it is imperative that institutional leaders improve their understanding of the broader climate on their campuses.

Surveys are a key tool for exploring a number of aspects of climate on college campuses. For the past five years, the Higher Education Research Institute has provided such a tool in the Diverse Learning Environments (DLE) survey. In response to the growing national conversation about sexual assault on college campuses, we have added a set of questions to the DLE that ask respondents about their experiences with unwanted sexual contact since entering their current institution. Students who report having had unwanted sexual contact see a short set of questions pertaining to the incident(s) – whether the perpetrator used physical force, whether the survivor was incapacitated when the incident happened, and whether (and to whom) the survivor has reported the incident.

Our mission at the Higher Education Research Institute has always consisted of two parts: “to inform educational policy and promote institutional improvement through an increased understanding of higher education and its impact on college students.” These important changes to the 2014-15 DLE survey accomplish both parts of this mission. First, we aim to provide colleges and universities with actionable information about the prevalence of sexual assault on campus. Some institutions may not be quite ready to see this information, but they would be better served to squarely address this issue before additional regulations (or sanctions) are imposed on them by policymakers.

Second, we aim to contribute to the important conversation about sexual assault on college campuses. The widely cited statistic that 20% of women experience sexual assault while in college is based on a study of two public universities; thus, although the government and media have latched on to this figure, we really do not know how representative this statistic is across institutional types, geographic regions, or student characteristics (indeed, some media have been skeptical of these numbers). The point of the Cooperative Institutional Research Program, a national longitudinal study of the American higher education system, is to aggregate data across institutions to provide, if not national, at least multi-institutional perspectives about the college experience. We hope this important addition to a survey that already focuses on other campus climate issues can further advance our objective.

"As Reauthorization Turns": Data and the Reauthorization of the Higher Education Act

Posted by Lesley McBain on September 23rd, 2014 in News, News Homepage

The process of reauthorizing the Higher Education Act (HEA) has once again begun in Washington, DC. While this may seem remote from survey administration and analysis, current draft bills circulating to reauthorize HEA pertain to campus-level student data in highly specific ways.

The Republican and Democratic approaches to reauthorizing HEA differ: the Republicans have thus far offered a white paper on their priorities and three separate bills on different aspects of HEA. The white paper focuses on accountability and informed decision-making by “the consumer” and calls for reforming federal education data collection to achieve this goal: “For example, information collected by IPEDS must be improved and the delivery of information streamlined to reduce confusion….The IPEDS data collection must be updated so it captures more than first-time, full-time students” (pp. 2-3).

The bill most pertinent to institutional and survey researchers is H.R. 4983, introduced by Representative Virginia Foxx (R-NC). As of July 24, the bill has passed the House of Representatives and been referred to the Senate. H.R. 4983 eliminates certain data elements from the federal College Navigator website that were signatures of the last HEA reauthorization—the college affordability and transparency lists (aka the “watch list”), the state higher education spending charts, and the multiyear tuition calculator—but also revamps College Navigator in many ways. These include adding links to Bureau of Labor Statistics (BLS) data on both regional and national starting salaries “in all major occupations” and links to “more detailed and disaggregated information” on institutions’ “faculty and student enrollment, completion rates, costs, and use and repayment of financial aid” (Congressional Research Service [CRS] bill summary). Detailed minimum requirements would also be imposed on the net-price calculator currently in operation, including distinguishing veterans education benefits from other financial aid and providing links to information on those benefits if the calculator does not estimate their eligibility (CRS bill summary).

The contrasting Democratic approach has been Senator Tom Harkin (D-IA)’s release of a draft omnibus bill reauthorizing all of HEA. Specific sections readers may find most interesting—though reading the entire 785-page bill is useful for contextualizing data issues—are Title I, Secs. 108, 109, and 113, as well as Title IX, Sec. 902. These sections relate to College Navigator and the College Scorecard (Title I, Secs. 108 & 109), a new complaint resolution and tracking system (Title I, Sec. 113), and a new data center tracking data on students with disabilities (Title IX, Sec. 902). In addition, a joint letter from the American Council on Education and 20 other higher education associations suggests that a unit-record database provision will be reinserted into this draft.

The Democratic draft bill adds data to College Navigator such as “the number of returning faculty (by full-time and part-time status, tenure status, and contract length)” (Sec. 108). In addition, it significantly expands the College Scorecard to include items such as average net price broken out by enrolled students’ family income, completion percentages for all undergraduate certificate- or degree-seeking students, both term-to-term and year-to-year persistence percentages for those students, and the percentage of students who transfer from two-year to four-year institutions within 100% and 150% of normal time; comparisons to other institutions are required for all of these data points. Institutions would be required to make their most recent College Scorecard publicly available on their websites and distribute it to prospective and accepted students, regardless of whether the information was requested, “in a manner that allows for the student or the family of the student to take such information into account before applying or enrolling” (Sec. 133(i)(2)(B), p. 51).

The new complaint resolution and tracking system (Title I, Sec. 113) would create a new federal complaint system—and an office within the Department of Education—to track and respond to complaints from students, family members of students, third parties acting on behalf of students, and staff members or employees of institutions against higher education institutions receiving funds authorized under HEA. Complaints are defined as “complaints or inquiries regarding the educational practices and services, and recruiting and marketing practices, of all postsecondary educational institutions” (Sec. 161(b)(1), p. 60). Institutions would be required to respond to the Secretary of Education within 60 days, including what steps they have taken to respond, all responses the complainant has received, and any additional actions taken or planned (Sec. 161(c)(2), pp. 60-62). The Department of Education would publish the number and type of complaints and inquiries received as well as information about their resolution; it would also be required to submit a report to authorizing committees on the type and number of complaints, the extent to which they have been resolved, patterns of complaint in any given sector of postsecondary institutions, legislative recommendations from the Secretary to “better assist students and families,” and the schools with the highest complaint volume as determined by the Secretary (Sec. 161(4)(d)(A-E), pp. 65-66).

The creation of the National Data Center on Higher Education and Disability (Title IX, Sec. 902) would require institutions participating in Title IV financial aid programs to submit information on their programs and services for students with disabilities, including “individual-level, de-identified data describing services and accommodations provided to students with disabilities, as well as the retention and graduation rates of students with disabilities who sought disability services and accommodations from the institution of higher education” (Sec. 903(b)(6), p. 611). Institutions “shall collect, organize, and submit such data in a way that supports disaggregation” (Sec. 903(c), p. 611) by 13 specified disability categories (Sec. 903(a), p. 610). These data will be made available to the public.

As always, the road to HEA reauthorization is long and winding. While Senator Harkin recently mentioned trying to move forward on HEA during the lame duck session of Congress, it is by no means certain. The issues of data collection raised by draft provisions on both the Republican and Democratic sides, however, are crucial both in research and practice. What data are most useful to parents, students, and other higher education stakeholders who need data to better serve their student and institutional populations? Are those even the same data points? What purpose—or how many purposes—should federal postsecondary education data collection serve? How much data collection is too much data collection? What privacy issues are raised by proposed HEA provisions? Is federal legislation even the place to answer these questions?

The best advice we can offer is to stay tuned and stay informed. Researchers and survey administrators and analysts not only have a critical stake in the outcome but can also provide an informed perspective to the various parties involved in reauthorization.

August Poll of the Month: Using Incentives to Increase Response Rates

Posted by Kevin Eagan on September 11th, 2014 in News, News Homepage

Our August poll of the month asked about ways in which researchers incentivize survey participation. Survey research has demonstrated that some incentive is better than no incentive, and other studies have investigated how well certain types of incentives work.

The August poll had 83 respondents, and 60% of those who answered the poll indicated that they utilized a guaranteed incentive to increase response rates. These guaranteed incentives might be small gift cards ($5, $10), cash, or even candy.

We had one campus this past spring that used candy bars as an incentive to encourage participation in the College Senior Survey and the Your First College Year survey, and response rates for both administrations exceeded 67%! It just goes to show that providing incentives does not have to be expensive if the right marketing approach is implemented. Other campuses promise additional tickets to graduation for their senior surveys, which can be a particularly effective approach for venues with quite limited capacity.

Raffles represented the second most popular incentive used by respondents, with 40% indicating that promising entry to a raffle or drawing was one strategy they used to increase response rates. With raffles, the trick is to find the right prize that fits within the survey administration budget and piques participants’ interest. Popular raffle prizes we see our campuses using include parking passes, iPads, and movie tickets. One strategy for survey administrators considering a raffle as an incentive is to raffle a prize every week – the earlier potential participants complete the survey, the more weekly drawings they are entered into and the better their odds of winning a prize.

More than a third (36%) of respondents to the poll reported using coupons/discounts for campus or local vendors, and this approach represents an opportunity to create partnerships on and off campus. Local businesses are often looking for ways to attract students, faculty, and staff to their establishments, and this could be an effective win-win strategy for these businesses and survey administrators. Likewise, surveys often have multiple stakeholders on campus, and leveraging other units’ interest in the data into tangible incentives can help create buy-in while increasing response rates.

One in ten respondents use food as an incentive, and we often see this approach used in residence halls. Resident Assistants (RAs) might host a survey pizza party, or survey researchers might host a lunch for faculty and ask them to complete the survey. Such an approach can both increase response rates and create a sense of community on campus.

We like to have fun with our monthly polls, but, to be clear, researchers have spent a great deal of time and money on understanding the effectiveness of incentives at increasing response rates. For those interested, Sanchez-Fernandez et al. (2010) provide a nice overview of studies that have focused on guaranteed incentives (both pre and post), and in their own analyses they find that providing a guaranteed incentive significantly increased response rates. Porter (2004) outlines a more holistic approach to conducting an accurate, efficient, and high-quality survey.

We hope you’ll take a couple of minutes to complete September’s poll of the month, which asks about survey marketing strategies. Check it out on our main page: www.heri.ucla.edu.

Putting Today's USNWR Rankings into Context

Posted by Kevin Eagan on September 9th, 2014 in News, News Homepage

Undoubtedly, a number of presidents, provosts, directors of admissions, and others at elite colleges and universities are either breathing sighs of relief or wringing their hands this morning with the release of the 30th edition of the U.S. News and World Report Best Colleges rankings. As has been the case since the 2011-2012 academic year, Princeton University holds the top spot among national universities, with Williams College atop the list of national liberal arts colleges.

For all of the hype that the media give to the rankings, a little perspective is in order. The top 20 institutions on the university list collectively enroll slightly more than 150,000 students, which is less than 1.5% of the more than 10.5 million undergraduate students enrolled in four-year degree-granting institutions nationwide. And the lists published today make no mention of the more than 1,500 community colleges currently serving more than 7 million undergraduates nationwide. Indeed, these two-year institutions are educating the most racially and economically diverse cross-sections of college students.

Data from the CIRP Freshman Survey would suggest that the attention given to the rankings each year (and, yes, perhaps even writing this blog contributes to the hype) is unwarranted, as national rankings appear to matter little in students’ college choice process. When we break out the data by selectivity, we find that rankings in national magazines matter only for students at the most selective campuses. Just under a quarter (24%) of students at the most selective campuses indicated that rankings in national magazines were a “very important” factor in their decision to enroll at their current institution. By contrast, only 10% and 11% of students attending institutions categorized as “low selectivity” or “medium selectivity,” respectively, rated rankings as a “very important” factor in their decision process.

Instead, cost and financial aid are increasingly being considered as top factors in students’ college choice process, as we reported in the 2013 Freshman Survey monograph.

Given the lack of importance students place upon rankings in deciding where to attend college and the amount of time institutional research officers put into filling out the annual USNWR survey, what’s the point? Perhaps the rankings help those at the top attract greater numbers of donors and larger gifts, which in turn facilitate their ability to rest atop these lists. But the rankings also encourage poor behavior (read: cheating and lying) among some colleges and universities – what an example to set for students!

A perennial critique of the USNWR rankings is that the lists pay too little attention to outcomes, although the methodology does include measures of first-year retention and six-year graduation rates. And this year, the magazine added student loan default rates and campus crime, though the magazine’s editor downplayed the latter statistic and suggested that readers should not place much stock in campus crime rates.

The Obama administration is proposing a rating system of its own that focuses more on outcomes, and the details of this proposal are expected to be released later this fall. HERI Director Sylvia Hurtado joined a panel, hosted by UCLA’s Civil Rights Project, on Capitol Hill last week to advocate for fairness in the measures used. Sylvia presented CIRP data highlighting the importance of using input-adjusted measures to account for differences in student preparation and campus resources when measuring outcomes like retention and degree completion. When we look at the distribution of campuses that do better than expected given the students they serve and the resources at their disposal, we tend to see a very different ranked list.
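To make the idea of input-adjusted measures concrete, here is a minimal sketch – not HERI’s actual models – of ranking campuses by actual-minus-expected retention, where expected retention comes from a simple student-level logistic regression. The file and column names (retention.csv, hs_gpa, first_gen, expenditures_per_fte) are hypothetical.

```python
# Minimal sketch of an input-adjusted retention ranking: predict each
# student's probability of first-year retention from entering
# characteristics and campus resources, then rank campuses by how far
# actual retention exceeds the model's expectation.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("retention.csv")  # one row per student

# Model retention (0/1) as a function of student inputs and resources.
model = smf.logit(
    "retained ~ hs_gpa + first_gen + expenditures_per_fte",
    data=students,
).fit()
students["expected"] = model.predict(students)

# Aggregate to the campus level and compare actual to expected rates.
campus = students.groupby("institution").agg(
    actual=("retained", "mean"),
    expected=("expected", "mean"),
)
campus["performance"] = campus["actual"] - campus["expected"]

# Campuses doing better than expected, given the students they serve.
print(campus.sort_values("performance", ascending=False).head(10))
```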

CIRP Surveys and Accreditation: SACS Guide Updated for 2014

Posted by Ellen Stolzenberg on September 2nd, 2014 in News, News Homepage

Accreditation remains a driving force behind survey use on college campuses. Because CIRP surveys are comprehensive and designed to be longitudinal, they cover a wide variety of outcomes related to student growth and development. CIRP survey data can be used throughout the accreditation process—to engage in institutional self-study, to inform a visit by evaluators, and to respond to a decision handed down by a regional accreditor. The CIRP Accreditation Guides are designed to facilitate using the surveys in the accreditation process.

Each institution approaches accreditation differently—taking into account its mission, goals, programs, policies, and the composition of its faculty and student bodies. One shared element, however, is the need to understand how the practices and data already available on campus align with accreditation standards. The CIRP Accreditation Guides demonstrate how items from all five CIRP surveys (TFS, YFCY, DLE, CSS, and FAC) connect to the standards of each of the regional accrediting bodies. For institutions considering future survey participation, sample timelines help decide when and how often to gather evidence for use in accreditation.

We are in the process of updating all of the guides to correspond to the 2014 CIRP survey instruments and the updated regional accreditation standards. The Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) has updated its accreditation standards, and the 2014 CIRP Accreditation Guide for SACS reflects those changes in relation to the 2014 survey instruments.

The WSCUC (formerly WASC) guide was updated in May 2014 and the guides for the other accrediting bodies will be updated in the coming months. Previous versions of these guides are available now.

Higher Learning Commission-North Central Association (HLC-NCA)
Middle States Commission on Higher Education (MSCHE)
New England Association of Schools and Colleges (NEASC)
Northwest Commission on Colleges and Universities (NWCCU)
WASC Senior College and University Commission (WSCUC) (Updated May 2014)

Please visit our accreditation webpage, where we highlight examples of how institutions have used CIRP surveys in their accreditation efforts.

The Road from Serres: A Feminist Odyssey by Helen Stavridou Astin

Posted by Ellen Stolzenberg on August 27th, 2014 in News, News Homepage

“You have to go there. The Astins are there!”

This was how I first became aware of Lena and Sandy Astin. I volunteered as an orientation counselor for incoming students at the start of my senior year at Tulane University in 1996. When I expressed an interest in working with college students, the director of the orientation program suggested I look into the emerging field of student affairs. When I told her of a new MA program at UCLA, she said “You have to go there. The Astins are there.”

She was right. I had the opportunity to take a student development class with Lena during the 1997-98 school year. I was immediately captivated, simultaneously drawn in by her warmth and intimidated by her intellect and presence. The intimidation quickly gave way to admiration and respect. I learned so much, both in and out of the classroom. To be perfectly honest, it was the lowest grade I received in graduate school, but the class filled me with confidence and the desire to go beyond the master’s degree.

In The Road from Serres: A Feminist Odyssey, Lena (to those who know her) tells her incredible story: from the tumultuous nature of growing up in German-occupied Greece during World War II and the leap of faith she took in pursuing a scholarship to complete her undergraduate degree in the United States to her pioneering doctoral studies at the University of Maryland and groundbreaking career.

It was at the University of Maryland that she met a fellow graduate student, Alexander (Sandy) Astin, her husband of 58 years. Lena shares the challenges and triumphs of both her career and her family, as a proud mom of two boys and grandmother to three beautiful granddaughters. Moving the family and their research to Los Angeles (and UCLA) in the 1970s marks a definite turning point. Her legacy and connection to CIRP and UCLA continue through her roles as co-founder of the Cooperative Institutional Research Program (CIRP) and now Senior Scholar and Distinguished Professor Emerita of Education.

If you know Lena, this memoir reads like an audio book. You can hear her telling all of these stories, some of which she readily shared with her students. Even if you haven’t met her, her spirit is evident in this memoir. Her family is first and foremost in her life. The common thread throughout the book and the essence of Lena as a mentor, professor, and scholar is love. She is the epitome of strength, determination and caring, qualities that weren’t (and often still aren’t) necessarily viewed as assets in academia.

When I returned for the PhD in Higher Education and Organizational Change in 2001, Lena and Sandy were beginning to retire. (Though, knowing them, they will never completely retire.) During the 2 ½ years I worked at HERI during my doctoral program, Lena was always willing to give advice or lend an ear, whether it was about CIRP, my dissertation, or most importantly, my family.

I’ve never shared this with her, but the way she speaks of, and interacts with, her family reminds me very much of my grandmother, who passed away the first week of my doctoral program. One of my favorite Lena moments occurred at the CIRP 40th Anniversary, at which Lena and Sandy were honored by world-renowned scholars in higher education from UCLA and elsewhere. I happened to be sitting behind Lena when her young granddaughters ran in. Her face lit up and the fact that dozens of higher education leaders were there to honor her was irrelevant because her little girls were there. I will never forget that look of pure joy.

The Road from Serres: A Feminist Odyssey provides great insight into Lena’s past and present. She is honest and unfiltered. My admiration continues.

I returned to HERI as the Assistant Director for CIRP nearly six months ago. Lena was one of the first to offer her congratulations.