2014 College Senior Survey (CSS) Research Brief and Infographic Release

Posted by Lesley McBain on December 15th, 2014 in News, News Homepage, Surveys

The 2014 College Senior Survey (CSS) research brief and infographic present findings from a nationally administered survey on students’ college experiences, future goals, and career plans; this iteration was administered to over 13,000 graduating students at 74 colleges and universities. The 2014 CSS brief focuses on how institutions support students’ personal and professional development as well as academic and intellectual growth.

Results highlight slightly improving job prospects for graduating students. Among 2014 college graduates planning to work full time, 41.4% had already secured an employment offer compared to 36.7% of those who graduated with the class of 2009. For students who had participated in an internship program during college, a higher percentage (48.1%) had already secured an employment offer.

Additionally, career services offices connect students with important information, resources, and support that may aid their job search. Among graduating seniors who rated their institution’s career services office, nearly 60% were satisfied with its resources and support. However, there were noticeable differences by major: approximately 64% of business and engineering majors indicated satisfaction with career resources and support, compared to approximately half of English and Fine Arts majors.

But graduating seniors seek more than job search support; a growing percentage also need personal support and counseling. In 2014, approximately one-third of seniors had sought personal counseling in the past year, with significantly higher rates among those also reporting feeling depressed. Such responses reveal students’ need for personal and emotional support well beyond their introduction to college life.

The class of 2014 entered college with noticeably lower self-rated emotional health levels compared to previous entering classes. Among students who completed both the 2010 CIRP Freshman Survey and the 2014 CSS, those who began college reporting lower emotional health indicated lower levels of satisfaction with their overall college experience, felt less valued at the institution, and reported lower levels of belonging as college seniors.

Data not included in the CSS brief but available in the national dataset cover many other aspects of students’ college life. For example, 41% of students attended a racial/cultural awareness workshop while 48% played intramural sports since entering college. In addition, the complete dataset reveals differences between female and male students. For instance, 59% of females reported holding leadership positions in college compared to 62% of their male peers. When asked whether they had challenged a professor’s ideas in class, 6.5% of female students reported doing so frequently compared to 12.0% of male students; 48.4% of female students reported not doing so at all compared to 33.1% of male students. However, 73.5% of female students and 70.0% of male students reported frequently contributing to class discussions. Institutional data such as these provide campuses with greater insights into students’ involvement, satisfaction, and success.

For more results from the 2014 CSS, check out the research brief. (You can also see a full set of the national results here.) And don’t forget to register for the 2015 CSS to better understand the college experiences and future plans of graduating seniors on your campus! The CSS can be administered on your campus through June 26, 2015.

YFCY Brief with Results from the 2014 Survey is Released!

Posted by Jen Berdan Lozano on December 2nd, 2014 in News, News Homepage, Surveys

Findings from the 2014 Your First College Year (YFCY) Survey are discussed in our latest research brief; they are also highlighted in the recently released 2014 YFCY Infographic. The survey was administered last spring to over 10,000 students at the end of their first year of college. Among the results are first-year students’ struggles with adjusting to college, communication with faculty, engagement with active and collaborative learning, civic awareness and engagement, and their early preparation for life after graduation.

The YFCY Survey measures multiple aspects of college adjustment, and the results showed that time management was the most common hurdle for first-year students. Almost half (47%) of all students responded that they found managing their time either “very” or “somewhat” difficult; Latina/o students were the most likely to struggle, with 57% reporting difficulty managing their time. Most students (86%) stated that they have approached their professors for advice after class, and relatedly, understanding what their professors expect from them academically was the least of their troubles.

Mirroring our recent HERI Faculty Survey results, in which roughly half of faculty reported using online discussion boards in their classes, just over half (54%) of first-year students stated that they have posted on a course-related online discussion board. In-person dialogue continues to be the norm, however, with almost all students reportedly engaging in in-class discussions (96%) and talking about their course materials with other students outside of class (96%).

By linking the 2014 YFCY responses to the 2013 CIRP Freshman Survey responses we can measure students’ growth and change during their first year of college. These results highlight gains in students’ critical thinking and problem-solving skills. For example, over half of the students who entered college reporting their critical thinking and problem-solving skills as “average” now rate them as “somewhat strong” or a “major strength.” Continued development of these skills throughout college is important as both graduates and employers consider them essential on the job market.
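The linking described above can be illustrated with a minimal sketch: match end-of-first-year (YFCY) records to the same students’ entering (Freshman Survey) records by a shared ID, then tabulate how a self-rating changed. All variable names and sample records here are illustrative, not the actual CIRP data structure.

```python
# Sketch of longitudinal linking: entering self-ratings (freshman survey)
# joined to end-of-first-year self-ratings (YFCY) by student ID.
# Sample data are invented for illustration.

tfs = {"s1": "average", "s2": "average", "s3": "somewhat strong"}            # at entry
yfcy = {"s1": "somewhat strong", "s2": "major strength", "s3": "major strength"}  # end of year 1

# Students who entered rating the skill "average" and also took the YFCY
entered_average = [sid for sid, rating in tfs.items()
                   if rating == "average" and sid in yfcy]

# Of those, who now rates the skill as strong?
improved = [sid for sid in entered_average
            if yfcy[sid] in ("somewhat strong", "major strength")]

share = len(improved) / len(entered_average)
print(f"{share:.0%} of 'average' entrants now rate the skill as strong")
```

Only students present in both datasets enter the denominator, which is why matched-sample percentages can differ from single-survey national figures.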

Other results from the 2014 YFCY Survey not discussed in the brief include first-year students’ satisfaction with campus services and community. Overall, first-year students report being satisfied with their college experience. More than three quarters (78%) are “satisfied” or “very satisfied” with their overall academic experience. Specifically, students expressed satisfaction with their general education and core curriculum classes (72%), class size (74%), quality of instruction (75%), and academic advising (67%). Additionally, almost two-thirds (65%) were satisfied with the sense of community among students on campus. And with financial aid continuing to play a major role for many college students, the YFCY results showed that almost half of students (44%) utilized financial aid advising on campus. However, among these students, less than two-thirds (62%) reported being “satisfied” or “very satisfied” with their experience in the financial aid office on their campus, evidencing a greater need for support for students with financial needs on campus.

For more results from the 2014 YFCY, check out the research brief. (You can also see a full set of the national results here.) And don’t forget to register for the 2015 YFCY Survey to understand the experiences and adjustment of first-year students on your campus! The YFCY can be administered on your campus between March 1st and June 12th, 2015.

CIRP-Related Presentations at ASHE

Posted by Kevin Eagan on November 20th, 2014 in News, News Homepage, Research, Surveys

If you’re attending the annual conference of the Association for the Study of Higher Education (ASHE) in Washington DC over the next few days, you might want to check out one or more of the following sessions that highlight findings from CIRP data:

  • Linda Sax and her team present work funded by the National Science Foundation that focuses on gender issues in science, technology, engineering, and mathematics (STEM). This particular presentation highlights the changing salience of gender in understanding students’ interest in pursuing engineering majors using four decades (1971-2011) of CIRP Freshman Survey data. (Thursday, November 20, 12:45 p.m. – 2 p.m., Georgetown East).

  • HERI Director Sylvia Hurtado, CIRP Director Kevin Eagan, and HERI Graduate Student Researcher Bryce Hughes report findings from a study focused on introductory STEM classrooms. Using data collected from 79 classrooms at 15 different campuses, the study examines the student and faculty factors that contribute to a greater sense of a competitive dynamic among students in introductory STEM courses. (Thursday, November 20, 2:15-3 p.m., Lincoln East).

  • Christos Korgan – a current graduate student in UCLA’s higher education and organizational change (HEOC) program – and Nathan Durdella, an alumnus of the HEOC program, highlight findings from the 2010-11 HERI Faculty Survey. Their study examines the connection between student-faculty interaction and commitment to developing habits of mind for lifelong learning among part-time faculty. (Thursday, November 20, 2:15-3 p.m., Jefferson West).

  • UCLA HEOC alumna Marcia Fuentes presents her dissertation work that analyzed CIRP Freshman Survey and College Senior Survey data. The study focuses on issues pertaining to campus climate and connects those concerns with the broad policy discussions pertaining to affirmative action. (Thursday, November 20, 3:45-5 p.m., Monroe).

  • Nick Bowman joins HEOC alumnae Julie Park and Nida Denson in a presentation that uses data from the CIRP Freshman Survey, the College Senior Survey, and a post-college survey administered by CIRP in 2004, six years after college, to understand college graduates’ civic outcomes. (Friday, November 21, 1:15 p.m.-2:30 p.m., Georgetown East).

This flyer lists several other studies being presented at ASHE that analyze CIRP data. And don’t forget to join HERI and the Graduate School of Education and Information Studies at the UCLA Reception – Friday, November 21 from 6:30 to 8:30 p.m. in the Morgan Room.

2015 Your First College Year Survey Registration is Open, Paper is Back, and the 2014 YFCY Infographic is Released!

Posted by Ellen Stolzenberg on November 6th, 2014 in News, News Homepage, Uncategorized

Registration for the 2015 Your First College Year Survey (YFCY) is officially open! You can register now for the YFCY here and administer the survey any time between March 1 and June 12, 2015. We are happy to report that the paper administration is back! Join us in documenting the importance of the first year of college while getting valuable information about your students. If you click the registration link, you will also notice the first phase (registration and ordering) of our new portal for the follow-up surveys. Campus representatives will use this portal to register for the survey, track survey responses, download preliminary data, and retrieve final reports.

I wanted to highlight a few important changes to the 2015 YFCY. After each survey administration, we ask campus representatives about items they’d like to see on future instruments and we incorporate this feedback into the survey redesign process whenever possible. For the first time in its history, the YFCY includes items related to sexual orientation and gender identity – using the same items that have been on the Diverse Learning Environments (DLE) Survey for the past several administrations. Additionally, we have provided a more granular approach to asking about students’ race/ethnicity; we now provide four categories for Asian students: East Asian (e.g., Chinese, Japanese, Korean, Taiwanese); Southeast Asian (e.g., Cambodian, Vietnamese, Hmong, Filipino); South Asian (e.g., Indian, Pakistani, Nepalese, Sri Lankan); and Other Asian. We also added a question asking respondents whether they identify as multiracial, similar to the DLE.

In addition, we decided to update the response options for the financial aid question to reflect the steady increase in college costs. In prior years, “$10,000 or more” was the highest option students could select for each of the different types of financial support (loans, family resources, etc.) used to cover educational expenses. The new response options provide campuses the opportunity to continue to trend the data (by combining different categories) while updating the total dollar amounts to reflect the following categories: None; $1-2,999; $3,000-5,999; $6,000-9,999; $10,000-$14,999; and $15,000+.
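The trend-preservation idea mentioned above can be sketched in a few lines: collapse the new, finer-grained response options back into the legacy buckets so multi-year comparisons remain valid. The category labels follow the post; the mapping table and function names are illustrative only.

```python
# Sketch: collapsing the 2015 financial aid response options into the
# legacy scheme (formerly capped at "$10,000 or more") so trend lines
# spanning old and new administrations stay comparable.

NEW_TO_LEGACY = {
    "None": "None",
    "$1-2,999": "Less than $10,000",
    "$3,000-5,999": "Less than $10,000",
    "$6,000-9,999": "Less than $10,000",
    "$10,000-$14,999": "$10,000 or more",
    "$15,000+": "$10,000 or more",
}

def to_legacy(responses):
    """Map a list of new-format responses onto the legacy categories."""
    return [NEW_TO_LEGACY[r] for r in responses]

print(to_legacy(["$3,000-5,999", "$15,000+", "None"]))
```

Because each new category nests cleanly inside exactly one legacy bucket, the recoding loses no information needed for the old trend series.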

Other modifications include changing the wording for the self-rated abilities question and response options to incorporate students’ views on the role the institution has played in their abilities. The question in previous years read: “Think about your current abilities and tell us how strong or weak you believe you are in each of the following areas” and it now reads “This institution has contributed to my…” The response options were updated to reflect students’ agreement or disagreement with this statement (Strongly Agree, Agree, Disagree, Strongly Disagree). Finally, in keeping with the Association of American Colleges & Universities Essential Learning Outcomes, “General Knowledge” within this question was updated to “Intellectual and practical skills (including inquiry and analysis, critical thinking, and information literacy).”

The 2014 YFCY “A Year of Change” infographic is now available! The poster highlights some of our findings from the 2014 YFCY. The infographic provides an understanding of the first-year mindset, including how the adjustment to college affects first-year students. For institutions participating in the YFCY survey, we also provide a customizable version of the infographic, allowing them to easily compare their own data to the national results.

*****
Is your institution currently posting findings from any of the CIRP surveys on your website? We’d love to see these pages! Please share your link by emailing me at stolzenberg@gseis.ucla.edu.

If you would like to share information on your website but aren’t exactly sure what to include, I’d be happy to help. You can reach me directly at 310.825.6991 or the email address above.

Same-Sex Marriage Support Nearly Universal Among Entering College Students

Posted by Kevin Eagan on October 27th, 2014 in Conferences, News, News Homepage

This blog is cross-posted at The Huffington Post:

The national landscape for marriage equality has changed considerably in the past month. On Oct. 6, the U.S. Supreme Court declined to hear appeals on five different cases challenging lower courts’ rulings that found same-sex marriage bans to be unconstitutional. The decision paved the way for same-sex marriage in five states immediately (Oklahoma, Virginia, Utah, Wisconsin, and Indiana). Just a few days later, Idaho and Nevada joined the growing number of states allowing same-sex marriage. On Oct. 17, same-sex marriage bans in Alaska and Arizona fell with Wyoming following suit just days later.

Ted Olson, one of the lawyers in the landmark “Proposition 8” Supreme Court decision (Hollingsworth v. Perry), declared today that the “point of no return” on gay marriage has now passed. Indeed, it seems clear that the U.S. Supreme Court decision is signaling to the lower courts that it will not take up the issue of same-sex marriage any time soon, particularly if the lower courts continue striking down state marriage bans for same-sex couples.

As these state bans continue to fall, the federal government has announced that it would immediately begin recognizing same-sex marriages in all 33 states where it is legal. This decision follows the U.S. Supreme Court decision on the Defense of Marriage Act (DOMA) in 2013 (United States v. Windsor), which held that denying benefits to married same-sex couples was unconstitutional.

It is hard to believe that Congress enacted DOMA less than two decades ago. Right after that law went into effect, the Cooperative Institutional Research Program (CIRP) Freshman Survey at UCLA’s Higher Education Research Institute began asking incoming freshmen their views on same-sex marriage. Since CIRP first started asking the question in 1997, a majority of incoming college students have agreed that same-sex couples have a legal right to marry; however, it is remarkable how strongly incoming students now endorse this position. The CIRP Freshman Survey last asked this question in 2012, and three-quarters of first-time, full-time students (75.1 percent) agreed that same-sex couples have a legal right to marry, and the data suggest that nearly all (91.1 percent) of students who identify as “liberal” or “far left” hold this view.

2014-10-27-SameSexMarriageSupport.png

Support of same-sex marriage among “conservative” and “far right” students has increased more than 20 percentage points since the question first appeared on the CIRP Freshman Survey. A near majority (46.4 percent) of students who identify their political ideology as “conservative” or “far right” now agree that same-sex couples should be allowed to legally marry.

The largest gains in support of same-sex marriage have been among incoming students who identify their political ideology as “middle-of-the-road.” In 1997, a bare majority (51.5 percent) believed same-sex couples should be permitted to marry. By 2008, more than two-thirds (67.7 percent) felt similarly, and that figure jumped another 10 percentage points by 2012, with 78.9 percent of “middle-of-the-road” students supporting same-sex marriage.

Today’s college students do not just support same-sex marriage; they also support allowing gay and lesbian couples to adopt. In 2013, 83.3 percent of all first-time, full-time college students agreed that gays and lesbians should have the legal right to adopt children.

Most individuals are more than mere single-issue voters, but given these numbers, it is interesting that some politicians continue to focus so heavily on social issues like same-sex marriage. The recent spate of court decisions in favor of same-sex marriage in the past two years, and particularly in the past four weeks, has caught up with public opinion. The political views of today’s college students increasingly suggest a growing divide with the “culture wars” being waged by social conservatives. Candidates running for political office who continue to emphasize social questions while doing everything in their power to impede progress on an issue such as gay marriage risk alienating this large bloc of potential voters.

The question regarding support of same-sex marriage appeared again on the 2014 CIRP Freshman Survey, and we expect to see even greater support for the issue. The 2015 Freshman Survey likely will be the last time the item appears, as the data make clear that support for same-sex marriage is nearly universal among entering college students.

2014-15 College Senior Survey Registration Is Open, and Paper is Back!

Posted by Kevin Eagan on October 24th, 2014 in News, News Homepage, Uncategorized

Earlier this week we opened registration for the 2014-15 College Senior Survey (CSS), and we are thrilled to have re-introduced the option for a paper administration of the survey. Campuses that wish to learn more about their December graduates can register now and begin administering the survey on November 14. If you click the registration link, you will also notice the first phase of our new registration portal for the follow-up surveys. Campus representatives can use this portal to register for the survey, track survey responses, download preliminary data, and retrieve final reports.

I wanted to highlight a few important changes to the 2014-15 CSS. In our 2013-14 CSS Administration Report Form, we asked campus representatives about items they’d like to see on future CSS instruments, and we listened. For the first time in its history, the CIRP CSS includes items related to sexual orientation and gender identity – using the same items that have been on the Diverse Learning Environments (DLE) survey for the past several administrations. Additionally, we have provided a more granular approach to asking about students’ race/ethnicity; we now provide four categories for Asian students: East Asian (e.g., Chinese, Japanese, Korean, Taiwanese); Southeast Asian (e.g., Cambodian, Vietnamese, Hmong, Filipino); South Asian (e.g., Indian, Pakistani, Nepalese, Sri Lankan); and Other Asian. We also added a question asking respondents whether they identify as multiracial, similar to the DLE.

In looking at the past several years of data, we determined it was time to update the response options available in the financial aid question. The past few administrations showed that a substantial chunk of students used more than $10,000 (formerly the highest option) worth of loans or financial aid that did not need to be repaid (e.g., grants, scholarships) to cover educational expenses. The new response options provide campuses the opportunity to continue to trend the data (by combining different categories) while updating the total dollar amounts to reflect the following categories: None; $1-2,999; $3,000-5,999; $6,000-9,999; $10,000-$14,999; and $15,000+.

Another important change focused on the self-rated ability items – those pertaining to general knowledge, critical thinking skills, knowledge of a particular field or discipline, etc. Rather than ask students to rate their abilities in these areas (nationally we saw 85% of students rating themselves as strong/a major strength across the board), respondents will now assess whether they believe their undergraduate experience contributed to these abilities. Specifically, the question reads: “Please indicate your agreement with each of the following statements. This institution has contributed to my: Knowledge of a particular field or discipline; Interpersonal skills; Foreign language ability; Critical thinking skills, etc.” The response options are on an Agree Strongly to Disagree Strongly scale.

At a time when greater focus is being given to students’ experience with research during their undergraduate careers, we have added a question about the amount of time students spent doing research in college. The question reads: “How many months since entering college (including summer) did you work on a professor’s research project?” Response options range from 0 months to 25+ months. Ongoing research at HERI continues to point to the value of undergraduate research in facilitating students’ transition to graduate school – particularly among students in science, technology, engineering, and mathematics (STEM) majors.

We also learned that students found our new way of asking about future plans to be confusing, and we have streamlined this question. Rather than asking about primary and secondary activities, we ask students (yes/no) about their intentions to work full-time, work part-time, attend graduate school, volunteer, etc., in the fall following their completion of their undergraduate degree.

One last note that we hope everyone can appreciate: The survey is 10% shorter. We eliminated redundancies and took to heart feedback from campus representatives regarding items they found to be least helpful in their assessment efforts. We hope these changes make the CSS more inclusive, more useful, and more timely for our campuses. Registration is now open; administration begins November 14 and continues through June 26, 2015.

Marketing Campus Surveys ‐ Do More than Email!

Posted by Kevin Eagan on October 22nd, 2014 in News, News Homepage, Surveys

When conducting surveys on campus, researchers must give careful consideration to how they market and brand the survey. Given the trend in the past decade (or more!) of over-surveying students, faculty, and staff, finding a way to make your study stand out can be a challenge.

Our September Poll of the Month asked respondents about how they go about marketing surveys on campus. In all, we had 57 respondents to the poll. By far personalized emails were the most popular form of reaching out to students, faculty, and staff to alert them to the survey: 75% of respondents used this method. Personalized emails at minimum include a unique survey link to ‘track’ respondents as well as an introduction that addresses the respondent by name. These types of communication send the message that the individual will not continue to be bombarded with requests for this particular survey if s/he responds now or opts out of the panel.

Personalized emails also offer an opportunity for the person overseeing the administration to appeal in specific ways to the potential respondent by connecting the instrument to their major/department, club/group, or course. All of the CIRP surveys offer campuses the option of contacting potential respondents through HERI-managed emails; this service tracks respondents and those who opt out and removes them from the panel.
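The mechanics of a personalized invitation can be sketched simply: each message carries a unique token in its survey link, so the administrator can tell who has responded and stop reminding them. The URL, token scheme, and message template below are purely illustrative and are not HERI's actual service.

```python
# Sketch of a personalized survey invitation with a trackable link.
# The domain, path, and parameter name are hypothetical examples.

import uuid

def build_invitation(name, email):
    """Return one invitation with a unique, per-respondent survey link."""
    token = uuid.uuid4().hex  # unique identifier embedded in the link
    link = f"https://survey.example.edu/yfcy?respondent={token}"
    body = (f"Dear {name},\n\n"
            f"Please share your first-year experience by completing "
            f"our survey: {link}\n")
    return {"to": email, "body": body, "token": token}

invite = build_invitation("Jordan", "jordan@example.edu")
print(invite["body"])
```

When a response arrives, looking up its token against the list of sent invitations identifies the respondent, so follow-up reminders go only to non-responders.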

Nearly as many respondents (70%) reported using emails sent to listservs. This form of outreach may be an easier and broader strategy to connect with the targeted audience, but it comes across as less personal and can limit researchers’ ability to track individuals who have responded. Thus, sending email blasts to listservs increases the risk of survey (and email) fatigue, particularly among individuals who have already responded.

Nearly half of respondents to the September poll (46%) use announcements in class to advertise surveys. This kind of outreach can add a personal touch to requests to participate in surveys that a faceless email cannot provide. Additionally, class announcements can be particularly effective when targeting a specific group of students. For example, using first-year seminar courses or introductory English courses might be a good strategy to use for a survey like CIRP’s Your First College Year survey, which focuses on experiences and outcomes of first-year college students. Relatedly, about a quarter (23%) of respondents to the poll reported making announcements at meetings, which help add that personal touch when surveying faculty or staff.

More than one-third of respondents (37%) rely upon flyers around campus, 19% post ads in the campus newspaper, and one in nine (11%) advertise surveys on billboards or marquees around campus. Through our annual administration report form (ARF), sent at the close of every survey, we have learned of some campuses advertising their CIRP surveys on buses – that’s one surefire way to make sure word gets around!

Others reach out to potential respondents through residence hall staff (9%) or signal to students and faculty that the campus uses the results by highlighting findings through infographics (9%). We occasionally hear of campuses incentivizing resident assistants (RAs) with pizza parties awarded to the floor or area with the highest response rate. Additionally, showcasing the findings from previous surveys through a medium such as infographics signals to potential respondents that their input matters and gets used and seen by the institution.

When trying to conduct a campus-wide survey, having an effective marketing strategy will go a long way in promoting greater interest and response among targeted participants. So, for your next survey administration, consider marketing the study beyond just email – post some flyers, partner with campus housing, or print out some infographics to show your target audience just how much their input matters.

It is travel season! Meet staff at conferences around the country and learn about CIRP and other current HERI research.

Posted by Ellen Stolzenberg on October 15th, 2014 in News, News Homepage, Uncategorized

While football season is well under way, it is time for conference season to begin!

Part of our mission is to produce and disseminate original research. We also help colleges and universities in using both our research findings and their own institutional data to foster institutional understanding and improvement. We will be presenting and exhibiting at several national and regional conferences over the next few months. If you are attending any of the conferences, please come by the booth or our presentation(s) to say hi. Please contact me if you would like to set up a specific time to meet (stolzenberg@gseis.ucla.edu).

IUPUI Assessment Institute
Indianapolis, IN
October 19-21, 2014
Calculating Persistence: Using Student Demographic and Experiential Backgrounds to Estimate First-Year Retention (Monday, October 20th, 4:30-5:30 PM)

AACRAO Strategic Enrollment Management Conference
Los Angeles, CA
October 26-28, 2014
Estimating First-Year Retention: Tools for Enrollment Management (Tuesday, October 28th, 2:15 PM)

California Association for Institutional Research (CAIR)
San Diego, CA
November 19-21, 2014
Conference Sponsor (Come meet Dominique and Ellen at the booth.)
From Administration to Z-Scores: An Overview of Survey Research (Friday, November 21st, 10:00-10:45 AM)

Association for the Study of Higher Education (ASHE)
Washington, DC
November 20-22, 2014
Stay tuned for presentation information!

Southern Association of Colleges and Schools Committee on Colleges (SACSCOC) Annual Meeting
Nashville, TN
December 6-9, 2014
Exhibitor (Ellen will be at the booth)

Assessing the Pervasiveness of Sexual Assault on College Campuses: The Importance of Data

Posted by Kevin Eagan on October 13th, 2014 in News, News Homepage, Research, Surveys

The past year has brought intense scrutiny of colleges and universities over the issue of sexual assault on campus, particularly since the White House issued a report last January that included an alarming statistic: one in five women is sexually assaulted while in college. As of August, the U.S. Department of Education is investigating more than 75 colleges and universities for their handling of sexual assault allegations.

This attention has prompted draft legislation and newly enacted laws to combat the reported prevalence of sexual assault on college campuses. On September 28, California Governor Jerry Brown signed a bill requiring public colleges and universities in the state to revise their rape-prevention policies to include an “affirmative consent” standard, which requires both parties engaging in sexual activity to verbally consent – passive (silent) assent is unacceptable under the new statute. In September, the Obama administration, supported by the NCAA and a number of media companies, unveiled the “It’s On Us” campaign aimed at combating sexual violence on college campuses.

New York Senator Kirsten Gillibrand and Missouri Senator Claire McCaskill continue to lead efforts on Capitol Hill to enact legislation designed to promote greater institutional accountability for handling of sexual assault allegations on campus. The draft bill would require annual anonymous surveys of all college students to provide a more complete picture of the prevalence of sexual violence on campus. The Campus SaVE (Sexual Violence Elimination) Act, passed in 2013, went into effect last week; one of the provisions of this new law requires campuses to maintain greater transparency and accountability about the reporting and handling of sexual assault allegations. Additionally, the new law includes language pertaining to same-sex sexual assault – an important step forward from previous versions of this law.

The increased media attention and new regulations raining down on institutions from local, state, and federal policymakers necessitate that colleges and universities better understand the prevalence of the issue on their campuses. Many instances of sexual assault go unreported, and it is imperative that institutional leaders improve their understanding of the broader climate on their campus.

Surveys are a key tool to explore a number of aspects of climate on college campuses. For the past five years, the Higher Education Research Institute has provided such a tool in the Diverse Learning Environments (DLE) survey. In response to the growing national conversation about sexual assault on college campuses, we have added a set of questions to the DLE that ask respondents about their experiences with unwanted sexual contact since entering their current institution. Students who report having had unwanted sexual contact see a short set of questions pertaining to the incident(s) – whether the perpetrator used physical force, whether the survivor was incapacitated when the incident happened, and whether or to whom the survivor has reported the incident.

Our mission at the Higher Education Research Institute has always consisted of two parts: "to inform educational policy and promote institutional improvement through an increased understanding of higher education and its impact on college students." These important changes to the 2014-15 DLE survey serve both parts of this mission. First, we aim to provide colleges and universities with actionable information about the prevalence of sexual assault on campus. Some institutions may not be quite ready to see this information, but they would be better served to squarely address this issue before additional regulations (or sanctions) are imposed on them by policymakers.

Second, we aim to contribute to the important conversation about sexual assault on college campuses. The widely cited statistic that 20% of women experience sexual assault while in college is based on a study of two public universities; thus, although the government and media have latched on to this figure, we really do not know how representative this statistic is across institutional types, geographic regions, or student characteristics (indeed, some media have been skeptical of these numbers). The point of the Cooperative Institutional Research Program, a national longitudinal study of the American higher education system, is to aggregate data across institutions to provide, if not national, at least multi-institutional perspectives about the college experience. We hope this important addition to a survey that already focuses on other campus climate issues can further advance our objective.

"As Reauthorization Turns": Data and the Reauthorization of the Higher Education Act

Posted by Lesley McBain on September 23rd, 2014 in News, News Homepage | Comments Off

The process of reauthorizing the Higher Education Act (HEA) has once again begun in Washington, DC. While this may seem remote from survey administration and analysis, current draft bills circulating to reauthorize HEA pertain to campus-level student data in highly specific ways.

The Republican and Democratic approaches to reauthorizing HEA differ: the Republicans have offered thus far a white paper on their priorities and three separate bills on different aspects of HEA. The white paper focuses on accountability and informed decision-making by “the consumer” and calls for reforming federal education data collection to achieve this goal: “For example, information collected by IPEDS must be improved and the delivery of information streamlined to reduce confusion….The IPEDS data collection must be updated so it captures more than first-time, full-time students” (pp. 2-3).

The bill most pertinent to institutional and survey researchers is H.R. 4983, introduced by Representative Virginia Foxx (R-NC). As of July 24, the bill has passed the House of Representatives and been referred to the Senate. H.R. 4983 eliminates certain data elements from the federal College Navigator website that were hallmarks of the last HEA reauthorization—the college affordability and transparency lists (aka "watch list"), the state higher education spending charts, and the multiyear tuition calculator—but also revamps College Navigator in many ways. These include adding links to Bureau of Labor Statistics (BLS) data on both regional and national starting salaries "in all major occupations" and links to "more detailed and disaggregated information" on institutions' "faculty and student enrollment, completion rates, costs, and use and repayment of financial aid" (Congressional Research Service [CRS] bill summary). Detailed minimum requirements would also be imposed on the net-price calculator currently in operation, including distinguishing veterans education benefits from other financial aid and providing links to information on those benefits if the calculator does not estimate their eligibility (CRS bill summary).

The contrasting Democratic approach has been Senator Tom Harkin (D-IA)'s release of a draft omnibus bill reauthorizing all of HEA. Specific sections readers may find most interesting—though reading the entire 785-page bill is useful for contextualizing data issues—are Title I, Secs. 108, 109, and 113, as well as Title IX, Sec. 902. These sections relate to College Navigator and the College Scorecard (Title I, Secs. 108 & 109), a new complaint resolution and tracking system (Title I, Sec. 113), and a new data center tracking data on students with disabilities (Title IX, Sec. 902). In addition, a joint letter from the American Council on Education and 20 other higher education associations suggests that a unit-record database provision will be reinserted into this draft.

The Democratic draft bill adds data to College Navigator such as "the number of returning faculty (by full-time and part-time status, tenure status, and contract length)" (Sec. 108). In addition, it significantly expands the College Scorecard to include items such as average net price broken out by enrolled students' family income; completion percentages for all undergraduate certificate- or degree-seeking students; both term-to-term and year-to-year persistence percentages for those students; and the percentage of students who transfer from two-year to four-year institutions within 100% and 150% of normal time. Comparisons to other institutions are required for all of these data points. Institutions would be required to make their most recent College Scorecard publicly available on their websites and to distribute it to prospective and accepted students, regardless of whether the information was requested, "in a manner that allows for the student or the family of the student to take such information into account before applying or enrolling" (Sec. 133(i)(2)(B), p. 51).

The new complaint resolution and tracking system (Title I, Sec. 113) would create a new federal complaint system—and office within the Department of Education—to track and respond to complaints from students, family members of students, third parties acting on behalf of students, and staff members or employees of institutions against higher education institutions receiving funds authorized under HEA. Complaints are defined as "complaints or inquiries regarding the educational practices and services, and recruiting and marketing practices, of all postsecondary educational institutions" (Sec. 161(b)(1), p. 60). Institutions would be required to respond to the Secretary of Education within 60 days, including what steps they have taken to respond, all responses received by the complainant, and any additional actions taken or planned in response (Sec. 161(c)(2), pp. 60-62). The Department of Education would publish the number and type of complaints and inquiries received as well as information about their resolution; it would also be required to submit a report to authorizing committees on the type and number of complaints, to what extent they have been resolved, patterns of complaint in any given sector of postsecondary institutions, legislative recommendations from the Secretary to "better assist students and families," and the schools with the highest complaint volume as determined by the Secretary (Sec. 161(4)(d)(A-E), pp. 65-66).

The creation of the National Data Center on Higher Education and Disability (Title IX, Sec. 902) would require institutions participating in Title IV financial aid programs to submit information on their programs and services for students with disabilities, including "individual-level, de-identified data describing services and accommodations provided to students with disabilities, as well as the retention and graduation rates of students with disabilities who sought disability services and accommodations from the institution of higher education" (Sec. 903(b)(6), p. 611); institutions "shall collect, organize, and submit such data in a way that supports disaggregation" (Sec. 903(c), p. 611) by 13 specified disability categories (Sec. 903(a), p. 610). These data would be made available to the public.

As always, the road to HEA reauthorization is long and winding. While Senator Harkin recently mentioned trying to move forward on HEA during the lame-duck session of Congress, passage is by no means certain. The data collection issues raised by draft provisions on both the Republican and Democratic sides, however, are crucial to both research and practice. What data are most useful to parents, students, and other higher education stakeholders who need data to better serve their student and institutional populations? Are those even the same data points? What purpose—or how many purposes—should federal postsecondary education data collection serve? How much data collection is too much data collection? What privacy issues are raised by proposed HEA provisions? Is federal legislation even the place to answer these questions?

The best advice we can offer is to stay tuned and stay informed. Researchers and survey administrators/analysts not only have a critical stake in the outcome but can also provide an informed perspective to the various parties involved in reauthorization.