NSS research and developments

23 Sept 2020

The Office for Students (OfS) details its arrangements for a review of the NSS; it will be a two-stage internal review process. The first stage will address the concerns raised in the DfE, BEIS and NIHR policy document and will report later this year. The second stage will look more widely at the role of the NSS, including which questions should be asked to support regulation and student information across all four countries of the UK.

The contract for the 2021 NSS has already been awarded. The OfS decided that, pending the outcome of the review, the perspectives of students would continue to be required to inform its regulatory work. This is particularly the case for next year, given the impact of the coronavirus pandemic. It is not possible to pilot and test any changes to the NSS that might result from the review, as would be needed to ensure its integrity and statistical validity, in time to capture the experience of students during 2020-21.

However, given the significance of the review, the board agreed on a number of measures to amend the 2021 NSS. First, any decision on what to publish from the 2021 NSS, and at what level, should await the outcome of the review, so that the published results are aligned with the new direction of travel it sets. The board also agreed that the burden on providers should be reduced for the 2021 NSS by no longer requiring them to promote the survey internally to their students.

10 Sept 2020

The DfE, BEIS and NIHR published a policy document, “Reducing bureaucratic burden in research, innovation and higher education”, which includes a call for a radical, root-and-branch review of the NSS to be completed by the OfS by the end of 2020.

Feb 2020

As part of NSS 2020, the OfS is running the first stage of a pilot looking at the feasibility of expanding the NSS to students on one-year (full-time equivalent) courses. These students have previously been excluded from the survey population. This forms part of a wider programme of exploratory work designed to assess the feasibility of including students across all years of study in the survey.

2018

The Office for Students has evaluated NSS 2017 to see how the new survey worked in practice and whether there were any issues; see the report detailing the findings from the student and institutional online surveys. A wider evaluation of the new NSS is planned for autumn 2018, when institutions will be able to assess the impact of the changes made since 2017.

2017

In 2017 a new National Student Survey (NSS) questionnaire was introduced, following a two-year research and testing programme. This was followed by changes to the optional banks of questions for 2018.

September 2016

HEFCE announces substantial changes to the NSS for 2017, based on the outcomes of the funding bodies’ Review of Information consultation (www.hefce.ac.uk/pubs/Year/2016/201615/) and testing programme. These are the first major changes to the survey since its introduction in 2005. They include nine new questions on student engagement, updated questions on assessment and feedback and on learning resources, the removal of the personal development questions from the core survey and their transfer to the optional question banks, and the removal of two duplicative questions to ensure the survey remains short.

November 2015

The funding bodies conducted a review of information about learning and teaching and the student experience, which included proposals for changes to the NSS for 2017. The consultation closed in early December, and analysis of the consultation responses will be published in spring 2016.

July 2014

HEFCE published two reports:

May 2011

Overall satisfaction results to include sector-adjusted benchmarks (this link also takes you to the benchmarked results for subsequent years)

In future, sector-adjusted benchmarks will be released alongside the raw NSS results, with significance indicators. Since the inception of the NSS there has been concern that comparing institutions without taking account of the mix of students and subjects at each institution is misleading. In her initial analysis of the NSS, Paula Surridge said: “Models for all students have shown that there are important differences between student groups, which suggest that comparisons based purely on the aggregate scores, both between institutions and between courses within institutions may be misleading if adjustments are not made for student profiles and course characteristics”. More recently, the Institute of Education report for HEFCE stated in recommendation five that the NSS results cannot be used responsibly to “compare whole institutions without taking account of sources of variation such as subject mix and student characteristics”.

The benchmarks developed are based on Paula Surridge’s work, which highlighted a number of factors that have a consistent and material effect on responses to question 22 and are largely outside the institution’s control. The factors are:
• Subject
• Ethnicity
• Age
• Mode of study
• Sex
• Disability

Underpinning HEFCE’s approach to the recommendations of Surridge and the IoE are the following principles (a simple illustrative sketch of the benchmark calculation follows this list):
• The need to avoid simplistic comparisons of institutions that do not take into account subject mix and student characteristics
• The need to be consistent in the treatment of data at institutional level (i.e. to provide relevant benchmarks and significance indicators alongside the raw data)
• The importance of selecting relevant factors in the calculation of the benchmarks, i.e. those variables over which the institution has limited or no control
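This note does not set out how the benchmarks are calculated, but a sector-adjusted benchmark of this kind is commonly produced by indirect standardisation: the sector-wide result for each combination of the factors above is weighted by the provider’s own mix of students across those combinations. The Python sketch below illustrates that idea only; the data layout and column names (provider, agree_q22 and so on) are invented for the example and are not taken from any published NSS dataset.

```python
import pandas as pd

# Minimal sketch of an indirectly standardised (sector-adjusted) benchmark.
# Assumes a response-level DataFrame with one row per respondent and
# invented columns: "provider", the six benchmark factors, and "agree_q22"
# (1 if the respondent agreed with the overall satisfaction question, else 0).

FACTORS = ["subject", "ethnicity", "age", "mode_of_study", "sex", "disability"]

def sector_adjusted_benchmark(responses: pd.DataFrame, provider: str) -> float:
    """Benchmark for `provider` on Q22, adjusted for subject mix and
    student characteristics via sector-wide agreement rates."""
    # Sector-wide proportion agreeing with Q22 for each factor combination
    sector_rates = responses.groupby(FACTORS)["agree_q22"].mean()

    # The provider's own student counts for each factor combination
    own = responses[responses["provider"] == provider]
    weights = own.groupby(FACTORS).size()

    # Weighted average of the sector rates, using the provider's mix as weights
    aligned = sector_rates.reindex(weights.index)
    return float((aligned * weights).sum() / weights.sum())
```

Comparing a provider’s raw Q22 score against a benchmark constructed along these lines, together with a significance indicator, is the kind of presentation described above: a provider whose raw score exceeds its benchmark is doing better than the sector would be expected to do with the same mix of students and subjects.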

November 2010

Public information about higher education: Consultation on changes to information published by institutions

This is a joint consultation by HEFCE, Universities UK and GuildHE on proposals for: giving prospective students useful information about higher education courses; developing the National Student Survey; and improving the accessibility of the information that higher education institutions publish about their courses and which is used for quality assurance. The Education Committee will be coordinating a response to this consultation for submission in spring 2011.

August 2010

The Higher Education Public Information Steering Group (HEPISG) commissioned two pieces of research to review the efficiency, effectiveness and use of existing public information about higher education:

1. The first piece of research, 'Understanding the information needs of users of public information about higher education', was carried out by a team from Oakleigh Consulting Ltd and Staffordshire University. It considers what information is wanted and needed, the best modes of delivering it, who should provide it, and how it would support potential students in making their choice of where to study.

The research identifies the key pieces of information that prospective and current students deem important as relating primarily to course satisfaction, employability and costs. It found that when students searched for this information they could generally find it; however, only a minority actively searched for it. The report therefore recommends changing the way students are made aware of the information and raising its profile, particularly at schools and colleges.

2. The second study, by the Institute of Education, 'Enhancing and Developing the National Student Survey', considers how the National Student Survey (NSS) can be enhanced, including additional purposes for which the NSS should be used. The researchers conclude that the NSS should continue to support the purposes of providing information to assist student choice and of contributing to quality assurance and enhancement. They make suggestions for using the survey more effectively but recommend that the 'core' survey should not be lengthened.
It should be noted that recommendation 5 of this report states that the NSS results cannot be used responsibly to 'compare whole institutions without taking account of sources of variation such as subject mix and student characteristics'. In response to this recommendation, 2010 will be the last year that HEFCE presents the whole-institution results table. In future, we will publish data on Q22 alongside sector-adjusted benchmarks, which take into account the factors known to affect the results.

HEPISG is considering what changes need to be made in the provision and use of public information about higher education, in the light of the recommendations in these reports. The group, which includes members from the NUS and employer groups, will make recommendations to the boards of HEFCE and Universities UK and to the GuildHE Executive. A consultation will then be published jointly in late 2010, which will apply to higher education delivered in England and Northern Ireland only. However, Unistats and the NSS have UK-wide dimensions, and funding and representative bodies in Scotland and Wales are also considering the research results with interest.

July 2008

Paula Surridge, from the University of Bristol, was commissioned to undertake a full analysis of the 2005, 2006 and 2007 survey results; the resulting report, 'The National Student Survey 2005-2007: Findings and trends', was published in July 2008.