Our Sector: An NSS Masterclass

After delivering a number of individual sessions on the interpretation and use of the National Student Survey (NSS), Jon Berg and Daniel Palmer ran a joint session at SU2015. The presentation was based on Daniel’s ‘NSS Q23 Masterclass’, with additions from Jon’s work. The session was well attended, running both as a general introduction to the NSS and in a more discursive format in the CEO stream; this article summarises the key takeaways.

In approaching the NSS it’s important to remember that it’s not about chasing numbers; it’s about obtaining new knowledge about your students, your SU and the university. The NSS covers key aspects of learning, teaching and assessment at every HE institution across the UK. If the NSS didn’t exist, SUs would need to invent it: its independence, high participation rates and depth give it credibility. The accumulated data available to SUs at their institution can provoke further investigation, highlight good and bad performance, and help SUs find routes into understanding and improving the student experience.

Where does a good score come from?

Our first point is that a good SU satisfaction score on question 23 of the NSS will come from work on the learning, teaching and assessment (LTA) covered by questions 1 to 22. The average final-year student will be scoring the SU in the context of the previous 22 questions, i.e. a survey on learning, teaching and assessment rather than a student experience or lifestyle survey.

Analysing Scores

At our SU2015 session we focussed on question 23, looking at the changes in scoring over the question’s first three years and the factors that can improve an SU’s score on Q23. The average SU score was 68% in NSS 2014, whereas average institutional satisfaction was 86%. Most of the difference lies in the neutral 3s (neither satisfied nor dissatisfied), where the SU average is over 20%, three times the university average. Some of those students may be unaware of the SU, or lack personal experience and specific examples of the SU’s work in the context of the NSS. The 1s and 2s (definitely/mostly dissatisfied) are over 10% for UK SUs as a whole, whereas they average significantly lower for institutions. We suggest two courses of action:

  • Dissatisfaction over 10%? Review the NSS free-text responses about the SU, look at the courses, departments, sites and demographics where the dissatisfaction score is highest, and target remedial work there.
  • Neutral scores over 20%? Analyse scoring by course, department, site and demographics, and use this to plan student engagement work, including communication about the SU and face-to-face contact through inductions, lecture shouts and outreach visits like ‘SU on Tour’ (a sketch of this kind of segmentation follows below).
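
As a rough sketch of what that segmentation might look like in practice, the following Python groups Q23 responses by course and flags where dissatisfaction or neutral rates cross the thresholds above. It assumes an export with one row per respondent; the file name and column names (course, q23) are hypothetical, not part of any official NSS format.

```python
import pandas as pd

# Hypothetical export: one row per respondent, with a course code
# and a Q23 score on the 1-5 scale.
df = pd.read_csv("nss_responses.csv")

by_course = df.groupby("course")["q23"].agg(
    respondents="count",
    dissatisfied=lambda s: (s <= 2).mean(),  # share of 1s and 2s
    neutral=lambda s: (s == 3).mean(),       # share of 3s
    satisfied=lambda s: (s >= 4).mean(),     # share of 4s and 5s
)

# Flag courses against the thresholds discussed above.
remedial = by_course[by_course["dissatisfied"] > 0.10]
engagement = by_course[by_course["neutral"] > 0.20]

print("Target for remedial work:\n", remedial)
print("Target for engagement work:\n", engagement)
```

The same grouping can be repeated by department, site or demographic to build the full targeting list.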

Remember, some interventions will not work, but concentrating effort on 10% of courses will yield positive results in some or all of them. We would recommend that general SU promotion and outreach continue, but where possible these should be targeted at segmented groups. Our experience is that increased engagement lowers the 3s and increases the 5s, and shows the SU doing a good job of listening to students and acting on issues. In the masterclass session, Daniel shared an analysis of SU engagement with students from different academic Schools at Cardiff University and how it compares to NSS scores. In short, there is a statistical link between engagement in SU activities and NSS satisfaction (kind of obvious, really). Please read the relevant section in the PDF below for the methodology used; it may save your students’ union a lot of initial planning.
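
The PDF sets out the methodology Daniel actually used; purely as a minimal illustration of the kind of test involved, and assuming you have already derived a per-School engagement rate and Q23 satisfaction rate from your own membership and NSS data, a simple correlation check looks like this (the numbers below are placeholders, not Cardiff data):

```python
from scipy.stats import pearsonr

# Placeholder figures only: per-School SU engagement rate and Q23
# satisfaction rate, which would come from your own data.
engagement = [0.35, 0.50, 0.42, 0.61, 0.28, 0.55]
q23_satisfied = [0.62, 0.71, 0.66, 0.78, 0.58, 0.74]

r, p = pearsonr(engagement, q23_satisfied)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```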

Value for money

We also discussed value for money and floated a measure that SUs could use with funders: Cost per Satisfied Student. This was defined as the total recurrent grant funding divided by the number of satisfied students (student numbers × Q23 satisfaction rate).
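
A minimal sketch of the calculation, with placeholder figures (the grant and student numbers below are illustrative; only the 68% Q23 rate is the 2014 sector average quoted above):

```python
def cost_per_satisfied_student(grant: float, students: int, q23_rate: float) -> float:
    """Recurrent grant divided by the number of satisfied students,
    where satisfied students = student numbers x Q23 satisfaction rate."""
    return grant / (students * q23_rate)

# Illustrative: a £1.2m grant, 18,000 students and 68% Q23 satisfaction
# give 12,240 satisfied students, i.e. roughly £98 per satisfied student.
print(cost_per_satisfied_student(1_200_000, 18_000, 0.68))
```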

Summary: Your Questions

We ended with a summary of the questions and contributions from the floor:

  • Is location a factor? Yes. Studies by various SUs on SU location relative to learning and teaching buildings suggest that 3s increase with distance. Satellite campuses and sites bring challenges with resourcing and a lower SU profile; HE in FE and college franchise provision face similar issues.
  • Will losing services affect our NSS? Not necessarily. There are examples where consultation on closures and changes has helped, but there are cases where a sudden change or service withdrawal has hit scores.
  • Will question 23 be removed from the NSS? Best to ask NUS or HEFCE. We understand that question 23 could disappear from the compulsory questions after NSS 2016.
  • Our NSS score is not what we expected from our own research or from the Times Higher Education Student Experience Survey. Why? Research of this kind is cultural, i.e. it is done in a context, and an internal survey will be seen differently to an independent national survey. The response rate and how respondents are selected are also factors. Comparisons between league tables suggest that the highest-scoring SUs do well in both NSS Q23 and the THE SES; the remaining unions can appear in very different places in the two tables.
  • Should NSS question 23 be in the CEO’s objectives? It is for many of the leading SUs, but as part of a balanced set of measures of success and change priorities. We would recommend that it is, like any other KPI.
  • Our institution has commented that the SU is the lowest-scoring area in its NSS results; should we be concerned? Yes, but it is understandable from their point of view as they supply funding. We suggest starting a dialogue about the factors affecting the SU’s NSS score and agreeing a plan to work together to improve it.

We enjoyed the opportunity to discuss the NSS and Q23 with our colleagues and hope you will find the presentation helpful.

Jon and Daniel’s Masterclass PDF with slides, graphs, insights and more is available here.
