In this piece Ben Vulliamy, CEO of York Uni SU, looks at the state of play with the NSS and the wider move to metrics in the sector.
Typical, isn’t it? Just as many unions had worked out how to use the NSS to their advantage, the game changed. As unions used the data to argue for funding growth and capital investment in buildings, and worked out which strapline to put on email headers, we were told that all bets were off. As the evidence grew of unions using NSS data to identify demographics with a less satisfying student experience, and lobbying their institutions to invest in (for example) disability support, the data set shifted. The NSS is changing, and we will need to change with it.
All of this coincides with the Teaching Excellence Framework and a belief in some quarters that the NSS is the critical tool by which government and universities will leverage increased fees. Cue much fervour on campuses about NSS boycotts, the Office for Students announcing the appointment of Michael “Mr Metrics” Barber, and rumours of significant increases in NSS turnout.
In amongst the chaos (or perhaps contributing to it), a group of unions set about trying to understand student views on the new NSS key lines of enquiry. The project wasn’t aimed at fuelling a well-established fire, nor were we naive enough to anticipate that we would establish a magical formula enabling us to max out on a newly worded question 26. The aim of the project was much more about understanding how we could make an impact on students’ academic experience, and even more importantly, establishing the link between ‘student satisfaction’, ‘student interest’ and ‘student voice’.
Across the pond
Late last year I spent ten days in the US, visiting eight different universities and talking with an eclectic range of people: student experience practitioners, students, academics, senior managers and a few more in between.
On my first night I was taken to the top floor of the offices of the president (vice chancellor) of the University of Massachusetts: a tower block in central Boston, next to the Boston Common and overlooking the beautiful cityscape. The 20-odd-storey building included no teaching space or student accommodation; it housed administration and management offices, with the president’s clubhouse on the top floor. Given that I visited in November, shortly after the US presidential election, it’s perhaps mere coincidence that I found myself thinking how similar the offices were, in look and feel, to Trump Tower.
This experience of lavishness was replicated at every visit I made thereafter. I went to the $180m new-build union at Lowell University. I was talked through the free pre-arrival programme available at Tufts University and visited their cultural houses, which offer accommodation, kitchens and facilities to a wide range of liberation groups. I saw the astonishing built environment of MIT, visited one of Harvard’s 70+ libraries, and went to the University of Boston’s students’ union to see its theatre, 650-cover cafeteria, student art studio and not one but two giant atriums. I turned on the local news in my hotel and heard first and foremost about the local university ice hockey team’s result. I met a staff member from Northeastern University who had just finished the student union art show, which showcased student art and sold it to raise over $1m in a week!
All of this comparable lavishness and excess was paid for by eye-watering fees. The bank-busting charges bore no connection to student satisfaction, only to what the market would bear. The people I met were commonly bemused by the notion of the UK using student satisfaction to influence league tables, recruitment and fees. They were confused about the role of course representatives. They found it most amusing that students’ unions have sabbatical officers (and even more so that those officers are trustees). It is noteworthy that, in the absence of an independent student voice, senior university management seemed much more naturally inclined to engage directly with students. I met one VC equivalent who lived in halls of residence and gave me examples of how doing so had led him to identify, and subsequently change, a range of (broadly domestic) factors that were inconvenient for students: installing additional water fountains, relocating a bus stop to somewhere more convenient, and adding TV sets with separate receivers so students could watch different media feeds at the same time.
My point is that dramatic changes to fees and a hyper-marketization of UK HE could make our current set-up look like a firmly closed shop. It is a real threat, and it can be achieved (if the key decision makers are so inclined) without any reference to student satisfaction, student voice or student interest. It is at least a plausible scenario that boycotting the systems designed to monitor student satisfaction could play right into the hands of a solution for increased fees that makes no attempt to recognise student satisfaction as a feature of UK higher education. To minimise the risks, it follows that a long-term NSS boycott based on its methodology or its links to fee increases (which may well be legitimate concerns) should probably be accompanied by students’ unions collaborating on an alternative vehicle to measure, monitor and articulate student satisfaction and the role of students’ unions. That vehicle would need credibility with the sector similar to what the NSS currently offers. That feels like a very tall order, and a lot harder than trying to engage in evolving the current NSS model.
Our research was developed, commissioned and run in under three months. While in many respects the journey was problematic and clumsy, the eventual result was that by Christmas we had 18 unions participating and over 17,000 students contributing to the data. It’s worth saying that this generated a bigger and more current data set testing the new NSS Q26 than HEFCE has at this stage. The use of an external research agency (Alterline) provides a credibility and independence that quality-assures the data in the eyes of constituents.
In many respects the headline findings of the research offer few surprises. Firstly, the new Q26 is likely to deliver lower scores than the old Q23 for many unions. The students least likely to score unions well under Q26 are exactly the same groups we know give lower scores under Q23 (distance learners, part-time students, mature students, nursing students, male students, third years and so on). Larger students’ unions (in buildings, finances, advice centre staffing and so on) still tend to score higher than smaller unions, although interestingly, it’s less of a critical factor than it was under Q23.
Dig deeper into the research and we find a layer of data that perhaps validates things we might previously have speculated to be the case. Students commonly turn to the staff and academics who support their course before their union, but they also appreciate their union’s advice function as an emergency ‘final port of call’. Students value their course representatives conceptually, but have only nominal levels of understanding about what they achieve and the impact they make. Students don’t (yet) have a clear definition of the phrase ‘academic interests’, and offer different interpretations of what it means when asked to define it without guidance.
Dig down another level and we start to discover some of the previously unknown. For example, higher levels of awareness of students’ union officers do not increase trust in, or the perceived value of, students’ unions as the vehicle for student voice. Some students dislike and are confused by the breadth of activity and services their students’ union offers. Where elected representatives are seen as self-serving, the union’s purpose is undermined. And respondents who had been a course rep were not significantly more likely to score the union well on Q26 (!).
Lessons from the research
In this piece I am not going to share the data or report on the key findings in detail. We want to avoid any suggestion that we are publishing and distributing material while the NSS is live in a way that might influence survey responses and break HEFCE rules. We will, however, publish a report in May, both for students’ unions and for a wider audience of interested sector agencies and stakeholders.
In the meantime, what we can share with unions are some things that I think the data suggests we need to consider locally:
- How do you prioritise campaigning and policy work on mental health, employability and the quality of teaching over and above everything else? Students told us overwhelmingly that these were the key issues.
- How do you communicate to students your institution-level wins, and also the representations you make (even when they don’t deliver an outcome or impact)? Students want us to represent their needs on, for example, accommodation costs, and if we don’t tell them just how hard we are trying to change the market, students assume we are either not articulating their need at all or are part of the barrier to solving the problem.
- We need to build relationships with departmental-level actors. Students trust them and talk to them, and if those staff don’t know the union, or worse, dislike it based on a previous experience, they definitely won’t signpost it and may even actively discourage students from engaging with us.
- We all have democracy and representation capacity, but how many of us have policy capacity? Increasingly our job will be to solve complex problems, not just oppose cuts or argue for more money.
- Students don’t like elections, particularly in the current political climate. They don’t like the process, and they see the narrative as negative, deceitful and self-serving. The elections run just as students are asked to complete the NSS. We either have to change the tone and tenor of elections if they are running simultaneously, or stop them dominating the second term at the expense of actual representation work.
- How can we define ‘academic interests’? Reframe and market the things you do in terms of how they contribute to the ‘academic experience’. Just looking at course reps won’t cut it: look broadly at wellbeing work, at the personal development and networks built through student opportunities, at representative work in the broadest sense, and at influence over outreach, retention and attainment.
Lessons as a sector
As well as things that need looking at locally, there are some issues that we should consider collectively as a sector.
- How do we ensure that student voice is a critical factor in putting student interests at the heart of the policy narrative nationally and institutionally?
- As it gets set up, how do we ensure that student voices organised by the student movement are heard by the new Office for Students (rather than it deciding it can identify and act on students’ interests independently of student voice)?
- In a decade likely to be dominated by metrics, how can we develop good-quality metrics to assess the other two charitable objects of students’ unions? Can we agree these and all embed them in our annual surveys, or within the NSS optional questions, for example? If we don’t find a way of measuring our core objects, they will not be recognised by our key funders and students.
- How do we review the NSS on an ongoing basis and use it to plan the next ten years of strategic development and growth for SUs? If the last six years saw us use NSS Q23 to build new buildings, expand and shape our marketing, and focus our trustees and sabbaticals, then how will we use Q26? Will it identify new ways to gather and use student insight? Will it revolutionise the way we provide advocacy and advice? Will it change the way we run our campaigns?
I don’t argue that the NSS is a perfect science, and I definitely don’t believe in the NSS as a justification for fee increases. I do believe that history shows us that good research, used by great representatives, can drive improvements to student satisfaction and secure investment. There’s an exciting year ahead, and I intend to spend it trying to understand the opportunities and challenges in the eyes of our students. I also hope that we can step up to the plate collectively and prove that university and government policy done in the ‘student interest’ needs to incorporate legitimate student voice. After all, as we all know locally, it’s not an either/or: when satisfaction metrics and elected reps combine, magic happens and the student experience gets better.