In early July a crack team of Student Movement staff attended the National Centre for Research Methods’ annual festival. All this week we’ll be carrying reflective articles on research in Students’ Unions from the team that attended.
Researching student living: mining the social networks: Sam Nichols, Education Researcher, Nottingham SU (email@example.com)
In this piece I want to talk about the ethical issues of social network data mining, and whether we as SUs can use social network data as a valid research tool without having to collect primary data. Firstly: ethics. Lisa Suguira's PhD research at the University of Southampton looks at how individuals behave when buying medicine online, and whether use of the web has, to some extent, led to a rise in criminal conduct on the individual's part. Part of her methodology was Virtual Ethnography – or, to put it more bluntly, mining the data people were putting out into the public sphere for research purposes. There are ethical issues here, as the private and the public become far more blurred: can there be an unwilling participant, and does the individual need to be informed that their data is being used?
The individual is already putting their views, values and behaviours into the public sphere, so arguably it is fair game to mine it: it is different from searching records, and recording conversations. We are at a time when people are living their lives more openly, and it is only correct that we can start to use some of that data to influence our practice: if we study this ethically, then it’s a great time to be a researcher.
With our own students living their lives more publicly, but outside of more traditional democratic processes, it is advisable that we start to use the data we have available more effectively. More often than not we treat this as an act of primary data collection: write a survey, organise a focus group or, if we're more flush with money, something more ambitious. However, our students are probably already articulating these views publicly elsewhere: if we start to mine the social networks, can we capture what our students think about their landlords, or how they are learning, more accurately than if we collect the data ourselves? Chances are that those views are more honest as well, so why not use them?
As I say, this is a conundrum: SUs want to be ethical, and we exist within an academic community that has ethical research at its heart, so it's only right that our methods are scrutinised by people more knowledgeable and experienced than ourselves. If we take the ethics into account, is there any reason why we can't start to look more creatively at the data that is out there? We may just save ourselves the expense of data collection (and plough our resources into analysis instead). We often talk about 'being where students are' in relation to providing our services, but is there any reason why the only part of our research that is online is normally hosted on SurveyMonkey?
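As a toy illustration of the kind of lightweight analysis this could enable, a few lines of Python can tag public posts against themes an SU cares about. The posts, themes and keywords below are invented for the example, not real student data or any particular platform's API:

```python
from collections import Counter

# Hypothetical sample of public posts (invented for illustration only).
posts = [
    "My landlord still hasn't returned my deposit",
    "Loving the new library opening hours",
    "Another week, another broken boiler - thanks landlord",
    "The lecture recordings really help my learning",
]

# Themes we might want to track, each with a few indicative keywords.
themes = {
    "housing": {"landlord", "deposit", "boiler", "rent"},
    "learning": {"lecture", "library", "learning", "seminar"},
}

def tag_posts(posts, themes):
    """Count how many posts mention at least one keyword per theme."""
    counts = Counter()
    for post in posts:
        words = set(post.lower().replace("'", " ").split())
        for theme, keywords in themes.items():
            if words & keywords:
                counts[theme] += 1
    return counts

print(tag_posts(posts, themes))
```

A real project would need proper data access, consent considerations of the kind discussed above, and far better text processing than keyword matching, but the point stands: once the data is collected, the analysis itself can be cheap.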
Imitation Games: Alan Roberts, Policy Development Manager, NUS (Alan.firstname.lastname@example.org)
“We’d like you to pretend to be a woman.”
You know the Turing test? The one where a person puts questions to a person and a computer and has to guess which one is the real person. The Imitation Game is essentially the Turing test for people. It tests the practice language of given groups of people; the research team from Cardiff University described it as a "test of interactional expertise".
To explain: one imitation game asked blind people to pretend to be sighted and sighted people to pretend to be blind. In each case the judge, who asks the questions, is from the community being tested – so a blind judge put questions to a blind person and to a sighted person pretending to be blind, and was asked to judge which one was telling the truth.
The first killer stat really caught me: 86% of blind people were successful in pretending to be sighted; 13% of sighted people were successful in pretending to be blind. The potential reason? Blind people are already immersed in the practice language of the sighted, whereas sighted people are not accustomed to the practice language of the blind. In other tests, the same was true of any group in the minority, or facing barriers, relative to the comparison group. For me this immediately spoke to the work that NUS did last year on invisible privilege.
Method aside, this research (currently focussing on perceptions of sexuality in the UK and Poland, and perceptions of race in South Africa) could do amazing things in terms of understanding cultural barriers between identities and in the blunt, but perfectly phrased words of a delegate “make people less homophobic.” On this latter point, they will hopefully test this assumption against Harriet’s scale of attitudes towards LGBT.
This alone was exciting, but next came the analysis of the data. Thus far the team have conducted over 12,000 games, with reams of participant-generated questions and answers, and they have analysed the questions that judges have been asking. The important thing about these questions is how much they reveal about the community in question: the questions asked set the parameters, the rules and the conditions of the community, commenting on identity, self-perception and values.
I have some keen views on how we could make use of this method: the experience of a home student, of a Management student and an Engineering student, of a Met student and of a research-intensive student, even of sabbatical officers and students!
This is a serious and creative research project and method, and this is my take-home topic for NUS and the student movement. They have an app ("Masquerade"), and just in case you were wondering: 78% of men in our group were able to pass as women, versus around 40% of women passing as men. As of now, I have no functioning hypothesis as to why this is the case. Maybe we should try it again?