Warrell, H. (2015) Students under surveillance, Financial Times, July 24
Applications of learning analytics
This is a thoughtful article in the Financial Times about the pros and cons of using learning analytics, drawing on applications from the U.K. Open University, Dartmouth College in the USA, student monitoring service Skyfactor, and CourseSmart, a Silicon Valley start-up that gives universities a window into exactly how e-textbooks are being read.
The UK Open University is using learning analytics to identify students at risk as early as a week into a course.
An algorithm monitoring how much the new recruits have read of their online textbooks, and how keenly they have engaged with web learning forums, will cross-reference this information against data on each person’s socio-economic background. It will identify those likely to founder and pinpoint when they will start struggling. Throughout the course, the university will know how hard students are working by continuing to scrutinise their online reading habits and test scores.
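The article does not describe the OU's actual model, but a risk-flagging system of this general kind might combine engagement signals with a background-based prior into a single score. The following is an illustrative sketch only: all feature names, weights and thresholds are invented for the example.

```python
# Toy at-risk scorer: blends engagement signals (e-textbook reading,
# forum activity) with a socio-economic baseline risk. Every weight
# and threshold here is arbitrary and purely illustrative.

def risk_score(pages_read_pct, forum_posts, background_risk_prior):
    """Return a 0-1 risk score; higher means more likely to struggle.

    pages_read_pct: fraction of assigned e-textbook pages read (0-1)
    forum_posts: number of contributions to web learning forums
    background_risk_prior: baseline risk (0-1) from socio-economic data
    """
    # Engagement combines reading and forum activity; forum activity
    # is capped at 10 posts so it saturates rather than dominating.
    engagement = 0.7 * pages_read_pct + 0.3 * min(forum_posts / 10, 1.0)
    # Blend low engagement with the background prior (weights invented).
    return round(0.6 * (1 - engagement) + 0.4 * background_risk_prior, 3)

def flag_at_risk(students, threshold=0.5):
    """Return IDs of students whose risk score exceeds the threshold."""
    return [sid for sid, feats in students.items()
            if risk_score(*feats) > threshold]
```

In practice such weights would be fitted from historical completion data rather than hard-coded, but the sketch shows the basic shape: behavioural data and demographic priors reduced to a single number that triggers an intervention.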
The article also discusses Dartmouth College’s mobile phone app which:
tracks how long students spend working, socialising, exercising and sleeping. The information is used to understand how behaviour affects grades, and to tailor feedback on how students can improve their results.
The article also tries to get a handle on student attitudes to this form of monitoring or surveillance. Not surprisingly, students appear to be somewhat ambivalent about learning analytics and differ in their acceptance of being monitored.
Rationalisations
What was particularly interesting is the range of justifications given in this article for monitoring student behaviour through data analysis:
- the most obvious is to identify students at risk, so that appropriate interventions can be made. However, the article gave no examples of appropriate interventions, highlighting the fact that it is one thing to identify a problem and quite another to know what to do about it. For instance, we know from previous research that students from particular socio-economic or ethnic backgrounds are potentially more at risk than others. What does this mean, though, in terms of teaching and learning? If you know this is a challenge before students start studying, why wait for learning analytics to identify it as a problem?
- the next argument is the need to ensure that the high investment each student (or their parents) makes in higher education is not wasted by a failure to complete a program. Because of the high cost, fear of failure is increasing student stress. At Dartmouth, a third of the undergraduate student body saw mental health counsellors last year. However, the solution to that may not be better learning analytics, but finding ways to finance students that don’t lead to such stress in the first place;
- another rationale is to reduce the financial risk to an institution. The Chief Technology Officer at Skyfactor argues that with revenues from tuition fees of around $25,000+ per student per annum in the USA, avoiding student drop-out is a financial necessity for many U.S. institutions. However, surely there is a moral necessity as well to ensure that your students don't fail.
Making sense of learning analytics
The Open University has collected data on students since it started. In fact, McIntosh, Calder and Swift (1976) found that statistically, the best predictor of success was whether a student returned a questionnaire in the first week of a course, as this indicated their commitment. It still didn't tell you what to do about the students who didn't return the questionnaire. (In fact, the OU's solution at the time was not to count anyone as an enrolment until they had completed an assignment two weeks into the course – advice that MOOC proponents might pay attention to.)
As with so many technology developments, the issue is not so much the technology but how the technology is used, and for what purposes. Conscientious instructors have always tried to track or monitor the progress of individual students and learning analytics merely provides a more quantitative and measurable way of tracking progress. The issue though is whether the data you can track and measure can offer solutions when students do run into trouble.
My fear is that learning analytics will replace the qualitative assessment that an instructor gets from, for instance, participating in a live student discussion, monitoring an online discussion forum, or marking assignments. Such qualitative assessment is more likely to identify the actual conceptual or learning problems that students are having, and more likely to give the instructor clues about what needs to be done to address them. Indeed, in a discussion the instructor may be able to deal with a problem on the spot rather than wait for the data analysis. Whether a student chooses to study late at night, for instance, or reads only part of a textbook, may correlate weakly with poorer performance, but recommending that students not stay up late, or read the whole textbook, may not be the appropriate response for any individual student and, more importantly, may well fail to identify key problems with the teaching or learning.
Who gets to use the data?
Which brings me to my last point. Ruth Tudor, president of the Open University’s Students’ Association, reported that:
when the data analytics programme was first mooted, participants were “naturally” anxious about the university selling the information it collected to a third party.
The OU has given strong assurances that it will not do this, but there is growing concern that as higher education institutions come to rely more on direct funding and less government support, they will be tempted to raise revenues by selling data to third parties such as advertisers. As Andrew Keen has argued, this is a particular concern about MOOCs, which rely on other means than direct fees for financial support.
Thus it is incumbent on institutions using learning analytics to have very strong and well-enforced policies about student privacy and the use of student data. The problem, though, is that this can easily lead to instructors being denied access to the very data that is of most value in identifying student learning difficulties and possible solutions. Finding the right balance, or applying common sense, is not going to be easy in this area.
Reference
McIntosh, N., Calder, J. and Swift, B. (1976) A Degree of Difference, New York: Praeger