There’s no escaping the robots; resistance is futile. ChatGPT has been a gathering storm since the back end of last year, and Sunderland cannot escape the pull. However, we need to learn more about this and related technology in order to provide a thoughtful and measured response to it for our staff and students. To that end, I signed up for this session drawing together senior academics from across UK HE to share thoughts and experience. I have a few more such sessions coming in the next few weeks, so I’ll wait and share my thoughts in a dedicated post when I have the time and space to synthesise them.
And lo! November 2022 did bring forth the first, proper, ALT North East User Group since The Before Times. Though we did have a catch-up meeting in January to check-in and talk about how the pandemic has affected us all.
I was unable to make any of the management meetings to help organise and set the agenda, and so was duly punished by being put up first to give my now almost-routine talk about how our pilot year with Studiosity has gone.
Next up was Newcastle University and how they have rolled out digital assessment. Interestingly, they made a decision not to implement any kind of online proctoring software over the pandemic, a decision I very much support. They have been using, and are now scaling up, the use of Inspera for in-person exams. This was chosen over others for its ability to save local copies of exams – which it does every 6 seconds – as a contingency against network outage, and which in extreme cases can be retrieved from the computer as an encrypted file and uploaded on the students’ behalf. They are using a bring-your-own-device model, with power supply available for around 10% of the exam room capacity, and a laptop loan scheme available for 5%, which have been sufficient to cover them. For improved convenience, they are now looking at providing portable power banks rather than running extension cables around the room.
Next, my old muckers from Northumbria talked about their digital literacy scheme which sees TEL colleagues mentoring staff on digital technologies, and an expanded IT Place which now features TEL as well as IT staff, supported by a range of asynchronous content with certificates for staff who complete set courses. They are looking at digital badges to replace / complement this moving forwards.
After lunch, Durham talked about their experience of dual-mode teaching, including the use of Owl telepresence devices, as featured in the pic above which I gratuitously pinched from their website (please don’t sue, I have no money). It was an interesting experience, with mixed results. A conclusion from the learning technologies team was that they were great for meetings and small rooms, but the mics and cameras weren’t up to the job in larger teaching spaces. That didn’t stop their IT department from purchasing them en masse and kitting out every room though! Ah, classic IT.
Finally, we ended with a roundtable discussion on the use of student data. Again, Newcastle I feel are ahead of the curve here in banning the use of predictive analytics outright. Durham talked about their experience of the Blackboard feature which allows automated messages to be sent to students based on performance – they turned it off. They felt it was problematic for student motivation as the messages didn’t provide sufficient (any?) contextual information for students.
They’re experimenting with the format a little, as this week saw three five-minute talks from different departments on the subject of peer assessment. One theme that came out of all the sessions was the need to use anonymised marking for student confidence in the fairness of the process. I was particularly interested in Robert Gillanders’ experience of using negative marking as a motivator – for every 3% that students deviated from the mean in their peer marking, their own grade was reduced by 1%. I’m very curious about how this worked in practice and how ethical considerations were handled, and Gillanders has published a paper on this which I’m going to have to read.
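As described, the negative-marking rule is simple arithmetic. Here’s a minimal sketch of how I understand it – note this is my own hypothetical reading (function name, rounding behaviour and the floor at zero are all my assumptions, not from the talk or the paper):

```python
def peer_marking_penalty(peer_mark: float, class_mean: float, own_grade: float) -> float:
    """Hypothetical sketch of the negative-marking rule as I understood it:
    for every full 3 percentage points a student's peer mark deviates from
    the class mean, their own grade is reduced by 1 percentage point.
    Rounding down and flooring at zero are my assumptions."""
    deviation = abs(peer_mark - class_mean)
    penalty = deviation // 3  # 1% off per full 3% of deviation (assumed)
    return max(own_grade - penalty, 0)
```

So a student who marked a peer at 70% when the class mean was 61% (a 9-point deviation) would lose 3 points from their own grade. Whether partial deviations counted, or the penalty was capped, is exactly the sort of detail I’d want from the paper.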
The second session was on learner analytics from Cormac Quigley, who talked about how they have taken data from multiple sources, only one of which was Moodle, and combined them in Microsoft Power BI to produce a comprehensive learning analytics system, with the data and reports made available to staff via Teams. However, they also talked about the basic reporting functionality of Moodle, and how you can combine grade book functionality with progress bars to create effective reports for staff and students.
The full Moodle Munch archive is available online here.
This was Bob Harrison’s inaugural lecture as a Visiting Professor at the University of Wolverhampton. Bob has been in education for over 50 years, and I have known his name in Ed Tech circles for a long time.
His talk was on the dangers of over-emphasising the power of technology as a solution to the problem of online and distance education, and the need to continually relearn the lessons that successful learning, no matter what physical distance may be involved, needs to be driven by the learner as the active agent in the learning process, supported by well-designed content delivered by caring and competent teachers. And if I’ve mangled Bob’s thesis in this summary, you can read it more eloquently in his own words in this article, Why there is nothing remote about online learning, published last year. And for an example of how you can’t magically improve online learning just by throwing money and technology at the issue, Wired’s article on the ‘LA iPad debacle’ is a good read.
I thoroughly enjoyed Bob’s lecture, his dismantling of technological solutionism and neoliberalism in education, and his barely checked scorn for the Department for Education and its fixation on remote teaching.
The screenshot which I grabbed to illustrate this post shows a continuation of the theme of learners as the active agents of learning in the most influential learning theories spanning the past century.
The January Moodle Munch recording
Welcome to 2021, folks! Let’s hope it’s going to be a better year for all.
Today marked the return of Moodle Munch, with two presentations as always. Mark Glynn from Dublin City University began by discussing some tools and techniques they are using to add some gamification to online modules to improve student engagement. First, the use of formative and summative quizzes, but not just quizzes, the pedagogy around their use, emphasising the opt-in nature of the summative component and giving students the ‘freedom to fail’. Mark presented some interesting research showing both strong positive student feedback and improved pass marks on the formative assessment component with the group of students who had engaged with the summative quizzes. As it should be, but it’s nice to see such strong evidence! Mark then showed us the Level Up Plus plugin for Moodle which can be used to add gamification elements to module spaces, such as progress bars and leaderboards.
The second presentation was from Nic Earle at the University of Gloucestershire, who demonstrated the custom electronic marking and assessment system they have developed for managing student assessments, and how it integrates with their Moodle and student records system. They switched over to this system wholesale in 2017, and again Nic was able to show very positive results demonstrating increased use of the VLE (even before the pandemic) and improved NSS scores.
As always, the presentations were recorded and I have embedded the YouTube video above.
Some further on-site training from Speedwell today, this time on how the tool can be used to deliver OSCE and MMI testing – that’s observations of clinical practice and multiple mini-interviews, which we use to interview potential medical students. Training covered both configuration and live marking, including how to manage breaks and how to have a spare iPad ready so that a non-configured marker can step in.
We also learned about some new features coming to Speedwell which sound pretty good – the ability for multiple markers to moderate and agree a final mark to record in the system, and ‘killer questions’, which require students to pass the specified question as well as the exam / interview as a whole.
The team and I had follow-up webinar training from Speedwell today, recapping some of the basic functionality now that we’ve been using it for a few months, and looking at some of the more advanced features which are currently available, and some which are going to be available to us from next week when we upgrade to the latest version of the web app. This will relocate much of the functionality of the admin system, such as checking student performance and running reports, to the system which end users (academics) access through the browser.
This was our main training session on the system where we had a trainer from Speedwell onsite for the day to run through all aspects of the system with us, from initial configuration to creating questions and exams. We will also be deploying the Safe Exam Browser as part of this project.
Listening to this while reading this post will have no therapeutic value
This was the second Learning and Teaching Academy workshop I attended this semester, to give it its subtitle: Engaging Students to Learning Through Teaching and Assessment. (The big boss, who put the programme together, has a penchant for naming events with a musical theme.)
The workshop was given by Prof. Terry Young, Emeritus Professor of Healthcare Systems at Brunel University, who talked about his experience in academia over the past 15 or so years, having come from industry where he spent a similar length of time, giving him an interesting take on the academic world and conventional practice. On assessment, he asked us to consider what are the actual tangible benefits of, for example, spending time deciding whether a given piece of work is worth 82% or 85%, and could this time be better spent elsewhere? Terry argues that there is little value in time spent this way, instead advocating for threshold-based marking: first deciding whether a piece of work is a pass or a fail, and then asking either, if it’s a fail, is it a good fail and can the student therefore be guided to a pass in future assignments, or if it’s a pass, is it just a pass or a good pass?
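Terry’s two-step scheme can be sketched as a simple decision procedure. This is just my own illustration of the idea – the function name and the four category labels are my assumptions, not Terry’s terminology:

```python
def threshold_mark(meets_pass_criteria: bool, strong: bool) -> str:
    """Hypothetical sketch of threshold-based marking as described:
    first decide pass/fail, then ask a single follow-up question
    rather than haggling over exact percentages."""
    if not meets_pass_criteria:
        # A 'good fail' is one where the student can be guided
        # to a pass in future assignments.
        return "good fail" if strong else "fail"
    return "good pass" if strong else "pass"
```

The appeal, as I took it, is that each piece of work needs only two quick judgements instead of a precise mark, freeing time for feedback.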
Terry also reflected on the nature of work academics are going to need to do over the next 20-30 years in the face of changes driven by automation and artificial intelligence. He concluded that there are three key tasks which will continue to be necessary: writing and specifying the requirements for programmes and assessments, curating and filtering content, and working closely with students to ensure their development and wellbeing.
Their quiz tool is Respondus 4, which was described as a legacy product, and it did look old. It was demonstrated running on a Windows 7 machine, which is sufficiently old now that when I see Windows 7 I wonder why – does it not work on 10? Despite that, Respondus integrates with a number of VLEs and mirrors the quiz question types and settings which are available there. Importing and exporting from text files and Word documents was demonstrated and it seemed to work pretty well, though questions and answers have to be in exactly the right format to be recognised. I’m not sure why we would use this over using the quiz tool directly in Canvas though, and it doesn’t give us something that can replace the EDPAC system.
That comes instead from their LockDown Browser product, the one we were interested in. This allows you to set quizzes that can only be taken through LockDown Browser, a stripped-down web browser which only allows access to the VLE and, once the quiz begins, blocks students from opening any other applications or webpages. I was a little concerned about accessibility as it relies on users’ own screen-reading software and blocks certain keyboard shortcuts. Nevertheless, it seems to be popular in UK HE so it can’t be too bad.
And then there was the weird one, Monitor, which they tried to sell alongside LockDown Browser. Monitor is designed to be used for remote invigilation, and does so by recording from students’ webcams. On starting up Monitor, students have to take a photo and show their university ID for verification purposes, and then Monitor will record them for the duration of the quiz and flag up any ‘unusual’ practices if detected, e.g. going away from the computer or someone else coming into the picture, which then have to be reviewed by a tutor. Recordings are stored online for up to five years on Amazon Web Services. I didn’t quite get a clear answer on whether or not they have access to a data centre in the UK / EU. Is it just me or does this all sound a bit creepy? I also didn’t get a clear answer on whether or not any UK / EU customers were using Monitor. They bundle 200 free licences of Monitor with LockDown Browser, so there was a fudged ‘yes’, leaving open the possibility that although institutions have Monitor they aren’t using it. Bizarrely, they have a completely different pricing model for LockDown Browser and Monitor, and then there are the technical problems. All of the webcam recording and playback functionality uses Flash, which Adobe are finally killing in 2020. I asked about their plans for migrating to another solution and they couldn’t answer that either, saying it was all down to Amazon.
We’ll never get Monitor. I can’t imagine any UK university using it. We may get LockDown Browser. The third system demonstration we’ve had as part of this project is Speedwell, but I missed that one as I had another meeting. Other solutions are also under investigation.