This was our main training session on the system where we had a trainer from Speedwell onsite for the day to run through all aspects of the system with us, from initial configuration to creating questions and exams. We will also be deploying the Safe Exam Browser as part of this project.
This was the second Learning and Teaching Academy workshop I attended this semester, to give it its subtitle: Engaging Students to Learning Through Teaching and Assessment. (The big boss, who put the programme together, has a penchant for naming events with a musical theme.)
The workshop was given by Prof. Terry Young, Emeritus Professor of Healthcare Systems at Brunel University, who talked about his experience in academia over the past 15 or so years, having come from industry where he spent a similar length of time, which gives him an interesting take on the academic world and conventional practice. On assessment, he asked us to consider the actual tangible benefits of, for example, spending time deciding whether a given piece of work is worth 82% or 85%, and whether that time could be better spent elsewhere. Terry argues that there is little value in time spent this way, instead advocating for threshold-based marking: first decide whether a piece of work is a pass or a fail, then ask either, if it’s a fail, is it a good fail and can the student therefore be guided to a pass in future assignments; or, if it’s a pass, is it just a pass or a good pass?
Terry also reflected on the nature of work academics are going to need to do over the next 20-30 years in the face of changes driven by automation and artificial intelligence. He concluded that there are three key tasks which will continue to be necessary: writing and specifying the requirements for programmes and assessments, curating and filtering content, and working closely with students to ensure their development and wellbeing.
Their quiz tool is Respondus 4, which was described as a legacy product, and it did look old. It was demonstrated running on a Windows 7 machine, which is sufficiently old now that seeing Windows 7 made me wonder: why? Does it not work on 10? Despite that, Respondus integrates with a number of VLEs and mirrors the quiz question types and settings which are available there. Importing and exporting from text files and Word documents was demonstrated and seemed to work pretty well, though questions and answers have to be in exactly the right format to be recognised. I’m not sure why we would use this over the quiz tool directly in Canvas though, and it doesn’t give us something that can replace the EDPAC system.
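For reference, the plain-text import expects questions laid out something like the sketch below, with an asterisk marking the correct answer. This is an illustrative example from memory rather than a definitive specification; the exact syntax is documented in the Respondus user guide.

```
1. Which of these is a VLE?
a. Respondus 4
*b. Canvas
c. Safe Exam Browser
d. EDPAC
```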
That comes instead from their LockDown Browser product, the one we were interested in. This allows you to set quizzes that can only be taken through LockDown Browser, a stripped-down web browser which only allows access to the VLE and, once the quiz begins, blocks students from opening any other applications or webpages. I was a little concerned about accessibility, as it relies on users’ own screen reading software and blocks certain keyboard shortcuts. Nevertheless, it seems to be popular in UK HE so it can’t be too bad.
And then there was the weird one, Monitor, which they tried to sell alongside LockDown Browser. Monitor is designed to be used for remote invigilation, and does so by recording from students’ webcams. On starting up Monitor, students have to take a photo and show their university ID for verification purposes, and then Monitor will record them for the duration of the quiz and flag up any ‘unusual’ practices if detected, e.g. going away from the computer or someone else coming into the picture, which then have to be reviewed by a tutor. Recordings are stored online for up to five years on Amazon Web Services. I didn’t get a clear answer on whether or not they have access to a data centre in the UK / EU. Is it just me or does this all sound a bit creepy? I also didn’t get a clear answer on whether or not any UK / EU customers were using Monitor. They bundle 200 free licences of Monitor with LockDown Browser, so there was a fudged ‘yes’, leaving open the possibility that although institutions have Monitor they aren’t using it. Bizarrely, they have completely different pricing models for LockDown Browser and Monitor, and then there are the technical problems. All of the webcam recording and playback functionality uses Flash, which Adobe are finally killing in 2020. I asked about their plans for migrating to another solution and they couldn’t answer that either, saying it was all down to Amazon.
We’ll never get Monitor. I can’t imagine any UK university using it. We may get LockDown Browser. The third system demonstration we’ve had as part of this project was Speedwell, but I missed that one as I had another meeting. Other solutions are also under investigation.
We’re looking at options for a secure eAssessment system that would be able to replace our archaic EDPAC forms, and ExamSoft were the first company to provide a demonstration and discussion for us this morning. For those who have no idea what I’m talking about, EDPAC forms are the old pink sheets that you complete by pencilling a cross in the correct answer box (and it does have to be a pencil of the correct weight too!). Those forms are then scanned by a machine we dub the bacon slicer, and then we spend hours correcting all the mistakes and typing the comments in manually. Everything about it is awful, and we’ve wanted rid of it for years, but there are pockets of use where people are wedded to this system and won’t switch to using MCQs in the VLE. So it is for them that we are looking for a new solution.
ExamSoft’s big selling point is that it can be used on students’ own devices, computers or iPads, which their software can completely lock down for the duration of the exam. This means that we could still get hundreds of students in one secure location all taking the same exam at the same time, one of the arguments in favour of EDPAC. Otherwise, ExamSoft is a fairly standard MCQ system. Questions can be tagged according to the subject or taxonomy of your choice, it can export to and import from most other similar systems, it integrates with Canvas, etc. I was a little concerned about the seemingly limited number of question types – I didn’t see drop-down or calculated questions, for example – and I have doubts about how successful it could be as a bring-your-own-device solution for us.
It’s one thing for students to willingly have and use their personal devices to complement their studies, but if we as an institution require them to provide their own kit in order to take exams, we’re opening up issues of responsibility as well as imposing an additional financial burden. If someone brings in their laptop and it is broken or stolen on the way, for example, is that on us? Our insurance? Then there is the issue of technical support, both with the ExamSoft software itself and with logistical considerations such as ensuring that we have sufficient power sockets for the inevitable dead batteries (we don’t) and that our wireless network is robust enough to handle hundreds of simultaneous connections in a small area (it isn’t). Providing our own equipment via something like a laptop safe could offer a solution to some of these problems.
Those of you who follow me on Twitter may have experienced the pleasure of my little rant as technology utterly failed me for this webinar, but I was at least able to get the recording working a little while later. (I will not be defeated by Java!) It was a very enlightening session on the process and practicalities of assessing the portfolios of CMALT candidates. I actually assessed my first one, a portfolio review, a few weeks ago, so this was timely, and as a result of this webinar I have now signed up to become an assessor of regular portfolios too.
Unrelated, but ALT have also recently released digital badges for use in portfolios, email signatures, etc. Not actual accredited digital badges with metadata, just nice image files.
NERAC, the North East Regional Assessment Centre, is based within the University’s Disability Support Service and provides study needs assessments for students at the University and from other universities and colleges in the North East. This training day provided an overview of the latest versions of software and hardware which can be made available to assist students with specific needs. It was very helpful in raising my awareness of what is available, and I will cascade this to my team.
For literacy support two software packages were discussed, Read&Write and ClaroRead Pro. Both have functionality for converting text to speech, highlighting, conversion to other formats including MP3, and scanning tools. Read&Write tends to be the software preferred by students, partly for its better spell checker, which can check as you type and pick up on phonetic spelling errors, e.g. suggesting ‘enough’ for ‘anuf’.
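Read&Write’s actual matching algorithm isn’t public, but the principle behind phonetic spell checking can be sketched in a few lines: reduce both the misspelling and each dictionary word to a rough ‘sounds-like’ key, then compare keys rather than letters. The function below is my own toy illustration of that idea, not anything from Read&Write.

```python
def phonetic_key(word):
    """Reduce a word to a crude 'sounds-like' key, so misspellings
    that sound right still match their dictionary word."""
    w = word.lower()
    w = w.replace("gh", "f")                  # 'enough' -> 'enouf'
    first = "a" if w[0] in "aeiou" else w[0]  # fold all leading vowels together
    rest = "".join(c for c in w[1:] if c not in "aeiou")  # drop interior vowels
    return first + rest

# 'anuf' and 'enough' reduce to the same key, so a checker comparing
# keys can suggest 'enough' even though the spellings share few letters.
print(phonetic_key("anuf") == phonetic_key("enough"))  # True
```

Real products use more sophisticated schemes, such as Soundex or Metaphone combined with edit distance, but the underlying trick is the same.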
There was an excellent presentation on hearing impairment which discussed the different types of hearing aids which are available and how these can be complemented by technology such as induction loops and the Roger Pen. Two software packages for audio note taking were introduced, Sonocent Audio Notetaker and Notetalker. Both packages allow students to annotate presentations, e.g. PowerPoint files, with audio recorded notes, and have speech-to-text functionality for the transcription of recorded presentations, though it was noted that this rarely works as well as intended in real world scenarios due to background noise.
Next, two tools for mind mapping were discussed, MindView and Inspiration. MindView was noted as being the generally preferred package at this time, as it has a familiar Office style ribbon toolbar and some nice features such as being able to add multiple notes and attachments to each branch, collaborative working, a citation tool, and a Gantt chart tool. Inspiration offers a word guide which can suggest synonyms and definitions, a presentation tool, and mobile apps, though it was noted that these were fairly basic.
Finally, software for screen reading and magnification was presented – Supernova and ZoomText. Both tools can do screen magnification, and ZoomText is able to apply different colour schemes to open windows and applications, as well as to the mouse cursor and pointer. It also has a feature called AppReader which can convert text to speech and can reflow the text in a magnified window as it reads it out. Also demonstrated was Readit, which can scan images and convert them to text using optical character recognition. This also works with PDF files in which the pages are image files rather than text, useful for older journals which have only had simple scans. Readit can export to various formats, including Word and MP3.
The final taught day on the PG Cert was for the assessment module, EDPM06, and was about how assessment reflects and can influence pedagogy. We were advised to set assessments which are inclusive of all rather than targeting perceived needs of particular groups, but be ready and flexible enough to meet any specific needs which may emerge. This led to a discussion about equality, especially of access to HE, and social justice. Burke’s book, The Right to Higher Education, was recommended for follow up reading in this area.
Finally, there was some discussion and clarification on the assessments for this module itself. These are to write a reflective report showing how your practice has been influenced by what has been taught on this module, and to write two critiques of assessments which you have set or been given, again based on what you have been taught here.
The second day of the Assessment and Feedback for Learning module began with a discussion on the purpose of assessment, which is at least partly about gatekeeping and assessing fitness to practise, especially in subjects such as medicine. Expanding out from there, we discussed how assessment reflects the needs and demands of wider society and how this has been changing in response to the marketisation of higher education.
There was an interesting side discussion at one point about implicit assessment and how this can distract students. One person talked about how this had manifested on their module, with students believing that there was a hidden quota on the number of students who were supposed to pass and fail. Rather than concentrating on the assessment task at hand, they spent a great deal of time in discussions amongst themselves trying to work out this non-existent pass-fail ratio.
In the afternoon we discussed the differences between formative and summative assessment, and how to use assessment to achieve effective learning and learner gain. That, we concluded, comes best from formative assessments, but these take a lot of time and effort and exist in tension with students’ preference for summative assessment and preoccupation with grades, a possible result of the changing culture which marketisation has brought about.
Another split day, wearing my student hat in the morning for the core module, and in the afternoon teaching part of the digital technology module, this time with the added pressure of being formally assessed as part of one of the assessments for the core module. It does get rather circular.
The morning session was excellent, far and away the most useful couple of hours I’ve ever spent on assessment. A guest lecturer facilitated an extended and iterative exercise using the seemingly simple task of defining a biscuit as a metaphor for the problems of assessment marking. First we each had to write a definition of a biscuit in 180 characters or less, the length of a Tweet, then the room was split into two groups and each group had to agree a common definition. Then the fun part: a plate of ‘biscuits’ was given to each group and we were tasked with marking them against our definition, placing each within a four-point rubric of ‘biscuity’, replicating the undergraduate degree classification system. I was expecting trouble with the Jaffa Cakes, but the viciousness and racism which came about as a result of the shortbread finger took me by surprise. Alas, we were forbidden from removing the more contentious ‘biscuits’ from the equation by eating them.
The afternoon session for EDPM08 covered digital communication and virtual reality technologies and tools. This was the part I delivered, and I was given an hour. I spent the first 30 minutes going through a short presentation I created about the use of virtual, augmented and mixed reality systems in higher education, which I based on the microsite I wrote, followed by another 30 minutes or so in which people were able to have a go with some hardware and software which the module leader and I supplied – phone-based VR headsets using some VR and AR apps I had found which showcase educational uses, such as Anatomyou VR.
There was a bit of pressure on me this time, as my teaching was being formally observed in accordance with university practices and as a requirement for part of one of the assessments in the core module. I was nervous, and felt that I stumbled over my words a bit more often than I would have liked, and I completely forgot to talk about Google Glass during the AR section, but my observer thought I did fine. I was commended on subject knowledge and on the use of cultural references to make the presentation interesting, and given good advice which I will be able to use in the future. At one point I did go ‘off script’ and tried to open an external link which took some time to load – I should have had that ready, or else not tried it. I was also advised to end the session with an optional task that people could do afterwards to help embed their learning – a good point, and something I have done in the past.
The first day of my optional module, Assessment and Feedback for Learning, began with a discussion of how assessment can be used for learning, rather than as a tool to measure learning. The module has this concept at its core and, as such, the main assessment of this module is to critically analyse two assessments that you have used or written previously. There is also a second assessment, to write a personal reflective report on how you have found the problem-based learning approach taken in this module, and how what you have learned impacts on your own academic practice. Very meta.
After setting out the learning objectives and the assessments of the module, the remainder of the day was spent discussing the various factors and contexts which influence how assessments are set and marked. These included how student expectations have changed as a result of the marketisation of the sector, the university’s generic assessment criteria and how they relate to the learning outcomes of individual modules, and the cascading down of risk onto lecturers, e.g. pressures around graduate employability and how that influences the assessments which are set.
We also discussed the difference between formative and summative assessment, and how and why students often see formative assessments as options. There was a little about Foucault’s ‘regimes of truth’ (got to love a bit of Foucault!), and the concepts of the hidden curriculum and expectations – that everyone has a certain baseline IT literacy for example.