Press "Enter" to skip to content

Tag: Assessment

Speedwell OSCE Training

Some further on-site training from Speedwell today, this time on how the tool can be used to deliver OSCE and MMI testing – that’s objective structured clinical examinations and the multiple mini-interviews we use to interview prospective medical students. Training covered both configuration and live marking, including how to manage breaks and how to keep a spare iPad on hand so that a marker who isn’t already configured in the system can step in.

We also learned about some new features coming to Speedwell which sound pretty good – the ability for multiple markers to moderate and agree a final mark to record in the system, and ‘killer questions’, which mean that students have to pass the specified question as well as the exam or interview as a whole.
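
To make the ‘killer questions’ idea concrete, here is a minimal sketch of the pass logic in Python – my own illustration with made-up names, not Speedwell’s actual data model:

def passes_exam(overall_percent, killer_scores, pass_mark=50, killer_pass_mark=50):
    """A student must reach the overall pass mark AND pass every
    designated killer question; failing any one of them fails the exam."""
    passed_overall = overall_percent >= pass_mark
    passed_killers = all(score >= killer_pass_mark for score in killer_scores)
    return passed_overall and passed_killers

# e.g. 72% overall but one failed killer question is still an overall fail
print(passes_exam(72, [80, 30]))  # False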


Speedwell Train the Trainer – Advanced


The team and I had follow-up webinar training from Speedwell today, recapping some of the basic functionality now that we’ve been using it for a few months, and looking at some of the more advanced features – some available now, and some coming to us next week when we upgrade to the latest version of the web app. The upgrade will relocate much of the functionality of the admin system, such as checking student performance and running reports, to the system which end users (academics) access through the browser.


Speedwell Training

Well, that took a while, but we are finally ditching our antiquated EDPAC forms for high-stakes MCQ-style exams, and we went for neither ExamSoft nor Respondus, but Speedwell.

This was our main training session, with a trainer from Speedwell on-site for the day to run through all aspects of the system with us, from initial configuration to creating questions and exams. We will also be deploying Safe Exam Browser as part of this project.


LTA Workshop: Running up That Hill


Listening to this while reading this post will have no therapeutic value

This was the second Learning and Teaching Academy workshop I attended this semester, to give it its subtitle: Engaging Students to Learning Through Teaching and Assessment. (The big boss, who put the programme together, has a penchant for naming events with a musical theme.)

The workshop was given by Prof. Terry Young, Emeritus Professor of Healthcare Systems at Brunel University, who talked about his experience in academia over the past 15 or so years, having come from industry where he spent a similar length of time, which gives him an interesting take on the academic world and conventional practice. On assessment, he asked us to consider the actual tangible benefits of, for example, spending time deciding whether a given piece of work is worth 82% or 85%, and whether that time could be better spent elsewhere. Terry argues that there is little value in time spent this way, instead advocating threshold-based marking: first decide whether a piece of work is a pass or a fail; if it’s a fail, ask whether it’s a good fail from which the student can be guided to a pass in future assignments; and if it’s a pass, ask whether it’s just a pass or a good pass.
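
As a rough illustration of what that looks like in practice (my own sketch in Python, not anything Terry presented), marking becomes two coarse judgements rather than a hunt for an exact percentage:

def threshold_mark(passed, strong):
    """First judge pass or fail, then judge quality within that band."""
    if passed:
        return "good pass" if strong else "pass"
    return "good fail" if strong else "fail"

# A 'good fail' signals that the student can be guided to a pass next time
print(threshold_mark(passed=False, strong=True))  # good fail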

Terry also reflected on the nature of work academics are going to need to do over the next 20-30 years in the face of changes driven by automation and artificial intelligence. He concluded that there are three key tasks which will continue to be necessary: writing and specifying the requirements for programmes and assessments, curating and filtering content, and working closely with students to ensure their development and wellbeing.


Respondus Demonstration


Following ExamSoft last week, today it was Respondus who gave us a demonstration of their software.

Their quiz tool is Respondus 4, which was described as a legacy product, and it did look old. It was demonstrated running on a Windows 7 machine, which is old enough now that seeing it makes me wonder whether the software even works properly on Windows 10. Despite that, Respondus integrates with a number of VLEs and mirrors the quiz question types and settings which are available there. Importing and exporting from text files and Word documents was demonstrated and seemed to work pretty well, though questions and answers have to be in exactly the right format to be recognised. I’m not sure why we would use this over the quiz tool directly in Canvas though, and it doesn’t give us something that can replace the EDPAC system.
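
For anyone wondering what ‘exactly the right format’ means, the plain-text import files look roughly like this – a numbered question, lettered answers, and a marker (an asterisk here) on the correct one. This is an illustrative sketch from memory rather than the exact specification, so check the Respondus documentation before relying on it:

1. Which organ produces insulin?
a. Liver
*b. Pancreas
c. Spleen
d. Kidney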

That comes instead from their LockDown Browser product, the one we were interested in. This allows you to set quizzes that can only be taken through LockDown Browser, a stripped-down web browser which only allows access to the VLE and, once the quiz begins, blocks students from opening any other applications or webpages. I was a little concerned about accessibility, as it relies on users’ own screen-reading software and blocks certain keyboard shortcuts. Nevertheless, it seems to be popular in UK HE so it can’t be too bad.

And then there was the weird one, Monitor, which they tried to sell alongside LockDown Browser. Monitor is designed for remote invigilation, which it does by recording from students’ webcams. On starting up Monitor, students have to take a photo and show their university ID for verification purposes, and then Monitor records them for the duration of the quiz and flags any ‘unusual’ behaviour it detects, e.g. the student leaving the computer or someone else coming into the picture, which then has to be reviewed by a tutor. Recordings are stored online for up to five years on Amazon Web Services, and I didn’t quite get a clear answer on whether or not they have access to a data centre in the UK / EU. Is it just me or does this all sound a bit creepy?

I also didn’t get a clear answer on whether or not any UK / EU customers were using Monitor. They bundle 200 free licences of Monitor with LockDown Browser, so there was a fudged ‘yes’, leaving open the possibility that although institutions have Monitor they aren’t using it. Bizarrely, they have completely different pricing models for LockDown Browser and Monitor. And then there are the technical problems: all of the webcam recording and playback functionality uses Flash, which Adobe are finally killing in 2020. I asked about their plans for migrating to another solution and they couldn’t answer that either, saying it was all down to Amazon.

We’ll never get Monitor; I can’t imagine any UK university using it. We may get LockDown Browser. The third system demonstration we’ve had as part of this project was Speedwell, but I missed that one as I had another meeting. Other solutions are also under investigation.


ExamSoft Demonstration


We’re looking at options for a secure eAssessment system that would be able to replace our archaic EDPAC forms, and ExamSoft were the first company to provide a demonstration and discussion for us this morning. For those who have no idea what I’m talking about, EDPAC forms are the old pink sheets that you complete by pencilling a cross in the correct answer box (and it does have to be a pencil of the correct weight too!). Those forms are then scanned by a machine we dub the bacon slicer, and then we spend hours correcting all the mistakes and typing the comments in manually. Everything about it is awful, and we’ve wanted to get rid of it for years, but there are pockets of use where people are wedded to this system and won’t switch to using MCQs in the VLE. So it is for them that we are looking for a new solution.

ExamSoft’s big selling point is that it can be used on students’ own devices – computers or iPads – which their software can completely lock down for the duration of the exam. This means that we could still get hundreds of students in one secure location all taking the same exam at the same time, one of the arguments in favour of EDPAC. Otherwise, ExamSoft is a fairly standard MCQ system: questions can be tagged according to the subject or taxonomy of your choice, it can export to and import from most other similar systems, it integrates with Canvas, and so on. I was a little concerned about what seemed to be the limited number of question types – I didn’t see drop-down or calculated questions, for example – and I have doubts about how successful it could be as a bring-your-own-device solution for us.

It’s one thing for students to willingly have and use their personal devices to complement their studies, but if we as an institution require them to provide their own kit in order to take exams, we’re opening up issues of responsibility as well as imposing an additional financial burden. If someone brings in their laptop and it is broken or stolen on the way, for example, is that on us? Our insurance? Then there is the issue of technical support with the ExamSoft software itself, and logistical considerations such as ensuring that we have sufficient power sockets for the inevitable dead batteries (we don’t) and that our wireless network is robust enough to handle hundreds of simultaneous connections in a small area (it isn’t). Providing our own equipment via something like a laptop safe could offer a solution to some of these problems.


CMALT Webinar for New Assessors


Those of you who follow me on Twitter may have experienced the pleasure of my little rant as technology utterly failed me for this webinar, but I was at least able to get the recording working a little while later. (I will not be defeated by Java!) It was a very enlightening session on the process and practicalities of assessing the portfolios of CMALT candidates. I actually did my first one, for a portfolio review, a few weeks ago, so this was timely, and as a result of this webinar I have now signed up to become an assessor of regular portfolios too.

Unrelated, but ALT have also recently released digital badges for use in portfolios, email signatures, etc. Not actual accredited digital badges with metadata, just nice image files.


NERAC Training Day

NERAC, the North East Regional Assessment Centre, is based within the University’s Disability Support Service and provides study needs assessments for students at the University and from other universities and colleges in the North East. This training day provided an overview of the latest versions of software and hardware which can be made available to assist students with specific needs, and was very helpful in raising my awareness of what is available, which I will cascade to my team.

For literacy support, two software packages were discussed: Read&Write and ClaroRead Pro. Both have functionality for converting text to speech, highlighting, conversion to other formats including MP3, and scanning tools. Read&Write tends to be the preferred software with students, partly for its better spell checker, which can check as you type and pick up on phonetic spelling errors, e.g. suggesting ‘enough’ for ‘anuf’.
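
As a toy illustration of the idea behind phonetic matching – my own deliberately simplified Python sketch, nothing like the real engine in Read&Write – words can be reduced to a crude ‘sounds-like’ key and compared on that instead of on spelling:

def phonetic_key(word):
    """Crude sound key: treat 'gh' as 'f', keep the first letter
    (all leading vowels treated alike), drop the remaining vowels."""
    word = word.lower().replace("gh", "f")
    first = "a" if word[0] in "aeiou" else word[0]
    rest = "".join(c for c in word[1:] if c not in "aeiou")
    return first + rest

def suggest(misspelling, dictionary):
    """Suggest dictionary words whose sound key matches the misspelling's."""
    key = phonetic_key(misspelling)
    return [w for w in dictionary if phonetic_key(w) == key]

print(suggest("anuf", ["enough", "another", "knife"]))  # ['enough']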

There was an excellent presentation on hearing impairment which discussed the different types of hearing aids available and how these can be complemented by technology such as induction loops and the Roger Pen. Two software packages for audio note-taking were introduced, Sonocent Audio Notetaker and Notetalker. Both packages allow students to annotate presentations, e.g. PowerPoint files, with audio-recorded notes, and have speech-to-text functionality for converting recorded presentations into text, though it was noted that this rarely works as well as intended in real-world scenarios due to background noise.

Next, two tools for mind mapping were discussed: MindView and Inspiration. MindView was noted as being the generally preferred package at this time, as it has a familiar Office-style ribbon toolbar and some nice features such as being able to add multiple notes and attachments to each branch, collaborative working, a citation tool, and a Gantt chart tool. Inspiration offers a word guide which can suggest synonyms and definitions, a presentation tool, and mobile apps, though it was noted that these are fairly basic.

Finally, two packages for screen reading and magnification were presented – Supernova and ZoomText. Both tools can do screen magnification, and ZoomText is able to apply different colour schemes to open windows and applications, as well as to the mouse cursor and pointer. It also has a feature called AppReader which can convert text to speech and reflow the text in a magnified window as it reads it out. Also demonstrated was Readit, which can scan images and convert them to text using optical character recognition. This also works with PDF files in which the pages are image files rather than text, useful for older journals which have only ever been scanned. Readit can export to various formats, including Word and MP3.
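
For context on the image-only PDF case, the underlying technique is optical character recognition run over each page image. Here is a minimal sketch using the open-source Tesseract engine via the pytesseract and pdf2image packages (which need the Tesseract and Poppler binaries installed) – a generic illustration, not what Readit itself uses:

from pdf2image import convert_from_path  # renders PDF pages as images (needs Poppler)
import pytesseract  # wraps the Tesseract OCR engine

def ocr_scanned_pdf(path):
    """OCR a PDF whose pages are scanned images rather than embedded text."""
    pages = convert_from_path(path, dpi=300)
    return "\n\n".join(pytesseract.image_to_string(page) for page in pages)

# e.g. an old journal article that only exists as simple page scans
print(ocr_scanned_pdf("scanned_article.pdf")[:500])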


PG Cert AP: Day 15

The final taught day on the PG Cert was for the assessment module, EDPM06, and was about how assessment reflects and can influence pedagogy. We were advised to set assessments which are inclusive of all rather than targeting perceived needs of particular groups, but be ready and flexible enough to meet any specific needs which may emerge. This led to a discussion about equality, especially of access to HE, and social justice. Burke’s book, The Right to Higher Education, was recommended for follow up reading in this area.

Finally, there was some discussion and clarification on the assessments for this module itself. These are to write a reflective report showing how your practice has been influenced by what has been taught on this module, and to write two critiques of assessments which you have set or been given, again based on what you have been taught here.


PG Cert AP: Day 12

The second day of the Assessment and Feedback for Learning module began with a discussion on the purpose of assessment, which is at least partly about gatekeeping and assessing fitness to practise, especially in subjects such as medicine. Expanding out from there, we discussed how assessment reflects the needs and demands of wider society and how this has been changing in response to the marketisation of higher education.

There was an interesting side discussion at one point about implicit assessment and how this can distract students. One person talked about how this had manifested on their module, with students believing that there was a hidden quota on the number of students who were supposed to pass and fail. Rather than concentrating on the assessment task at hand, they spent a great deal of time in discussions amongst themselves trying to work out this non-existent pass–fail ratio.

In the afternoon we discussed the differences between formative and summative assessment, and how to use assessment to achieve effective learning and learner gain. That, we concluded, comes best from formative assessments, but these take a lot of time and effort and exist in tension with students’ preference for summative assessment and preoccupation with grades, a possible result of the changing culture which marketisation has brought about.
