
Tag: CPD

UoS Learning and Teaching Conference 2023

Learning and Teaching Conference, 2023
The big boss up on stage, doing the introductions

Another out of this world conference this year, but alas nobody who was one degree of separation from walking on the moon this time, as our attention instead turned to… yes, you guessed it, generative artificial intelligence.

The morning keynote was given by Thomas Lancaster of Imperial College London, who has done a lot of research over the years on contract cheating, and who has now turned his attention to the new AI tools which have appeared over the past year. Interestingly, he commented that essay mill sites are being pushed to students as hard as they ever have been, but I suspect that these agencies are now themselves using generative AI tools to displace already low-paid workers in the developing world who were previously responsible for writing assignments on demand for Western students.

The first breakout session I attended was ‘Ontogeny: Mentoring students to succeed in a world of AI’ by Dr Thomas Butts and Alice Roberts, who discussed how medical students are using GAI and the issues this is causing in terms of accuracy, as these models often present wrong information as truth, which has particularly serious consequences in medicine. There was an interesting observation on culture and social skills: students now seem to prefer turning to the internet for help and information rather than simply asking their teachers and peers.

The second session was ‘Enhancing the TNE student experience through international collaborative discussions and networking opportunities’ by Dr Jane Carr-Wilkinson and Dr Helen Driscoll who discussed the Office for Students’ plans to regulate TNE (trans-national education), though no-one quite seems to know how they are going to do this. Including the OfS. This was an interesting discussion which explored the extent of our TNE provision (I don’t think I had appreciated the scale before, over 7,000 students across 20 partners), and the issues involved in ensuring quality across the board.

There was also a student panel, whose members were asked about their use of GAI and their understanding of the various issues surrounding plagiarism. They demonstrated quite a robust level of knowledge, with many of them saying that they are using ChatGPT as a study assistant to generate ideas, but I did groan to hear one person talk about the "plagiarism score" in Turnitin and how "20% plagiarism is a normal amount", and that they don’t worry until it gets higher. The myths penetrate deep.

The final afternoon keynote was given by Dr Irene Glendinning of Coventry University, who talked about her research on the factors which lead to plagiarism and cheating. This included a dense slide on various factors such as having the opportunity, thinking they won’t be detected, etc., but nowhere were cultural factors identified, nor the way that higher education in the UK has been marketised over the recent past. I’ve certainly come across comments along the lines of: if students are paying £9,000 a year on tuition, why not just pay a few hundred more to make assessment easier or guarantee better results? But I’m noticing more and more that people don’t seem to be willing or able to challenge the underlying political decisions anymore.


AI in Education: Unleashing Creativity and Collaboration

Word cloud showing positivity towards AI
Word cloud showing some positivity towards AI

This was the University of Kent’s third Digitally Enhanced Education webinar on the topic of AI in education, this time with a focus on how AI can be used positively to support creativity and collaboration. An open poll on our thoughts ran throughout the afternoon, and as you can see from the screenshot above the group was far more optimistic about it all than us doom-saying learning technologists at ALT North East. All of the presentations were recorded and are available on their YouTube channel.

A few themes stood out for me. On how GAI is impacting students, Dr Sam Lau of Hong Kong Baptist University talked about a student survey they have done in which students talked about how they are starting to use GAI tools as a new, ‘better’ type of search engine and teaching assistant. Cate Bateson, Hannah Blair and Clodagh O’Dowd, students at Queen’s University Belfast, reported that students want clarity and guidance from their institutions on where and how they are allowed to use AI tools. This was echoed by Liss Chard-Hall, a study skills tutor, who said that students have reported to her a new reluctance to use tools which already were using AI before ChatGPT, such as Grammarly, because they aren’t sure if it’s allowed by their institution. One person in the chat even commented that they knew of a student who was scared to use the spelling and grammar checker in Word lest they break new university rules about using AI in assessment.

Also from the chat, there was a discussion about which areas of university life are going to be most disrupted. Literature reviews were a big one: what benefit is there in conducting a complex, time-consuming search of the literature when you can ask an AI model to do it for you? To that end, I learned about a new tool that claims to be able to do just this: Elicit. Another useful discovery from this session was This Person Does Not Exist, which generates photo-realistic images of people.

On impacts in the wider world, Dr Ioannis Glinavos of the University of Westminster made the case that jobs in many areas will become more about verifying information and accuracy, as has happened with translators in the past couple of decades. While it is still a necessary skill, and possible to make a living as a translator, machine translation does the bulk of the work now, with human translators doing post-editing and checking for contextual and cultural relevance.

Finally, Anna Mills from the College of Marin in the US brought ethical questions back to the fore. First, reminding us that these new GAI models are designed to output plausible sounding responses, not truth – they don’t care about truth – hence we all need to be mindful to verify any information sourced from GAI tools. Anna then talked about two facets of “AI colonialism”. First, that GAI models are primarily trained on source texts written in the West (and as we know from stats about who writes Wikipedia articles and Reddit posts for example, we can infer a predominance of certain genders and skin colours too – biases feeding biases…), and second, that content moderation is being outsourced to low paid workers in the developing world, an inconvenient truth that isn’t getting enough attention. Anna’s presentation is available under a CC license and is well worth reading in full.


ALT NE User Group: June 2023

A photo of Durham's lightboard in action
Durham University’s Lightboard, a very cool (but smudgy) piece of tech

Hosted by my lovely colleagues at Durham, this ALT North East meeting began with a discussion of the practice of video assessment. I talked through what we do at Sunderland using Canvas and Panopto, covering our best practice advice and talking through the things which can go wrong. The problem of a VLE having multiple tools for recording / storing video was one such headache shared by all of us, no matter what systems we are using.

We then moved on to a discussion about Turnitin, ChatGPT and AI detection, pretty much a standing item now. Dan shared with us a new tool he has come across, which I’m not going to name or share, which uses AI to autocomplete MCQs. A new front has emerged. Some bravery from Northumbria who must be one of the few HEIs to have opted in to Turnitin’s beta checker, and New College Durham are going all in on the benefits of generative writing to help staff manage their workload by, for example, creating lesson plans for them. A couple of interesting experiments to keep an eye on there.

After lunch we had demonstrations of various tools and toys in Durham’s Digital Playground Lab. This included a Lightboard. This is a really cool and simple piece of tech that lets presenters write on a transparent board between them and the camera using UV pens. I came across this a few years ago, before the pandemic I think, but it’s a strange beast. It’s not a commercial system, but open hardware, so anyone can build one for themselves at little cost. Unfortunately at Sunderland, and I suspect at many bureaucracies, this actually makes it a lot harder to get one than just being able to go to a supplier. So it never happened, but at least today I got to see one live.

Another bespoke system demonstrated was a strip of LED lights around the whiteboard, controlled through a web app, which allows students to discreetly indicate their level of comprehension. We also had a short tour of the Playground’s media recording room, watched some recordings of content created in VR to, for example, show the interaction of the magnetic fields of objects, and saw demonstrations of Visual PDE, an open source web tool for visualising differential equations, and Kaptivo, a system for capturing the content of a whiteboard but not the presenter. You can see the Kaptivo camera in the background of my photo, behind the Lightboard.


Using AI in Education: A Student Voice

Screenshot showing different results for weather in Egypt on Google and ChatGPT
Screenshot of a search query in Google and ChatGPT

A second session from the University of Kent on AI / ChatGPT, this time student-led. It was good to hear the student voice on these developments, and I found it reassuring that they are identifying the same issues and raising the same concerns as staff. More than one of the student presentations talked about how ChatGPT is already displacing Google and other search engines as the first place they go to find answers. The screenshot above shows the difference in results when searching for the temperature in Egypt: Google, on the left, gives a list of links to follow through, while ChatGPT, on the right, provides a far more detailed answer in written form. The problem, as identified in one of the presentations but not the other, is that there is no way to verify the data which ChatGPT presents as truth. With the Google results you can evaluate the sources and verify against others; ChatGPT is a black box.

There was another good presentation from a student at Northumbria who has done some early research with students who have used ChatGPT, to find out why and what they are using it for. The results were that they are mostly using it to check their own knowledge (problematic if you can’t trust ChatGPT to give you true answers) and to generate ideas. They are also using it outside of education to, for example, help write CVs and job applications. This makes me feel like we are always going to be on the defensive, reactive as one student said. While the education sector tries to grapple with the technology, debating banning it on one hand and thoughtful, ethical use – while protecting academic integrity – on the other, other areas of society plough on, regardless and heedless.

In another presentation a student talked about experimenting with ChatGPT to produce a response to one of their assignments and, echoing Margaret Bearman’s point from the teacher-led session last month, they found that the result lacked critical analysis, and believed it would have been a clear fail.

I was pleased to note that ethical issues were being raised by students, though largely in the context of equitable access. With GPT 4 and priority access going behind a paywall, the students who can afford to use it will, and those who can’t will find they have another new and innovative way of being disadvantaged. How long before we see the first university or college purchasing a license for all of their staff and students?

Once again all of the presentations were recorded and are available as a YouTube playlist.


UKAT Annual Conference 2023

The University of Sunderland has done a lot of work over the past few years to standardise and professionalise our personal tutoring provision and align it with UKAT, the United Kingdom Advising and Tutoring association. It was from this that the Studiosity project began, and when our Pro VC for Learning and Teaching put out a call towards the end of last year for us to attend and present at this year’s conference en masse, I submitted a proposal for my Studiosity pilot year presentation – now in what I hope is its final form, including impact on attainment and progression for the pilot cohort.

Conference sessions I was able to attend as a participant were:

  • UKAT Curriculum Taster Session, by Karen Kenny of the University of Exeter which allowed me to complete an introductory module.
  • Empowering Under-Represented Voices, by Rachael O’Connor from the University of Leeds which included a discussion on how tutors can reach and support those students.
  • Considerations Around Academic Misconduct, by Luke Jefferies from the University of East Anglia – a really good session that deconstructed notions of ‘cheating’ and discussed some of the unspoken and unacknowledged factors which feed into academic misconduct. I had not considered, for example, that in some cultures it is a sign of respect to directly quote others, rather than being taken as plagiarism as it is in UK HE.
  • Lightning Talks on the significance of graduate attributes, an evaluation of the impact of mindset interventions, and the impact of specialist academic tutors from my colleagues at our London Campus.
  • Technology in Advising SIG, by Pete Fitch of UCL who led a discussion on what technologies we are using to support tutoring, including learning management systems, ePortfolios, and bespoke timetabling and appointment booking systems.
  • Understanding Student Finances, by Charmaine Valente from the Student Loans Company who talked about the current loans system in England and Wales which is helpful for PATs to know about to be able to inform students.
  • Academic Coaching at the University of Wolverhampton, by James Jennings who talked about the dedicated academic coaches they are employing at Wolverhampton to provide support and pastoral care for students.
  • Critical Thinking and Tutoring, by George Steele of Ohio State University which was an interesting session to see some of the differences in perspective in the US system, such as students choosing an institution first without knowing what they want to major in. Part of the role of tutors there is to guide students and help them make that decision.
  • Active Listening for Effective Personal Tutoring, by Angela Newton from the University of Leeds who led an interactive session exploring and evaluating our listening skills.

Attached photos are from the opening keynote speech, George Steele from Ohio State talking about reflective thought, and examples of the Welsh language not trying very hard (I feel like I can get away with this joke by being Scottish).


Studiosity Research Outcomes

Screenshot showing improved attainment for Studiosity users
Screenshot showing improved student attainment where Studiosity was used

In this presentation Professor Liz Thomas, who has previously done impact analysis for Studiosity, presented her latest research on the experience of UK institutions using the service since it launched here in 2016/17, which now covers 22 UK HEIs.

The screenshot I’ve included above shows improved attainment rates for students who used Studiosity versus those who did not, and looks very similar to the charts we produced here after our pilot year. Caveats abound, of course. I’ve said “correlation ≠ causation” more times than I can count of late, and it is perfectly possible that the students who engage with Studiosity would have been high achievers in any case, or would have engaged with other interventions to improve their work. But it certainly seems like there is something there, and the research also showed that in the groups of students who engaged with Studiosity, the attainment gap between white and BME students was reduced, and for one institution completely eliminated.
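The self-selection caveat is easy to see with a toy simulation. Everything below is hypothetical and invented for illustration – none of the numbers come from the Studiosity research – but it shows how a raw users-vs-non-users comparison can produce an impressive-looking attainment gap even when the intervention itself does precisely nothing:

```python
import random

random.seed(42)

# Hypothetical model: a student's underlying motivation drives BOTH
# their decision to use the service AND their attainment. The service
# itself has zero effect in this simulation.
students = []
for _ in range(10_000):
    motivation = random.random()                 # 0 = disengaged, 1 = highly engaged
    uses_service = random.random() < motivation  # motivated students opt in more often
    # Attainment depends only on motivation, not on service use.
    attained = random.random() < 0.4 + 0.4 * motivation
    students.append((uses_service, attained))

def attainment_rate(group):
    return sum(attained for _, attained in group) / len(group)

users = [s for s in students if s[0]]
non_users = [s for s in students if not s[0]]

# Users show markedly higher attainment despite the service doing
# nothing - the gap is pure selection effect.
print(f"users:     {attainment_rate(users):.1%}")
print(f"non-users: {attainment_rate(non_users):.1%}")
```

Which is exactly why you would want a matched or randomised comparison before claiming the service caused the improvement.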

Other findings from the research included that 54% of usage takes place outside of conventional office hours, that usage peaks in April (and on Wednesdays), and that both professional and academic staff reported a benefit in being able to refer students to a specialist service, freeing up time for them to concentrate on other areas.

One point of discussion was around low engagement and how this can be improved. It was noted that students need the opportunity to submit a draft to Studiosity in good time, and it was suggested that use of Studiosity be built into assessments to allow for this. This very much echoes the findings of my colleague in our Faculty of Health, Science and Wellbeing, Jon Rees, who wrote about his experience on the University’s Practice Hub.


ALT North East User Group: March 2023

Various responses on Padlet showing our thoughts on AI. It's a tad negative.
A screenshot from Padlet showing our thoughts on generative AI. It’s a tad negative.

We’re getting back into a stride now, with the second meeting of the academic year at Teesside. After introductions and updates from each of the core university groups, Malcolm from Durham kicked us off with a conversation about Turnitin and how we all feel about it. From a survey of the room, most of us seem to be using it rather apathetically, or begrudgingly, with a few haters who would love to be able to do away with it, and no-one saying they actively like the service. Very revealing. So why do we all keep on using it? Because we all keep on using it. Turnitin’s database of student papers pulls like a black hole, and it will take a brave institution to quit the service now. Of note was that no-one really objected to the technology itself, especially originality reporting, but rather to the company’s corporate disposition and hegemonic business model.

Emma from Teesside then talked about their experience of being an Adobe Creative Campus, which involves making Adobe software available to all staff and students, and embedding it into the curriculum. Unfortunately, Emma and other Teesside colleagues noted the steep learning curve which was a barrier to use, and the fact that content had to sit on Adobe servers and was therefore under their control.

Next up was my partner in crime, Dan, reporting on Sunderland’s various efforts over the years to effectively gather student module feedback. This was a short presentation to stimulate a discussion and share practice. At Newcastle they have stopped all module evaluation, citing research on, for example, how female academics are rated lower than male colleagues. This has been replaced with an ‘informal check’ by lecturers asking students how the module is going, are you happy, etc. They are being pushed to bring a formal system back due to NSS pressures, but are so far resisting. At Durham they are almost doing the opposite, with a dedicated team in their academic office who administer the process, check impact, and make sure that feedback is followed up on.

Finally after lunch, we had a big chat about that hot-button issue that has taken over our lives, the AI revolution! It was interesting for me to learn how Turnitin became so dominant back in the day (making it available to everyone as a trial, and getting us hooked…), and the parallels which can be drawn with their plans to roll out AI detection in the near future. Unlike their originality product which allows us to see the matches and present this to students as evidence of alleged plagiarism, we were concerned that their AI detection tool would be a black box, leaving wide open the possibility of false accusations of cheating with students having no recourse or defence. I don’t think I can share where I saw this exactly, but apparently Turnitin are saying that the tool has a false positive rate of around 1 in 100. That’s shocking, unbelievable.
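A false positive rate of 1 in 100 sounds small until you multiply it by the volume of work a university marks. As a back-of-the-envelope sketch – the submission counts here are my own hypothetical assumptions, not figures from Turnitin or any institution:

```python
# The 1-in-100 figure is the false positive rate reported above;
# the scale numbers below are hypothetical assumptions.
false_positive_rate = 1 / 100

students = 15_000                    # assumption: a mid-size university
submissions_per_student_per_year = 8  # assumption

submissions = students * submissions_per_student_per_year
expected_false_flags = submissions * false_positive_rate

# Even if NO student used AI at all, this many pieces of honest work
# would be expected to be flagged each year:
print(f"{expected_false_flags:,.0f} false accusations per year")
# prints: 1,200 false accusations per year
```

With no visible evidence behind each flag, every one of those is a student with no real way to defend themselves.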

No-one in the North East seems to be looking at trying to do silly things like ‘ban’ it, but some people at Durham, a somewhat conservative institution, are using it as a lever to regress to in-person, closed-book examination. Newcastle are implementing declarations in the form of cover sheets, asking students to self-certify if / how they have used AI writing.

There were good observations from colleagues that a) students are consistently way ahead of us, and are already sharing ways of avoiding possible detection on TikTok; and b) that whatever we do in higher education will ultimately be redundant, for as soon as students enter the real world they will use whatever tools are available in industry. Better that we teach students how to use such tools effectively and ethically in a safe environment. As you can see from the Padlet screenshot above, our sentiments on AI and ChatGPT were a tad negative.


Teaching with ChatGPT: Examples of Practice

Some examples of what ChatGPT is, and isn't; it is a large language model, it is not sentient!
Screenshot from one of the presentations outlining what ChatGPT is and is not: it is not human, not sentient, and not reliable!

This session on the robot uprising was facilitated by the University of Kent, and in a welcome contrast to some of the other sessions I have been to on AI recently, this was much more positive, focusing on early examples of using ChatGPT to enhance and support teaching and the student experience.

Some highlights were Maha Bali from the American University in Cairo, who argued that we need cultural transparency around this technology as people are going to use it regardless of whatever regulations are put in place. This was echoed by some of the other presenters, who noted that after graduation, when students enter industry, they will use, and be expected to use, any and all available relevant technologies. Someone else in the chat also noted that if you ban AI writing at university, then one outcome is going to be that students will only use it for cheating. So good luck, Cambridge. On transparent, ethical use, Laura Dumin from the University of Central Oklahoma talked about a new process they have implemented which asks students to declare if they have used AI tools to help with writing, and to highlight which text has been AI generated so academics can clearly see this.

Some presenters had suggestions around re-focusing assessments along the lines of what ChatGPT can’t do, but which humans can. Some of these I feel are short term solutions. One person, for example, talked about how ChatGPT is generally better at shorter pieces of writing, so they have changed their assessments from 3x 800 word assessments throughout the year to 1x 2,000 word assessment. Debbie Kemp at Kent suggested asking students to include infographics. I think these suggestions are going to work for now, but not in the long term. And the long term here isn’t even very long, given the pace of technological developments. By the time you could get changes to assessment through a programme board and in place for students, the technology may well have rendered your changes moot.

I think a better idea is around including more critical reflection from students. Margaret Bearman from Deakin University in Australia made the point that AI is not good at providing complex, context-sensitive value judgements, and that I think is going to be a harder barrier for AI to overcome. Neil McGregor at the University of Manchester talked about this in a slightly different form. Instead of having students write critical reflections, they are now generating those with ChatGPT and asking the students to analyse and critique them – identifying what parts of the AI text they agree with, and where the weaknesses are in the arguments presented.

All of these sessions were recorded and are available on YouTube.


UK HE’s Thoughtful Response to Robot Writing

Screenshot of three Borg drones from Star Trek
Am I implying an equivalence between The Borg and ChatGPT?

There’s no escaping the robots, resistance is futile. ChatGPT has been a gathering storm since the back end of last year, and Sunderland cannot escape the pull. However, we need to learn more about this and related technology in order to be able to provide a thoughtful and measured response to it for our staff and students. To which end, I signed up for this session drawing together senior academics from across UK HE to share thoughts and experience. I have a few more such sessions coming in the next few weeks, so I’ll wait and share my thoughts in a dedicated post when I have the time and space to synthesise them.
