
Month: November 2023

ALT NE User Group: November 2023

My turn to do the hosting honours today, for the first time since before we had that pesky pandemic. My carefully planned agenda went completely out the window during the first item, but everything still managed to run pretty smoothly, and splurging the boss’s cash on the catering after getting the venue for free was a result, as the food was roundly praised.

We began with institutional updates from attendees. As it was the first meeting of the year, I thought a quick round of updates would be good to have. I asked for one slide or five minutes each, got something like 17 slides from one bod, and this half-hour item ran to well over an hour. But it was good: I learned that we are all grappling with the problem of staff digital skills and trying to make improvements there, and heard what the Blackboard and Anthology merger has done for AI in Blackboard. Staff now have access to an AI Design Assistant which will create entire course outlines and structures, which serves as a great starting point. Middlesbrough College are trialling Microsoft’s Copilot tool in Bing and have made it available to all staff and students, while Newcastle have seen a big increase in digital exams, which now make up 40% of their exams.

After the roundup, I had one of my team do a demo of the Clevertouch boards we rolled out last year, then a learning design / content development showcase which provided an opportunity to share examples of best practice. In the afternoon we had a discussion on the role of ALT and where we sit within it, and a tour of Sunderland’s new anatomy suite. We have a new Anatomage table with a number of additional models, including some fine detail scans which have digitised certain features down to 0.1mm.


The End is Not Nigh


Pecuniam populo antepone (‘put money before the people’)

Yesterday I had the dubious pleasure of catching a bit of Rishi Sunak’s chat with Elon Musk about the future of AI, and it was dreadful. Absolutely no criticality whatsoever; Sunak just blindly accepted everything Musk told him. This is something which bothers me so much that over the past few months I sort of accidentally wrote 2,500 words on why the robots will not be taking over anytime soon, but instead of publishing it here I sent it on to the ALTC Blog for consideration, and it was published today – you can read it here. I should think of the ALTC Blog more often and try to get more of my ramblings published there, it’s been a while. They even gave me a badge.

Anyway, the short, short version is that no matter how impressive ChatGPT may seem, it’s not doing anything very new or revolutionary, and that particular kind of artificial intelligence has pretty much gone as far as it can. There is absolutely no path from where we are today to a general artificial intelligence which can rival or surpass human intelligence. None. Whatsoever. The real threat of AI we should be worried about is how it is being used to displace workers in certain industries and make their employment precarious, further increasing the capture of wealth by the top 1%. This is one of the issues which SAG-AFTRA are striking over, specifically the practice of replacing background extras in film and TV with AI-generated images. This is the time to be fighting back and supporting campaigns like this, because our politicians are certainly not up to the challenge, even if it does mean you have to wait an extra few months for Dune: Part 2.

ALTC Blog Contributor Digital Badge


UoS Learning and Teaching Conference 2023

Learning and Teaching Conference, 2023
The big boss up on stage, doing the introductions

Another out-of-this-world conference this year, though alas nobody who was one degree of separation from walking on the moon this time, as our attention instead turned to… yes, you guessed it, generative artificial intelligence.

The morning keynote was given by Thomas Lancaster of Imperial College London who has done a lot of research over the years on contract cheating, and who has now turned his attention to the new AI tools which have appeared over the past year. Interestingly, he commented that essay mill sites are being pushed to students as much as they ever have, but I suspect that these agencies are now themselves using generative AI tools to displace already low paid workers in the developing world who were previously responsible for writing assignments on demand for Western students.

The first breakout session I attended was ‘Ontogeny: Mentoring students to succeed in a world of AI’ by Dr Thomas Butts and Alice Roberts, who discussed how medical students are using GAI and the issues this is causing in terms of accuracy, as these models often present wrong information as truth, which has particularly serious consequences in medicine. There was an interesting observation on culture and social skills: students now seem to prefer going to the internet for help and information rather than simply asking their teachers and peers.

The second session was ‘Enhancing the TNE student experience through international collaborative discussions and networking opportunities’ by Dr Jane Carr-Wilkinson and Dr Helen Driscoll who discussed the Office for Students’ plans to regulate TNE (trans-national education), though no-one quite seems to know how they are going to do this. Including the OfS. This was an interesting discussion which explored the extent of our TNE provision (I don’t think I had appreciated the scale before, over 7,000 students across 20 partners), and the issues involved in ensuring quality across the board.

There was also a student panel discussion in which the panellists were asked about their use of GAI and their understanding of the various issues surrounding plagiarism. They demonstrated quite a robust level of knowledge, with many of them saying that they use ChatGPT as a study assistant to generate ideas, but I did groan to hear one person talk about the "plagiarism score" in Turnitin and how "20% plagiarism is a normal amount" which they don’t worry about until it gets higher. The myths penetrate deep.

The final afternoon keynote was given by Dr Irene Glendinning of Coventry University, who talked about her research on the factors which lead to plagiarism and cheating. This included a dense slide of factors such as having the opportunity, thinking they won’t be detected, etc., but nowhere on it were cultural factors identified, nor the way that higher education in the UK has been marketised in recent years. I’ve certainly come across comments along the lines of: if students are paying £9,000 a year for tuition, why not pay a few hundred more to make assessment easier or guarantee better results? But I’m noticing more and more that people don’t seem willing or able to challenge the underlying political decisions anymore.
