
Sonya's Blog Posts

ALT NE User Group: March 2024

GIF of Johnny 5 reading a book really fast
Now this is the kind of AI I was promised as a kid

The latest ALT North East User Group was hosted at Middlesbrough College and had a very generative AI heavy agenda. But first, Tamara at Middlesbrough presented on ‘EdTech and Pedagogy’, which was quite similar to a TEL and pedagogy session I do on our PG Cert, and I picked up a few points that I can integrate into future presentations, including the argument that it is really Gen Z who are the first true digital natives. This will be useful, as I still use Prensky’s original talk to explore the idea that different generations approach technology differently.

Next we had a round robin session on how we are approaching AI at our respective institutions. I talked about the in-year changes we made to student regulations in response to the release of ChatGPT, something Middlesbrough College have also done, and Northumbria are using a cover sheet template for student assignments on which they declare if and how they have used AI to help with their work. Quite a few of us are pressing forward with Microsoft Copilot now that it is available.

Ross from Durham then presented on an AI chatbot they have created using Cody AI to assist students on a large module where, for various reasons, information is located in different places, including Blackboard and SharePoint. Cody looks interesting. It uses various models under the hood; I’m sure Ross said models from multiple providers were available, but I only saw OpenAI-based ones in their demo. You train the chatbot on your own data, which you upload to Cody, and sharing that data and use of the model back with OpenAI is allegedly opt-in. (Perhaps I’m being overly cynical, but I wouldn’t trust OpenAI on this.)

Finally, after lunch, I presented on something not AI, but EDI – the Equality, Diversity and Inclusion Portal which I have created at Sunderland in partnership with our EDI team in an effort to widen access to our various EDI educational resources.


Exploring Modality in the Context of Blended and Hybrid Education

It’s come to my attention, because I’ve just been writing about this for my CMALT portfolio review, that I don’t always record HeLF webinars on my CPD record, so here I am, doing just that. The ‘Heads of eLearning Working in UK HE’ forum facilitates regular CPD webinars for its members, and this one was exploring different kinds of attendance in a post-pandemic context.

Simon Thomson, of the University of Manchester, began with a discussion on how they have previously used the TPACK Framework in academic development, but found that people often got too caught up in the technology aspect to the exclusion of other factors. Simon has therefore adapted this model, replacing ‘technology’ with ‘modality’ to create the ‘Subject, Pedagogy & Modality’ Framework, or SPAM, instead. The models are captured in the first screenshot taken from the presentation, above. This led into a discussion on the rationale and value of specific modalities, and confusion over terminology. From the second screenshot, the idea of student choice resonated with me. I think it is very much the wrong tack when institutions, or worse, the government, dictate how students should be learning for non-pedagogical reasons. (Like checking visa compliance for example…!)

Sue Buckingham, from Sheffield Hallam, picked up on the confusing terminology in their part of the presentation. How many students would be able to confidently define ‘HyFlex’ learning, for example, or explain the difference between blended, hybrid, and HyFlex? Could you? Could I!? HyFlex is exactly what I’ll be doing when my own module starts up again next week. It’s all been planned and designed to be in person, but I’m also going to stick a laptop at the front, pointed at me and the board, and have a concurrent Teams session running too. Students in 2024 have rich, complex lives: jobs, school runs, caring commitments. So give them a choice, as a reasonable accommodation and an act of compassion.


ALT NE User Group: November 2023

My turn to do the hosting honours today, for the first time since before we had that pesky pandemic. My carefully planned agenda went completely out the window during the first item, but everything still managed to run pretty smoothly, and splurging the boss’s cash on the catering after getting the venue for free was a result, as the food was roundly praised.

We began with institutional updates from attendees. I thought that, as it was the first meeting of the year, a quick round of updates would be good to have. I asked for one slide or five minutes each, got something like 17 slides from one bod, and this half hour item ran to well over an hour. But it was good, and I learned that we are all dealing with the problem of staff digital skills and trying to make improvements there, and heard what the Blackboard and Anthology merger has done for AI in Blackboard. Staff now have access to an AI Design Assistant which will create entire course outlines and structures, which serves as a great starting point. Middlesbrough College are trialling Microsoft’s Copilot tool in Bing, and have made it available to all staff and students, and Newcastle have seen a big increase in digital exams, which are now at 40%.

After the roundup, I had one of my team do a demo of the Clevertouch boards we rolled out last year, then a learning design / content development showcase which provided an opportunity to share examples of best practice. In the afternoon we had a discussion on the role of ALT and where we sit within it, and a tour of Sunderland’s new anatomy suite. We have a new Anatomage table with a number of additional models, including some fine detail scans which have digitised certain features down to 0.1mm.


The End is Not Nigh


Pecuniam populo antepone (‘Put money before the people’)

Yesterday I had the dubious pleasure of catching a bit of Rishi Sunak’s chat with Elon Musk about the future of AI, and it was dreadful. Absolutely no criticality whatsoever; Sunak just blindly accepted everything Musk told him. This is something which bothers me so much that over the past few months I sort of accidentally wrote 2,500 words on why the robots will not be taking over anytime soon, but instead of publishing it here I sent it on to the ALTC Blog for consideration, and it was published today – you can read it here. I should think of the ALTC Blog more often and try to get more of my ramblings published there; it’s been a while. They even gave me a badge.

Anyway, the short, short version is that no matter how impressive ChatGPT may seem, it’s not doing anything very new or revolutionary, and that particular kind of artificial intelligence has pretty much gone as far as it can. There is absolutely no path from where we are today to a general artificial intelligence which can rival or surpass human intelligence. None. Whatsoever. The real threat of AI we should be worried about is how it is being used to displace workers and make work more precarious in certain industries, further increasing the capture of wealth by the top 1%. This is one of the issues which SAG-AFTRA are striking over, specifically the practice of replacing background extras in film and TV with AI generated images. This is the time to be fighting back and supporting campaigns like this, because our politicians are certainly not up to the challenge, even if it does mean you have to wait an extra few months for Dune: Part 2.

ALTC Blog Contributor Digital Badge


UoS Learning and Teaching Conference 2023

Learning and Teaching Conference, 2023
The big boss up on stage, doing the introductions

Another out of this world conference this year, but alas nobody who was one degree of separation from walking on the moon this time, as our attention instead turned to… yes, you guessed it, generative artificial intelligence.

The morning keynote was given by Thomas Lancaster of Imperial College London, who has done a lot of research over the years on contract cheating, and who has now turned his attention to the new AI tools which have appeared over the past year. Interestingly, he commented that essay mill sites are being pushed to students as much as they ever have been, but I suspect that these agencies are now themselves using generative AI tools to displace already low paid workers in the developing world who were previously responsible for writing assignments on demand for Western students.

The first breakout session I attended was ‘Ontogeny: Mentoring students to succeed in a world of AI’ by Dr Thomas Butts and Alice Roberts, who discussed how medical students are using GAI and the issues this is causing in terms of accuracy, as these models often present wrong information as truth, which has particularly serious consequences in medicine. There was an interesting observation on culture and social skills: students now seem to prefer accessing the internet for help and information rather than simply asking their teachers and peers.

The second session was ‘Enhancing the TNE student experience through international collaborative discussions and networking opportunities’ by Dr Jane Carr-Wilkinson and Dr Helen Driscoll who discussed the Office for Students’ plans to regulate TNE (trans-national education), though no-one quite seems to know how they are going to do this. Including the OfS. This was an interesting discussion which explored the extent of our TNE provision (I don’t think I had appreciated the scale before, over 7,000 students across 20 partners), and the issues involved in ensuring quality across the board.

There was also a student panel discussion who were asked about their use of GAI and understanding of the various issues surrounding plagiarism. They demonstrated quite a robust level of knowledge, with many of them saying that they are using ChatGPT as a study assistant to generate ideas, but I did groan to hear one person talk about the "plagiarism score" in Turnitin and how "20% plagiarism is a normal amount", and they don’t worry until it gets higher. The myths penetrate deep.

The final afternoon keynote was given by Dr Irene Glendinning of Coventry University, who talked about her research on the factors which lead to plagiarism and cheating. This included a dense slide on various factors such as having the opportunity, thinking they won’t be detected, etc., but nowhere on there were cultural factors identified, nor the way that higher education in the UK has been marketized over the recent past. I’ve certainly come across comments along the lines of: if students are paying £9,000 a year on tuition, why not just pay a few hundred more to make assessment easier or guarantee better results? But I’m noticing more and more that people don’t seem to be willing or able to challenge the underlying political decisions anymore.


Block Teaching Experience at UoSiL

Scoreboard showing team scores from gamification session
“What’s on the board, Miss Ford?”

Last week I was down at our London Campus for block teaching of my module, Designing Learning and Assessment in Higher Education. Last year our students at London were part of the main cohort, but this year due to numbers we arranged to deliver the PG Cert as a ‘block’ over a couple of weeks. To share the workload, teaching was split between myself and my counterpart on the other module, who travelled down for a few days, and teaching staff at London with relevant experience. It was interesting for me to see different perspectives as a result of London staff picking up some of these sessions. In one of the sessions, ‘Academic Identity and Everyday Writing in the Workplace’, I learned about the concept of teaching journals, a reflective exercise to capture “observations, reflections, and other thoughts about teaching” (Richards and Farrell, 2010). Interestingly, on reflection I find that I have been doing this all along without realising it – for every occurrence of every module I have taught, I have kept a running list of things which I have learned, reflections on things which worked particularly well (or didn’t), and ideas for things to change to improve the module for future cohorts. However, in the spirit of the concept I am attempting to put this into more formal practice with this post.

In addition to discovering this concept and getting to see some of my London colleagues in action, I also learned about Class VR, a virtual reality system they have bought. The headsets are a little basic, but the key concept here is that you have a managed service which can push content to all of the headsets in the class. It’s a great idea, I really liked it. Unfortunately their experience with it has been more miss than hit, with headsets often failing to connect to the server and requiring a reset. Indeed, for our demo all three of the headsets they brought along failed to connect.

Of the sessions I taught myself, ‘Gamification and Game Based Learning’ went well. I’ve run this for a number of years now as part of different modules, and I feel like it’s well polished; we always get good feedback about this one. The screenshot above is the final scoreboard from Keep the Score, one of the supplementary tools I recommend. The session around assessment and modern forms of academic misconduct (inc. generative AI) also ran well and provoked some interesting and lively discussion. Finally, ‘The Biscuit One’. Adapted from the work of Sambell, Brown and Race (2012), this was a highly impactful activity for me when I was a student on the PG Cert in 2017, and one I pushed to include when this module was revamped and I took over as module leader. The central idea is to teach people about creating rubrics and to explore some of the difficulties in marking, such as grade boundaries, using the metaphor of ‘what is a biscuit?’ The academic who used to run this at Sunderland left us last year, so for the past two iterations of the PG Cert I’ve run this session myself. It’s been okay, but I don’t think I do it as well as they used to. On both occasions I feel I’ve been rather unlucky in having groups come up with a definition of a biscuit so broad and encompassing that virtually all of the biscuits provided were included. I haven’t worked out how to deal with that yet, but I’ll need to think of something for February.


AI in Education: Unleashing Creativity and Collaboration

Word cloud showing positivity towards AI
Word cloud showing some positivity towards AI

This was the University of Kent’s third Digitally Enhanced Education webinar on the topic of AI in education, this time with a focus on how AI can be used positively to support creativity and collaboration. An open poll on our thoughts ran throughout the afternoon, and as you can see from the screenshot above the group was far more optimistic about it all than us doom-saying learning technologists at ALT North East. All of the presentations were recorded and are available on their YouTube channel.

A few themes stood out for me. On how GAI is impacting students, Dr Sam Lau of Hong Kong Baptist University talked about a student survey they have done, in which students described how they are starting to use GAI tools as a new, ‘better’ type of search engine and teaching assistant. Cate Bateson, Hannah Blair and Clodagh O’Dowd, students at Queen’s University Belfast, reported that students want clarity and guidance from their institutions on where and how they are allowed to use AI tools. This was echoed by Liss Chard-Hall, a study skills tutor, who said that students have reported to her a new reluctance to use tools which were already using AI before ChatGPT, such as Grammarly, because they aren’t sure if it’s allowed by their institution. One person in the chat even commented that they knew of a student who was scared to use the spelling and grammar checker in Word lest they break new university rules about using AI in assessment.

Also from the chat, there was a discussion about which areas of university life are going to be most disrupted. Literature reviews were a big one: what benefit is there in conducting a complex, time-consuming search of the literature when you can ask an AI model to do it for you? To which end, I learned about a new tool that claims to be able to do just this: Elicit. Another useful discovery from this session was This Person Does Not Exist, which generates photo-realistic images of people.

On impacts in the wider world, Dr Ioannis Glinavos of the University of Westminster made the case that jobs in many areas will become more about verifying information and accuracy, as has happened with translators in the past couple of decades. While it is still a necessary skill, and possible to make a living as a translator, machine translation does the bulk of the work now, with human translators doing post-editing and checking for contextual and cultural relevance.

Finally, Anna Mills from the College of Marin in the US brought ethical questions back to the fore. First, reminding us that these new GAI models are designed to output plausible sounding responses, not truth – they don’t care about truth – hence we all need to be mindful to verify any information sourced from GAI tools. Anna then talked about two facets of “AI colonialism”. First, that GAI models are primarily trained on source texts written in the West (and as we know from stats about who writes Wikipedia articles and Reddit posts for example, we can infer a predominance of certain genders and skin colours too – biases feeding biases…), and second, that content moderation is being outsourced to low paid workers in the developing world, an inconvenient truth that isn’t getting enough attention. Anna’s presentation is available under a CC license and is well worth reading in full.


ALT NE User Group: June 2023

A photo of Durham's lightboard in action
Durham University’s Lightboard, a very cool (but smudgy) piece of tech

Hosted by my lovely colleagues at Durham, this ALT North East meeting began with a discussion of the practice of video assessment. I talked through what we do at Sunderland using Canvas and Panopto, covering our best practice advice and talking through the things which can go wrong. The problem of a VLE having multiple tools for recording / storing video was one such headache shared by all of us, no matter what systems we are using.

We then moved on to a discussion about Turnitin, ChatGPT and AI detection, pretty much a standing item now. Dan shared with us a new tool he has come across, which I’m not going to name or share, which uses AI to autocomplete MCQs. A new front has emerged. Some bravery from Northumbria who must be one of the few HEIs to have opted in to Turnitin’s beta checker, and New College Durham are going all in on the benefits of generative writing to help staff manage their workload by, for example, creating lesson plans for them. A couple of interesting experiments to keep an eye on there.

After lunch we had demonstrations of various tools and toys in Durham’s Digital Playground Lab. This included a Lightboard. This is a really cool and simple piece of tech that lets presenters write on a transparent board between them and the camera using UV pens. I came across this a few years ago, before the pandemic I think, but it’s a strange beast. It’s not a commercial system, but open hardware, so anyone can build one for themselves at little cost. Unfortunately at Sunderland, and I suspect at many bureaucracies, this actually makes it a lot harder to get one than just being able to go to a supplier. So it never happened, but at least today I got to see one live.

Another bespoke system demonstrated was a strip of LED lights around the whiteboard, controlled through a web app, which allows students to discreetly indicate their level of comprehension. We also had a short tour of the Playground’s media recording room, watched some video recordings of content created in VR to, for example, show the interaction of the magnetic fields of objects, and saw demonstrations of Visual PDE, an open source web tool for demonstrating differential equations, and Kaptivo, a system for capturing the content of a whiteboard but not the presenter. You can see the Kaptivo camera in the background of my photo, behind the Lightboard.


Studiosity Partner Forum 2023

Studiosity usage at Sheffield
Photo of Sheffield’s Studiosity Dashboard

Attended the second Studiosity Partner Forum in London today, which had representatives from 14 UK HEIs out of the now 23 who are Studiosity users. The opening keynote was delivered by Rebecca Bunting, Vice Chancellor at the University of Bedfordshire, who talked about issues current in HE, with a focus on access and participation. She made good points on the limitations students face in going to university, which include not only things like entry requirements and location, but also what people are able to study once there and how the cost of living crisis is impacting choice. She talked about how this can impact on student retention, which HEIs are held accountable for, though there are often very good reasons why students may have to leave their study. Finally, she talked about the concept of the “sticky campus” – keeping students on campus – which is something else universities are often held accountable for as a desirable thing, but which doesn’t work for students in their 30s or those who have fulltime jobs, families, etc. Those students want, and need, to come onto campus, do what they need to for their studies, and then get away again as soon as possible. At Bedfordshire, the majority of their students are over 30.

Next was a product update session from Isabelle Bristow, Studiosity’s Managing Director for the UK and Europe. The peer support service which was in early development last year will be available in July as ‘Student Connect’, in which third year students can mentor and guide first year students after training from Studiosity and the university. These mentors are paid at a rate set by the university, and all chat and calls are managed through Studiosity to ensure privacy and confidentiality. Unfortunately this isn’t something we will be able to explore at Sunderland, as we are continuing to keep Studiosity focused on IFY students and new undergraduates. Isabelle also talked about a new Writing Feedback feature which will help students to identify where they have used higher order thinking skills – at least in part designed to counter and mitigate the use of generative AI writing.

Simon Reade and Matthew Hare from Sheffield Hallam University then presented on their data dashboard, which uses data from the Studiosity API and other sources, and outputs to Tableau. One such chart, showing usage changes over a number of years, is shown (badly) in the photo above. This was a very interesting session for me, as we have just done this ourselves using Power BI. Some of their findings and experience felt very familiar – high usage in Health subjects, low in their Business, Technology and Engineering College (strange bedfellows, but our Business folks can also be hard to engage with new technology and interventions). Another observation they made was that Studiosity seems to reach more demographic groups than traditionally access support services, which is a good thing.

After lunch, Dr Andy Gould from SOAS talked about how they are responding to AI, which led into an open discussion. Andy referenced Jisc, who in their response said that a crisis could be used as a driver for change, similar to what I and others said about the pandemic response. The problem is the sector seems to be in perma-crisis. They have co-created a student guide containing a list of ‘dos’ and ‘don’ts’ as best practice. Andy also talked about the idea of academics using ChatGPT to write student feedback, something students were very against, unsurprisingly, and finally noted that some students have reported using a paraphrasing service I won’t name to try and ‘launder’ AI produced writing.

Other random points and observations made throughout the day in discussions with colleagues included a note from one institution that has seen Studiosity seemingly widen their participation gap, possibly as a result of higher achieving students engaging with the service to a greater extent. Much of our discussion was on students wanting a personal connection when it comes to seeking support, something Studiosity delivers well, and which may indicate strong use of the new Student Connect service when it goes live. Referencing was noted as by far the most in-demand area for support, and again something that may draw students to peer support. Finally, there was a comment about how in some subject areas, such as engineering, students may not get any conventional written assignment until their 3rd year, with 1st and 2nd year assignments focusing on group work. This is an important point for me, and Sunderland, to be aware of, as it may help to explain weak uptake in certain areas.


Using AI in Education: A Student Voice

Screenshot showing different results for weather in Egypt on Google and ChatGPT
Screenshot of a search query in Google and ChatGPT

A second session from the University of Kent on AI / ChatGPT, this time student-led. It was good to hear the student voice on these developments, and I found it reassuring that they are identifying the same issues and raising the same concerns as staff. More than one of the student presentations talked about how ChatGPT is already displacing Google and other search engines as the first place they go to find answers. This is illustrated in the screenshot above, which shows the difference in results when searching for the temperature in Egypt: Google on the left, where you get a list of links to follow through, and ChatGPT on the right, which provides a far more detailed answer in written form. The problem, as identified in one of the presentations but not the other, is that there is no way to verify the data which ChatGPT is presenting as truth. With the Google results you can evaluate the sources and verify against others; ChatGPT is a black box.

There was another good presentation from a student at Northumbria who has done some early research with students who have used ChatGPT, to find out why and what they are using it for. The results showed that they are mostly using it to check their own knowledge (problematic if you can’t trust ChatGPT to give you true answers) and to generate ideas. They are also using it outside of education to, for example, help write CVs and job applications. This makes me feel like we are always going to be on the defensive – reactive, as one student said. While the education sector tries to grapple with the technology, debating banning it on one hand and thoughtful, ethical use – while protecting academic integrity – on the other, other areas of society plough on regardless.

In another presentation a student talked about experimenting with ChatGPT to produce a response to one of their assignments and, echoing Margaret Bearman’s point from the teacher-led session last month, found that the result lacked critical analysis and believed it would have been a clear fail.

I was pleased to note that ethical issues were being raised by students, though largely in the context of equitable access. With GPT 4 and priority access going behind a paywall, the students who can afford to use it will, and those who can’t will find they have another new and innovative way of being disadvantaged. How long before we see the first university or college purchasing a license for all of their staff and students?

Once again all of the presentations were recorded and are available as a YouTube playlist.
