Press "Enter" to skip to content

Tag: Language

Teaching with ChatGPT: Examples of Practice

Screenshot from one of the presentations outlining what ChatGPT is and is not: it is a large language model; it is not human, not sentient, and not reliable!

This session on the robot uprising was facilitated by the University of Kent, and in a welcome contrast to some of the other AI sessions I have been to recently, it was much more positive, focusing on early examples of using ChatGPT to enhance and support teaching and the student experience.

Among the highlights, Maha Bali from the American University in Cairo argued that we need cultural transparency around this technology, as people are going to use it regardless of whatever regulations are put in place. This was echoed by other presenters, who noted that after graduation, when students enter industry, they will use, and be expected to use, any and all relevant technologies. Someone in the chat also noted that if you ban AI writing at university, then one outcome is that students will only use it for cheating. So good luck, Cambridge. On transparent, ethical use, Laura Dumin from the University of Central Oklahoma talked about a new process they have implemented which asks students to declare whether they have used AI tools to help with their writing and to highlight which text has been AI generated, so academics can clearly see this.

Some presenters had suggestions for re-focusing assessments around what ChatGPT can’t do but humans can. Some of these, I feel, are short-term solutions. One person, for example, talked about how ChatGPT is generally better at shorter pieces of writing, so they have changed their assessments from three 800-word assessments throughout the year to one 2,000-word assessment. Debbie Kemp at Kent suggested asking students to include infographics. I think these suggestions are going to work for now, but not in the long term. And the long term here isn’t even very long, given the pace of technological development. By the time you could get changes to assessment through a programme board and in place for students, the technology may well have rendered your changes moot.

I think a better idea is to include more critical reflection from students. Margaret Bearman from Deakin University in Australia made the point that AI is not good at providing complex, context-sensitive value judgements, and that, I think, is going to be a harder barrier for AI to overcome. Neil McGregor at the University of Manchester talked about this in a slightly different form. Instead of having students write critical reflections, they are now generating those with ChatGPT and asking the students to analyse and critique them – identifying which parts of the AI text they agree with, and where the weaknesses in the arguments presented lie.

All of these sessions were recorded and are available on YouTube.


Introduction to British Sign Language


Ah, some proper CPD! An intense three-hour introduction to deaf awareness and British Sign Language taught by Robin Herdman with the aid of two interpreters, and a welcome change from the usual half-hour webinar with a salesperson, which I seem to have done a lot of lately.

The awareness aspect alone was packed. Important snippets I hastily noted: BSL is the fourth officially recognised language in the UK, used by 125,000 adults, though there are 11 million deaf or hard of hearing people in the country. It has a different grammar from English and differs significantly from American Sign Language, which is partially derived from French Sign Language. BSL has regional dialects, particularly for numbers and colours, and evidence of the use of sign language in the UK can be traced as far back as 650 CE. Finally, deaf teachers and interpreters are in increasingly short supply, which has knock-on effects on deaf people’s access to education, health and social care.

From the practical side of the session I learned that lip reading is very ineffective, with only around a 30% comprehension rate, the remaining 70% being guesswork from context; BSL is therefore much preferred. I learned the importance of facial expressions and non-manual features, a number of phrases for basic communication, and, in theory, the alphabet. There are some nice hooks in the alphabet which give me hope that I’ll remember most of it a few months down the line, such as the vowels corresponding to each finger – ‘a’ being your thumb and ‘u’ your pinky – and the ‘s’, ‘n’, and ‘y’ from my name.
