Press "Enter" to skip to content

Tag: Studio

Navigating the Future: Innovation and Integrity in the Era of AI

I was at St James’ Park today. I believe the local football fans are rather fond of the place, but I was there for Turnitin’s first roundtable discussion since before the pandemic. In an attempt to start this post with something that is ‘not AI’: we had a look at Turnitin’s product roadmap, which is all about the new Feedback Studio. The new version has been redesigned from the ground up to be screen reader accessible, a common complaint about the old version, and to be fully responsive, rather than Turnitin developing mobile apps for the platform. The rubric manager has also been rewritten, improving how rubrics are managed and archived and adding the ability to import rubrics from common file formats like Excel, rather than the proprietary format they used previously. It goes live on July 15th, but institutions can opt out, and a long period of transition is expected. Alas, we are switching to the Canvas framework integration, so our staff won’t benefit from any of this.

And that’s about it for ‘not AI’. In the opening remarks Turnitin presented the outcomes of a global staff and student survey on perceptions of generative artificial intelligence. Overall, 78% of respondents were positive about the potential of AI, while at the same time 95% believed that AI was being misused. Among students only, 59% were concerned that an over-reliance on AI would result in reduced critical thinking skills (I have thoughts on this that I’ll circle back to later). In the slightly blurry photo above (I was sat at the back) you can see the survey results broken down by region, showing that in the UK and Ireland we are the least optimistic about AI having a positive impact on education, at only 65%, while India has the most positive outlook at 93%. All regions report being overwhelmed by the availability and volume of AI, which is unsurprising when every application and website is adding spurious AI tools to its services in a desperate attempt to be The One that sticks and ends up making a profit. (Side note to remind everyone that no-one is making any money out of actual AI systems in the current boom: these large language models are horrifically expensive to train and run, and the whole thing is being sustained by investment capital in a huge gamble on future returns. What could possibly go wrong!?)

The keynote address was delivered by Stephen Gow, Leverhulme Research Fellow at Edinburgh Napier University, who discussed the StudentXGenAI research project and the ELM tool at the University of Edinburgh, an institutionally provided front-end for accessing various language models which has safeguards built in to prevent misuse. Stephen reported on the mixed success of this. While it seems like a good idea, and the kind of thing I believe universities should be providing to ensure equitable access for all students, uptake has been poor, and students report that they don’t like using the tool because they feel it’s ‘spying on them’ and would rather use AI models directly – highlighting issues of trust and autonomy. Stephen pointed us to C. Thi Nguyen’s paper ‘Trust as an Unquestioning Attitude’ for a more detailed discussion of trust as it pertains to complex IT systems, and of how trust should be viewed not as a binary, but as a delicate and negotiated balance.

During our breakout roundtable discussions, my group discussed how AI is a divisive issue: people either love it or hate it, with few in the middle ground. There is some correlation along generational lines here, with younger staff and students being more positive, but it isn’t an exact mapping. One of my table colleagues reported having an intern, a young, recent graduate, who refuses to use any Gen AI systems on ethical grounds relating to the environmental cost, while another colleague won’t use it because they fear offloading their thinking skills to it. That was the second time such a sentiment had been expressed today, and it made me think of the parallels with the damage that social media has done to attention spans. While that concept took a long time to enter the public consciousness (and we are barely starting to deal with the ramifications), there seem to be more voices raising the problem of AI’s impact on cognitive ability, and it’s happening sooner in the cycle, which gives me some limited optimism. Another colleague at my table also introduced me to the concept of ‘AI shaming’, from a paper by Louie Giray.

Finally, we were given a hands-on experience of Clarity, Turnitin’s new product which provides students with a web interface for written assessments with a built-in AI chat assistant. The idea is to provide students with an AI system that they can use safely, and which gives confidence to both them and their tutors that there has been no abuse of Gen AI in writing the essay. I like the idea of this; I have advocated for Sunderland to provide clear guidance to students on what they can and can’t use, and argued that we should be providing something legitimate for students which would have safeguards of some kind to prevent misuse. Why, therefore, when presented with just such a solution, was I so sceptical and disappointed, unable to see anything but its flaws? Maybe the idea just doesn’t work in practice.

I was hoping to see and learn more about Clarity today, so I was very pleased that we were given this opportunity. Of course I immediately started to try and break it. I went straight in with the strawberry test, but the system just kept telling me it wouldn’t help with spelling, and directed me to write something addressing the essay question instead. I did get it to break though. First, by inserting the word into my essay and asking it to check my spelling and grammar; then, once I had something written in the input window, I found that it would answer the question directly, reporting that ‘strawberries’ is actually spelled with one r and two b’s. Fail. When I overheard a colleague at another table reporting that it seemed to be directing them to use US English spelling, I decided to experiment by translating my Copilot-produced ‘essay’ into Spanish with Google Translate. Clarity then informed me that the assignment required the essay to be in English, a straight-up hallucination as there was no such instruction. What Turnitin did tell us is that the system has been built on US English and can’t yet properly handle other variants and languages. They were also quite transparent about the underlying technology, which is based on Anthropic’s Claude model. I appreciated this, as I have found other companies offering AI tools to be evasive, insisting that they have developed their own models based on their own training data only, which I’m highly sceptical about given the resource requirements.

Fun as it may be to try and break AI models with spelling challenges, it’s not what they are built for, and there is an old-fashioned spell checker built into the text entry box. However, that doesn’t mean that, when presented with an AI chatbot in a setting like this, students aren’t going to ask it questions about spelling and grammar. This seems like a perfectly legitimate use case, and I suspect the reason Turnitin have installed a ‘guard rail’ here is that they are well aware that large language models are no good for this kind of question, just as they are no good for mathematical operations. Or, for that matter, providing straight facts. The trend of people using these models as though they were search engines should frighten everyone. Our table chuckled when one of us reported that ChatGPT was confidently telling them that Nigel Farage was the Prime Minister (did I say chuckle? I meant shudder), but more subtle errors can be far harder to spot, and could have terrible ramifications in the fractured, post-truth world we’ve built. I’m sure I’ve said something like this before on here, and I probably will again, but calling these systems ‘intelligent’ has been a huge mistake. There is no intelligence to be found here. There is no understanding. Only very sophisticated prediction systems for what comes next after a given input.

I’m most doubtful about the assumption that students will want to use Clarity in the first place. Am I giving myself away as old when I say that I would never even contemplate writing something as important as a multi-thousand-word essay in an online web interface that requires a stable, constant internet connection? Clarity has no ability for students to upload their written work, and though you can copy and paste text into it, doing so is immediately flagged by Clarity as an issue for investigation. There’s no versioning, no ability to export and save offline, limited formatting options and fonts, no ability to use plugins for reference management, etc. I also can’t imagine any circumstances in which I would recommend students use Clarity. It is not an infrequent problem that academics come to us reporting that they have spent hours writing student feedback in Turnitin’s Feedback Studio, only to find out later that their comments haven’t saved properly and just aren’t there. It is such a big problem that we routinely train our staff to write all of their feedback offline first, and then copy and paste it into Feedback Studio. Colleagues in the room challenged Turnitin about this, and the response was that in their evaluation students reported being very happy with the system.

Nevertheless, Turnitin believe that some kind of process validation is going to be necessary to ensure the academic integrity of written work going forwards, and I do think they have a point. But the only way I can see Clarity, or something like it, working is if academics mandate its use for assessment, with students having to do everything in the browser; in which case, unless they are teaching a module on how to alienate your students and make them hate you, it isn’t going to go down well. As much as Turnitin would like it to be so, I don’t think there’s a technological solution to this problem. I increasingly think that in order to validate student knowledge and understanding we are going to have to use some level of dialogic assessment, which doesn’t scale in the highly marketised higher education system we now find ourselves in.

AI Disclaimer: There is no ethical use of generative artificial intelligence. The environmental cost is devastating and the technology is built on plagiarised content and stolen art, for the purpose of deskilling, disempowering and replacing the work of real people.

Innovation and Integrity in the Age of AI

I don’t usually attend these Turnitin product updates, not out of a lack of interest, but because they lie more with the other half of the team here at Sunderland, so I leave them to it and to cascade what’s important to the rest of us when required. This one piqued my interest though, after seeing a preview of the new user interface at NELE last week. You can see some of the planned changes to the Feedback Studio and the Similarity Report view above. I asked a question about the lack of audio feedback following NELE, and was told that this, along with new video feedback capabilities, is on the roadmap and coming soon.

I was also interested in their new Clarity tool, which will allow students to submit or write their work through a web interface and get immediate feedback, with help on how to improve their writing from Turnitin’s AI chatbot. It’s very similar to how Studiosity’s Writing Feedback+ service works, so it’s going to be very interesting for me to see how it develops.

AI Disclaimer: There is no ethical use of generative artificial intelligence. The environmental cost is devastating and the technology is built on plagiarised content and stolen art, for the purpose of deskilling, disempowering and replacing the work of real people.

Turnitin UK User Summit


Attended the afternoon sessions of Turnitin’s UK user summit, which focused on customer experience, with talks from colleagues at the University of Edinburgh, the University of East London, Newcastle University and the University of Huddersfield. It’s always cathartic to hear colleagues sharing tales of woe and horror that are so familiar from your own work, like the academics who insist on treating the originality score as sacrosanct when making a plagiarism decision, but more productively there were some really good ideas and pieces of best practice shared. One colleague was using Blackboard’s adaptive release function to hide the Turnitin assignment submission link until students had completed a ‘quiz’ which simply made them acknowledge in writing that the work they were about to submit was all their own. A couple of people presented their research findings on what students wanted from feedback, such as in the attached photo, which shows a clear preference for electronic feedback. Someone made a product development suggestion: splitting the release of the grade and feedback in Turnitin so that students have to engage with their feedback before they get their grade. But I think my personal highlight of the day was the very diplomatic description of difficult customers as those who have ‘higher than average expectations’.

Though I missed out on the morning session due to another commitment, I was able to get the gist from networking with colleagues in between sessions. Improvements to the Feedback Studio were covered, including the ability to embed links, multiple file upload, and a new user portal which will show the most recent cases raised by people at your institution; and then there was the development I found most interesting: the ability to identify ghost-written assignments. This is still quite a way from being ready, but it’s an increasing problem and one Turnitin has in their sights. They couldn’t reveal too much about how this will work, for obvious reasons, but the gist is that they will attempt to build up a profile of each individual’s writing style so that they can flag papers which seem to be written differently.

The Twitter conversation from the summit is available from the #TurnitinUKSummit hashtag, where you will see I won the Top Tweet! Yay me, but alas there were no prizes.


EQUELLA 6.4 Pre-Release Webinar

Attended a webinar which demonstrated new and improved features of EQUELLA 6.4 and provisional plans for the next major release, version 7. It was useful as we are a few versions behind. Notable new features include the gallery view for items tagged as images or videos; additional options for administrators to control the number of attachments allowed per item and, in the case of images, to restrict their size (dimensions, not file size); new MIME type restrictions; and myriad improvements to the way search, sorting and filtering work.

Also demonstrated was the new ‘Push to LMS’ feature and improvements to LTI integrations, making it easier to configure EQUELLA integration with Blackboard and Moodle. When we asked if these features were going to be developed for LearningStudio we were told that there were no plans to do so due to lack of demand. I find it more than a touch worrying that one part of Pearson is providing better support for Pearson’s competitors than for its own LMS platform. What are we to conclude about Pearson’s commitment to LearningStudio from this?


Introduction to SunSpace Storyline


Created a video-based presentation for new students which runs through all of the key features of SunSpace and includes a short surprise MCQ at the end to help reinforce their learning. Initially this was at the request of an academic who wanted something like this for some non-standard modules of his that are starting now, but it has wider potential, so I made it generic to all SunSpace modules and then integrated it into the new module template we’ve been building for academic year 2015/16. It’s probably not complete yet (a voiceover on each slide would be nice, for example), but it’s now in a state that’s good to go!

http://solar.sunderland.ac.uk/solar/file/29f3d246-59aa-4f60-9543-6c8577171de1/1/story.html


Course Dashboard Demonstration

This morning I watched the recording of the Pearson webinar which demonstrated the new Course Dashboard, the imminent replacement for the Social Learning Module Home page. I would like to say that I was excited and impressed, but the truth is that it has filled me with trepidation. I understand that the Social Learning Module Home (SLMH from now on, as that is far too long a name for anything) was problematic when it was first rolled out, but I fortunately missed that, and most courses at Sunderland are, in my anecdotal experience, using SLMH in preference to the classic course home page (which looks very dated and basic now); it works well and looks reasonably nice.

This new version, however, seems like a step back. It’s blocky; it’s not a responsive design, so who knows how it’s going to look across resolutions; and the widgets seem to be using iframes, something the SLMH uses in places with disastrous results on mobile devices, while the person who gave the webinar could not tell us the results of their mobile testing. A big selling point of the new dashboard is the ability to customise it, but I learned from the webinar today that this is rather disingenuous, as it can only be customised from the Admin Pages for the entire institution (or possibly at node / term level, but that’s not much better), so there is no user customisation, which is what I expected from their marketing and what has been available in Blackboard for many years now. The Course Checklist feature is also nowhere to be found. This is a really nice little tool which lets students see the whole schedule of the course at a glance, but it is only available on the classic course home. When I queried why it was not available in SLMH I was told that it was a bug and to wait for the new dashboard; today I found out that this is not the case, that the feature is gone, and that the best I can hope for is that similar functionality might be implemented in a calendar view at some unknown time in the future. The interface of the new dashboard is not customisable either: the colour scheme (blue, white and grey), like that of the new Threaded Discussion tool, cannot be changed to match Sunderland’s branding, and other attributes like the font and font-weight are also fixed. Very disappointing.

Of course, being a software-as-a-service solution, we will have no choice but to implement the new dashboard at some point, and probably sooner rather than later in spite of my reservations, as there is no development being done on either of the older course homes, which means no bug fixes. I can only hope that many of these issues are ironed out before general release; my thoughts were echoed by participants in the chat, many of whom are, or will be, piloting the new dashboard.

The attached screenshots show the classic course home, basic and dated, but with the oh-so-useful Module Checklist; and the much better SLMH, which includes the Chat and recent activity widget and an Upcoming widget. I haven’t included a screenshot of the new version as the only place I have seen it to date is in these private webinars.


Pearson PAB (Product Advisory Board) Webinar

Watched a recording of Pearson’s PAB webinar, which was held in lieu of the conference in Denver, where they demonstrated many new features which have either been made live recently or are due for release over the coming year. Highlights were the new-look Threaded Discussions tool which is being rolled out piecemeal now, the new course dashboard which is going to replace Social Learning Module Home (here’s hoping for a catchier name this time round), the Android app, and the long-overdue notifications centre – something our students are clamouring for. Also tucked away, but of particular interest to me, is the new ‘External Tool’ menu item type, which should make it easier for academics to deploy the new version of Turnitin we have been working on; this uses the standard LTI integration from Turnitin instead of the Dropbox integration, which Pearson developed but which doesn’t work terribly well.


Google Analytics on SunSpace


One of my little areas of expertise at Northumbria was providing analytics data and reports on Blackboard usage, and it’s something which has been missing here at Sunderland, for the VLE at least. Unfortunately, the way LearningStudio works means it has not been possible to implement Google Analytics tracking code system-wide, but I’m not easily deterred! I found a way to embed the code into an announcement on the landing page so that we can at least get client-side data: visitor numbers, technology, location and mobile use, all of which is useful in informing development.
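For anyone wanting to do something similar, the embed is essentially just Google’s standard analytics.js bootstrap snippet pasted into the announcement’s HTML. A minimal sketch, with a placeholder property ID; whether your announcement editor will accept raw script tags is another matter:

<script>
  // Google's standard analytics.js loader snippet
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXXXXXX-1', 'auto'); // placeholder property ID; use your own
  ga('send', 'pageview'); // records a page view each time the landing page loads
</script>

Because the snippet only runs when the announcement is rendered in the browser, it captures client-side data for that page alone, which is why this approach can’t substitute for proper system-wide tracking.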

If the report looks a little familiar, well, that’s just because great cooks bake nice cakes no matter what kitchen they’re in!


WaLTS Web Pages Revamp

The team’s web pages on the University’s website were pretty out of date when I started; in fact I had first noticed this a few weeks beforehand, when doing some research on the department and finding some broken links. Once in post I soon learned that the pages were in a much worse state, with many having gone without revision for over two years. It turns out that it was my predecessor who primarily took responsibility for these, so I was happy to adopt that responsibility myself.

I thought it would be a nice, quick and easy job. I was mistaken. Once I delved into them I found that there were many more pages than I had realised, going as deep as four nested levels in some places. I have flattened this structure right out, reducing it to two levels at most and standardising the template; entirely removed around half of the pages – most of which related to the 2012 project to transition the VLE from WebCT to LearningStudio – and re-written many others, tidying things up and making the language more professional and affirmative; and of course fixed all broken links and email addresses.

Finally, I created a banner in the style of other departments within SLS and added a sidebar on the right for a Twitter widget and a feedback button, replacing the old link on the left, which was out of place as it linked to another system.

http://sls.sunderland.ac.uk/walts/


WaLTS on Twitter

One pithy definition of madness is that it is the act of repeating the same action over and over and expecting different results. So it was that back in April I was asked to create a Twitter account for the team which, having been created, was promptly ignored and left to languish. To this day, all six glorious tweets from that account were made by your humble author. Today, or rather spread over the past couple of days as a ‘bitty’ job, I have resurrected the old ‘LDS’* Twitter account, renaming and revamping it and bringing it back into use.

So, am I mad? My intention behind this is to have a more informal avenue of communication between the team and our customers, but to be a success it will require active engagement and relevant content. UoS_WaLTS has one thing going for it that NorthumbriaTEL didn’t: me, enthusiastic and not going anywhere anytime soon this time.

Another little job I’ve been doing, for similar reasons of engagement, is improving the announcements page on SunSpace, which was just dull black-on-white text. I’ve been trying to make it look nicer and to keep the content current, so that it isn’t reduced to annoying wallpaper that people scroll past to get to their courses. To that end, I have also embedded a widget for our Twitter feed into the announcements for all users section, as sketched below.
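Embedding the timeline is just a case of dropping Twitter’s standard embed code into the announcement’s HTML. A minimal sketch, assuming the announcement editor accepts raw HTML (the data-height attribute is optional sizing):

<a class="twitter-timeline" href="https://twitter.com/UoS_WaLTS" data-height="500">
  Tweets by @UoS_WaLTS <!-- fallback link, shown until widgets.js replaces it with the timeline -->
</a>
<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>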

* Learning Development Services, the old name for my team before merging with Web Services.

https://twitter.com/UoS_WaLTS
