It’s official. Contracts have been signed, Canvas integration has been tested, and now we’ve had our first batch of training for Panopto, the University’s new lecture capture system, which we’re branding internally as reVIEW.
This session covered how the system will be accessed and used by Viewers (students, essentially) and Creators (lecturers). There wasn’t a lot to cover for Viewers. We’re planning on having everything integrated through Canvas, so it’s just a case of navigating to the relevant item or accessing the reVIEW tool in the menu. Playback speed can be varied between half and double speed, which is nice, caption styles can be customised, and the search functionality is impressive – it doesn’t just work on text, but also on spoken terms thanks to a machine speech-to-text engine.
It is possible for students to be given access to create their own videos through ‘Assignment’ folders, which module tutors can configure for them, and simple quizzes can be added at any point in a video to check comprehension, with results fed back into the Canvas Gradebook.
There was much more content for Creators, as would be expected, covering recording and editing. Recordings can combine multiple sources: any webcams and microphones connected to the computer (more than one of each, if needed), PowerPoint presentations, and your entire computer screen. Recordings are uploaded to Panopto’s servers progressively, which will help in a lecture theatre environment where people need to get out quickly for the next class. Editing and post-production is done through the web using HTML5, no plug-ins required, and it is possible to edit individual sources in isolation as well as the entire video.
Closed captions can be added automatically based on the speech-to-text engine which Panopto is using to drive the in-video search, but it is also possible for Creators to request a variety of human transcription services which are contracted for separately. We’ll soon discover how well it can handle academic language and the interesting range of accents we have in this neck of the woods.
Attended an internal training session led by our HEA panel leader covering what is involved in mentoring people through our scheme and assessing claims. This was particularly focused on the new dialogic route for accreditation which we brought in last year. In a few weeks’ time I’ll be able to shadow one of the candidates on our next dialogic panel.
Attended the ALT North East user group today at Durham Castle to network and share practice with learning technologists from universities and colleges across the region.
A representative from Jisc was there to give us an update on their Learning Analytics project, which looks more impressive every time I see it. This was followed by a demonstration and talk about Special iApps, which have been created to help children with special educational needs. Then we had two sessions on the use and value of Microsoft Teams in education, one from a Microsoft representative and one from colleagues at Teesside who have been using it in the wild with good results. Finally there was a demonstration of the survey tool BluePulse via webinar.
Second day of Canvas fun, and my first CanvasCon. Alas, not the global one they had in Colorado this year, but it was still huge: 650 attendees from 300 institutions across Europe, up from just 35 institutions four years ago.
The day began with a corporate keynote where they talked about the success of Canvas and what new things are coming – a Canvas Commons preview tool, yay! There was an overriding theme of small, incremental changes from the ground up, mirroring the agile development method behind the Canvas product itself.
The afternoon keynote by Alex Beard focused on pedagogy rather than technology, as he talked about innovative ways in which students learn across the globe, such as the MIT Media Lab where students are given a huge amount of freedom to construct their own learning. One of the freebies Instructure were giving away was a copy of his new book, Natural Born Learners, which I’ve already skimmed.
In between the keynotes was a diverse range of breakout sessions, and the ones I attended were a mixed bag: some offered interesting insights into how Canvas is being deployed and the lessons institutions have learned, while others I didn’t get a lot from.
And then of course there was the networking, with time available between sessions, at lunch, and at a cocktail reception at the end of the day to meet people and chat about their experiences.
Now how do I get the boss to agree to send me to the next one in Long Beach in July?
Isn’t it nice that the University are letting me get out and about again? In London for two days for the UK HE User Group today and CanvasCon Europe tomorrow.
Today was really useful. Around 40 of us from all over the country gathered at St George’s Medical School in Tooting to share our experience as Canvas users. In the morning we had a demonstration of anonymous and moderated marking from colleagues who are currently piloting it with positive results, though they noted that they have found a limited number of ways to circumvent the anonymisation. However, as these are all quite obscure and difficult, they remain confident in the tool and are rolling it out further. It will be interesting to see how Instructure’s offering here compares with Turnitin’s pending anonymous and moderated marking tool.
Also in the morning we had group discussions on different ways of using Canvas for assessment and feedback, sharing ideas and best practice.
In the afternoon we were joined by representatives from Instructure, who gave us updates on their developments and allowed us to grill them quite freely. This is always an excellent opportunity to use our collective influence to nudge Canvas in a direction which helps address the needs of the UK sector. The anonymous and moderated marking tool, for example, was proposed by, and has been driven by, this group.
Instructure provided us with a progress report on our Top 10 priority development list from last year, as shown in the photo above, which shows ‘Non-Scoring Rubrics’ and ‘Analytics to include Mobile App Usage’ as complete, with most of the others in the design or development stages. Finally, we voted on the new Top 10 list for 2018-19. From a long list of suggestions collated prior to the User Group, each person was allowed to vote for three issues; I voted for QuickMark-style functionality in SpeedGrader, improved Group functionality, and the ability to set Notifications by course – all things I’m being pressed for by our academic community at Sunderland.
For the past couple of months I’ve been experimenting with Mastodon, the social media service, not the metal band, although they are pretty good too – click on ‘play’ above and enjoy while you read this. For the non-technical, Mastodon looks and works kind of like Twitter, but without the Nazis, so it certainly doesn’t feel like Twitter. For the technical, it is an open-source federated micro-blogging social network, part of the fediverse.
To unpack that, open-source means that the Mastodon software is made freely available, anyone can read and contribute to it, and anyone can set up their own Mastodon server, or instance as they are known. That’s a very good thing. It removes the predatory capitalism of commercial social media companies which use extremely sophisticated psychological tricks to keep you addicted and get you to disclose as much personal information as possible to sell you adverts.
Federated means that Mastodon is decentralised: it’s not ‘one thing’, one server with one owner; there are thousands of servers running Mastodon, and everyone, no matter which instance they have joined, can talk to everyone else (with some limitations – instance administrators can completely block other instances, but this tends to be used only for instances that post illegal content). Each Mastodon instance has its own administrator, moderation team, and code of conduct, so there’s no faceless central point of control and authority haphazardly applying arcane rules.
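For the technically curious, one practical consequence of federation is that every Mastodon instance exposes the same REST API, so the same client code works against any of them. As a minimal sketch (using only the Python standard library, and assuming the public timeline endpoint `/api/v1/timelines/public` is open to unauthenticated requests, as it was on most instances at the time of writing):

```python
from urllib.parse import urlencode

def public_timeline_url(instance, limit=20, local=False):
    """Build the URL for an instance's public timeline.

    Because Mastodon is federated, the same endpoint exists on every
    instance; only the hostname changes.
    """
    params = urlencode({"limit": limit, "local": str(local).lower()})
    return f"https://{instance}/api/v1/timelines/public?{params}"

# The same request shape works on any instance in the fediverse:
print(public_timeline_url("mastodon.social"))
print(public_timeline_url("scholar.social", limit=5, local=True))
```

Fetching either URL (with `urllib.request` or similar) returns a JSON list of recent public posts; the point is simply that nothing in the client needs to know which instance it is talking to.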
So that’s what it is, but what’s it like? The TL;DR version – it’s good! I like it, and I think it has a lot of potential.
Having been built from the ground up to be open, free, and distributed, the culture and ethos is very different – better, more courteous – than what you get on Twitter and Facebook. A lot of lessons have been learned, and this shows in features such as the easy way you can control who sees something you post on Mastodon (a Toot), and the content warning feature, which masks a post until people click on the content warning description. Even just having these features makes you think about whether something you are about to post could inadvertently cause someone distress. Trolls and unpleasant people do still exist on Mastodon, of course, but there is something about the design and the culture which I think nudges people towards behaving better, and the bad behaviour I have seen on Mastodon has been orders of magnitude milder than what you get on Twitter these days.
It’s not perfect of course, nothing is. To an extent, it feels like a Twitter refuge at the moment, and it needs to be more than that. If you use Mastodon on a desktop browser it looks a lot like TweetDeck with a four column layout, and mobile apps (I’ve been using Amaroq) look a lot like Twitter clones too. To be successful it needs to be more than the anti-Twitter; it needs a strong, positive identity of its own, and I’m not sure it quite has that yet. That said, I joined Twitter early when people flocked to it because it wasn’t Facebook, so if that’s what it takes to make Mastodon take off, so be it. The other thing it needs is network effect, people. There are around 2 million Mastodon users at the moment, a drop in the ocean, and when I first joined I found only three people I actually knew in meat-space. In comparison, Facebook has over 2 billion active users, and there are 330 million on Twitter.
As Mastodon is still relatively new (launched in 2016), some common features and functionality you might expect are still missing. A particular grievance of mine is the lack of an easy way to search for and embed animated GIFs, though I have been informed by some of my friends that this is a great relief to them. I’m trying hard not to be wounded by this. And I would love a better way to back up your account and switch between instances more freely. At present you can only export and import the list of people you follow, block or mute, and while you can export your content you can’t import it back into another instance.
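That export, at least, is easy to work with: the follow list comes out as a CSV. As a rough sketch of what moving between instances looks like in practice – assuming a simple export format with one `user@instance` address in the first column of each row, since the exact columns have varied between Mastodon versions – you could merge the follow lists from an old and a new account like this:

```python
import csv
import io

def merge_follow_exports(*csv_texts):
    """Merge several Mastodon follow-list exports into one
    de-duplicated list, preserving first-seen order.

    Assumes each export is a CSV with a 'user@instance' address in the
    first column of each row; treat this as a sketch, not a faithful
    parser of every Mastodon version's export format.
    """
    seen = set()
    merged = []
    for text in csv_texts:
        for row in csv.reader(io.StringIO(text)):
            if not row:
                continue
            address = row[0].strip()
            if address and address not in seen:
                seen.add(address)
                merged.append(address)
    return merged

old_account = "alice@scholar.social\nbob@mastodon.social\n"
new_account = "bob@mastodon.social\ncarol@humanities.one\n"
print(merge_follow_exports(old_account, new_account))
```

The merged list can then be re-imported on the new instance – which is exactly the limitation I mention above: your follows move with you, but your content doesn’t.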
This has all been very much my personal experience, and as it’s been almost entirely positive I’ll be sticking with it. In fact, Mastodon is now largely the only social space in which I actively engage. Looking at the bigger picture, I would love to experiment with a cohort of students using Mastodon for educational purposes in place of Twitter or Facebook, to see what their experience is.
Now to throw some resources at you! This Lifehacker article goes into more detail than I have about what Mastodon is and how to get started. Some suggested instances to join are: Mastodon.social which is the flagship instance, run by the lead developer; Scholar.social which is good for general academically inclined people; and Humanities.one which looks good if your interest is specifically in the humanities. If you’re emigrating from Twitter you can use the Bridge tool to find anyone you follow on Twitter who has also jumped to Mastodon. This Github doc has a great curated list of apps and tools. Finally, to help find interesting accounts to follow, you can use this list of Awesome Mastodon accounts on Github, and the Trunk Wiki which groups accounts by topic. No category for learning technologists though. I may do something to address that.
And of course you should totally join Mastodon and follow me @email@example.com – or read my public profile here: https://scholar.social/@sonya.
Attended Turnitin’s annual conference, which this year was largely devoted to the issue of contract cheating: students paying other people to write essays on their behalf. It’s a problem which has been growing for some time, but which came to the fore in 2014 with the MyMaster scandal in Australia. They also had demonstrations of an imminent anonymous and moderated marking tool, which looked great, and a new Code Similarity project, a development of MOSS for checking computer code for similarity.
The new product they have to help with contract cheating is called Authorship Investigation. It aims to detect cheating by comparing work submitted by a given student over a period of time, analysing such things as word and punctuation usage, richness of vocabulary, and document metadata – looking for obvious red flags such as an unusual author or editing time. The hands-on demonstration was quite good, especially for software still in beta and not due for release until next year. A number of us at the demonstration raised the same kind of concerns, though. For example, when I’m writing I create a new document for every draft, so the final file I actually submit would show a same-day creation date and very little editing time, both things that Authorship Investigation would flag as suspicious.
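To give a feel for the kind of analysis being described – and this is purely an illustration of simple stylometry, not Turnitin’s actual method, which they haven’t published – you can compute a few naive features of the sort mentioned (word usage, punctuation, vocabulary richness) and compare them across submissions:

```python
import re
import string

def style_features(text):
    """Compute a few naive stylometric features: average word length,
    punctuation rate, and vocabulary richness (type-token ratio).
    An illustration only, not how Authorship Investigation works.
    """
    words = re.findall(r"[A-Za-z']+", text.lower())
    punct = sum(1 for ch in text if ch in string.punctuation)
    return {
        "avg_word_len": sum(map(len, words)) / len(words),
        "punct_per_char": punct / len(text),
        "type_token_ratio": len(set(words)) / len(words),
    }

def drift(a, b):
    """Crude distance between two feature dicts; a large value might
    prompt a closer human look at whether the same student wrote both."""
    return sum(abs(a[k] - b[k]) for k in a)

essay1 = "The results suggest a clear trend; however, further work is needed."
essay2 = ("Heteroskedasticity notwithstanding, the aforementioned "
          "methodological desiderata remain conspicuously unexamined.")
print(drift(style_features(essay1), style_features(essay2)))
```

Even this toy version shows why the approach is fragile: the features shift with genre, topic, and a student’s own development over time, which is exactly the sort of concern we raised in the session.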
Also demonstrated was just how easy it is to get assignments from essay mills, and how predatory they are. One funny anecdote concerned a researcher studying contract cheating: they started an online chat with someone from an essay mill site, who then proceeded to offer to write the paper for them!
This is a hard problem Turnitin are trying to solve, much harder than identifying blocks of text copied and pasted from elsewhere, and most of us at the demonstration were a little sceptical about their approach. Of course, Turnitin is a technology company, and they have devised a technological solution (to sell), when a better solution is arguably a pedagogic one: designing out the ability for students to outsource assessment work by moving away from essays and using approaches such as face-to-face presentations. Knowing your students and their work personally is also likely to be better than relying on algorithms, but of course this is much easier with smaller cohorts.
There was also very little discussion of the context, and of what has caused the issue to arise. In most of the West we have commodified tertiary education, turning it into just another product available to anyone who can afford it, so is it any wonder that those with the means take the next step? Nevertheless, this is the world we find ourselves in, and essay mills aren’t going to go away. Calls to legislate against them, as worthy as that may be, will face the same problems as any attempt to prohibit online content, in that legislation can only apply to UK-based companies, and while technological solutions may help in the short term, they are no panacea, as methods to circumvent them will soon appear in an ever-escalating arms race.
HR caught up with me again, this time making me take my fire safety training, which was fair enough, as according to my records on here I haven’t done it since 2014. Not a lot has changed; it’s all fairly common-sense advice – understanding how fires start and how they can be stopped, how to prevent them by keeping the work environment clean and tidy, not using socket adapters, and so on, and what to do in the event of a fire – basically, raise the alarm and leave via the nearest route, or use the appropriate extinguisher if it’s safe to do so.
Me, again! Less than a year after getting Fellowship I’ve now been awarded Senior Fellowship, following encouragement from colleagues who made me realise the extent of the impact my work has. My case studies were on the support I provided in getting the University’s first MOOC approved and running, and my involvement in our recent VLE replacement project, which included writing the statement of requirements that bidding vendors had to demonstrate they met, and planning an extensive staff development programme.
As a result of a mini restructure in the CELT, my job, Senior Learning Technologist, will cease to exist in a few weeks. That was a little disconcerting when I found out, but I am thankful to have been slotted into a new role in the new structure at the same grade, so on the 1st of August I will become the new Learning Technology Coordinator for Learning Materials Development. Longest. Job title. Ever.
What this means in practice is that my role has effectively been split in two, with another role at the same level taking responsibility for managing the VLE. The team is also being split: I’ll have two people working on learning materials, with the third learning technologist working with the VLE coordinator. In reality there will continue to be a lot of cross-working within the team, especially until the VLE coordinator is appointed, but I’m positive about the possibilities the new structure offers. I can make something of this new position, both for myself, to push my career in the right direction, and for the university, to support our drive into new areas of independent distance learning by working with academic teams to produce high-quality, pedagogically sound content.