Sketchfab today released the results of a survey giving an indication of the state of the virtual reality market. It was no surprise to me to see the HTC Vive leading the way across the board; it is the standout device of the current generation.
Some caveats, of course. The headline says the data is from their user base of 700,000, but read the text next to the little asterisk and you’ll see it’s actually a survey of only 1,000. Still not a bad sample size, but the users of Sketchfab are creative types using Macs and PCs, which skews the results. The PlayStation VR, for example, makes an appearance on a few charts with some low-level usage, but I would expect that there are a lot of console gamers not using Sketchfab, which means that a portion of the VR market is missing from their report.
Over the Christmas break I had the chance to use a friend’s Oculus Rift CV1, the successor to the DK2 and the final retail version of the Rift. I had high hopes but was sorely disappointed! The quality of the HMD is an improvement, much more refined, but it also felt so much heavier; the whole time I was wearing it I felt like my head was being pulled down. It wasn’t pleasant and I couldn’t wear it for long. I tried three different games with very mixed results: Eve Valkyrie worked best, and Project Cars the worst. No amount of jiggling and fiddling with the headset or settings was able to make the game anything other than an unplayable blur for me. A sharp contrast to Driveclub on the PlayStation VR.
ALT has published the report and data from their second annual survey, which can be downloaded here. It makes for interesting reading as they now have comparative data from last year’s survey, so you can see the trends and changes.
No signs of the monolithic VLE going anywhere just yet, and interest in the field of data and learning analytics is continuing to grow. I was a little surprised to see open badges so far down the list, but as a colleague in another department said to me a few days ago, employers don’t know what they are or how to value them, and as a consequence students just aren’t interested.
The old customer support and Google Analytics reports that I have been doing for the past year, and in some form for many years now, were good as far as they went but didn’t encompass all of the other work that we do, the services that we provide and the systems we support. In an effort to provide something that goes a little wider I have created this new style of report which picks out the highlights of the two old reports and adds in what measures are available from our other systems. I actually did most of this work a few months ago, but it took time to be approved. With agreement from the big boss I am also now publishing this report publicly on our website.
Part of what prompted this was my new-found liking for Piktochart and the desire to turn my reports into more of an infographic, but in the end I stuck with Excel: there were a number of charts with data that I would just have ended up screenshotting and importing into Piktochart, which rather defeated the point.
After our regular conference call with Pearson the team had an informal training session from a member of their Enterprise Reporting team. This came out of a problem I had a couple of weeks ago when I ran a simple report to list all units and items within a given module space and only got four results from a course which had six units and a couple of dozen items. We discovered that the items that were returned were the gradable items, even though the option to select only gradable items was not selected. So the question was why it wasn’t working as expected and returning all results. I don’t have a detailed explanation, but I did learn that there is what I would describe as a ‘quirk’ with Enterprise Reporting that means it only likes reports that include a measurement of some kind. Adding ‘Activity Minutes’ to my problem report resolved the issue.
We got some other good things out of the training too: a greater understanding of how nodes work and how they relate to courses and students, and with that a realisation that we cannot rely on them for the reports we want, namely which faculty or department a student belongs to. We do now have a plan to use one of the extended user property fields as a custom field that will serve this purpose for us. And finally we got a data dictionary, which will be extremely useful.
Adding to our range of Quick Start guides I have written a new one on Learning Analytics and the tools we have available. I will be following this up shortly with one which goes into more detail on Enterprise Reporting, covering the standard reports which are available and how people can get viewer accounts from the team.
Something which surprised me about WaLTS when I started was the lack of management information on the work we do to support our customers. There have been a few spot audits to analyse busy periods, but nothing coherent or consistent, so I asked the team to start recording resolved work using a simple form and then presented the results in a report for the benefit of our senior management. Those graphs will take a little time to fill out, but we’re off to a good start.
One of my little areas of expertise at Northumbria was providing analytics data and reports on Blackboard usage, and it’s something which was missing here at Sunderland, for the VLE at least. Unfortunately the way Learning Studio works it has not been possible to implement Google Analytics tracking code system-wide, but I’m not easily deterred! I found a way to embed the code into an announcement on the landing page so that we can at least get client-side data: numbers, technology, location and mobile use, all of which is useful in informing development.
If the report looks a little familiar, well, that’s just because great cooks bake nice cakes no matter what kitchen they’re in!
As part of my handover arrangements I have had to write a set of instructions on how to compile the learning analytics report I have been responsible for. This document alone was such an extensive piece of work that it warranted a separate project in my handover to-do list and took me pretty much an entire day. The resulting seven-page, 3,000-word document covers how to update and complete the master spreadsheet, where to find all of the various measures in Google Analytics and Blackboard, and how to create the report on PebblePad usage. That last one is the most complex as it involves database queries, and I was handing over to someone with little experience of databases, so the instructions needed to be detailed and precise.
For the past three years now I’ve been running Google Analytics on Blackboard and compiling a monthly report for senior management and steering groups. A standard for what this should contain was agreed by consensus fairly early on and it has changed little since, until a couple of months ago when, due to the changes in management, I was asked to revamp the report to remove some things which weren’t required any more and to report on anything new which I thought pertinent. The biggest change was the request for a ‘commentary’ on each page explaining some meanings and trends. I have also integrated the PebblePad usage by Faculty report I wrote last month into this, as PebblePad has a tendency to be overlooked and almost forgotten about.
A complex report for a simple request – how many people are using PebblePad per faculty? Complex because the user data in PebblePad doesn’t contain any information beyond key, username, forename, surname, email and a few other non-pertinent bits and bobs. But I am not easily daunted.
One of these ‘non-pertinent’ bits of information is a ‘last login’ date so I was able to restrict the report to people who had logged in during the past thirty days. I ran a query on the PebblePad database to get all relevant username data for this time period, and then ran a query in Blackboard to get all user data full stop. Why? Because the ‘user’ table in Blackboard does have a field for Faculty. Well, actually it is in a different field because of the way the user accounts are imported from Active Directory, but it was sufficient. Then it was simply a case of importing both resulting CSV files into an Access database and running a join on the username.
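The join itself is simple enough that it could also be scripted rather than done in Access. Here is a minimal sketch in Python of the same idea, assuming the two CSV exports share a username column; the file and column names are hypothetical and would need adjusting to match the real exports:

```python
import csv

# Hypothetical inputs (adjust names to the actual CSV exports):
#   pebblepad.csv  - usernames of people who logged in during the past 30 days
#   blackboard.csv - all Blackboard users, including the field holding the faculty

def faculty_usage(pebblepad_csv, blackboard_csv):
    """Count active PebblePad users per faculty by joining on username."""
    with open(blackboard_csv, newline="") as f:
        # Map each username to its faculty from the Blackboard export.
        faculty_by_user = {row["username"]: row["faculty"]
                           for row in csv.DictReader(f)}

    counts = {}
    with open(pebblepad_csv, newline="") as f:
        for row in csv.DictReader(f):
            # Look up the faculty; usernames with no Blackboard match
            # are grouped together as "Unknown".
            faculty = faculty_by_user.get(row["username"], "Unknown")
            counts[faculty] = counts.get(faculty, 0) + 1
    return counts
```

The dictionary lookup does the same job as the Access join on username, with the added benefit that any unmatched accounts are surfaced rather than silently dropped.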
Unsurprisingly our Faculty of Health, Community and Education Studies were the biggest users, but it wasn’t as clear cut as I had suspected. They accounted for just under half of all usage, with Engineering and Environment accounting for around a quarter, and the remaining two faculties and service departments sharing the remainder.