Press "Enter" to skip to content

Tag: Audience

PG Cert AP: Day 2

The morning session picked up EDPM05 where it left off the week before, discussing curriculum design and setting learning outcomes. There was a discussion on the distinction between learning outcomes, which are for students, and learning objectives, which are more of a tool for staff when designing the curriculum. The advice given for writing good learning outcomes was to phrase them in the future tense and make them achievable, assessable and easy for students to understand. It was recommended to build each outcome around a measurable verb, e.g. reflect, hypothesise and solve for higher-level outcomes, and describe, identify and measure for lower-level ones. Bloom’s taxonomy was cited as a source of inspiration when looking for these. In terms of practical considerations and UK HE culture, we were advised not to set too many learning outcomes, as each one needs to be assessed and too many can quickly lead to assessment overload.

To put this into practice we were given an example from a real-world module which, when inherited by the current programme leader, had 24 learning outcomes, and we were asked to find ways to reduce these. The programme leader had actually got these down to 9 by clustering a number of them. Removing any of the outcomes wasn’t a possibility, because that would have constituted a major change and triggered a re-validation.

The afternoon session was for EDPM08, the optional module on Digital Learning which I am teaching on, so I was there not as a participant but as a teaching assistant to support the discussions that were taking place. Today’s session utilised an audience response system, so there was a discussion about the merits of using dedicated handsets over newer app- and text-based systems such as Socrative and Poll Everywhere. Research was cited showing that such systems increase student enjoyment and engagement.

There followed a live application to gather learners’ feedback on a discussion of Marc Prensky’s argument that today’s learners can be classified as either digital natives or digital immigrants, depending on whether or not they have grown up with the internet. Critiques of this argument that we discussed included evidence that the multitasking Prensky claims digital natives are capable of is actually detrimental to performance, that he creates an artificial barrier between generations, and that the ability to manage the kinds of non-linear, non-hierarchical learning spaces created by hyperlinking depends more on a person’s working memory capacity and pre-existing knowledge than on any skills they may have gained by growing up with modern digital technologies.

OMBEA Audience Response System

Attended a webinar demonstration of OMBEA, an audience response system similar to TurningPoint which can use either old-school ‘clickers’ or a browser-based response. It seems good, but it didn’t ‘wow’ me. The best part of the system is the ability to upload the responses from any quiz or survey to their cloud-based platform, which saves the results and gives you options for analysing the data.

I’m not convinced that these traditional audience response systems offer great value for money in the era of online tools such as Socrative, mQlicker and Poll Everywhere, and the ubiquity of smartphones.

At Sunderland we have SMART Response handsets which, for me, typify the problems with them and prove the need to move to entirely software-driven solutions such as Poll Everywhere. The response handsets are expensive, the batteries run out (from personal experience, I would estimate that around 5-10% of handsets are not going to work at any given session due to faults like this) and the number of handsets we have is a mystery, as they are spread out between different departments and faculties which guard them like Gollum. Getting enough together for a significantly sized session can be a nightmare.

Last year I was asked to advise on whether to use our SMART Response handsets or an online tool for a conference with an expected attendance of around 200. I recommended Poll Everywhere, but a senior manager was concerned that not all attendees would have a smartphone and thus some may be excluded. So I ran a poll to get some evidence and numbers, the results of which can be found on the team’s blog here. Only 2% of students said they didn’t have a smartphone or tablet, rising to 4% of staff, which I would argue is going to exclude fewer people than faulty audience response handsets. With Poll Everywhere, which allows people to respond via SMS, I think it’s fair to say you are doing your absolute best to accommodate the 2-4% of non-smartphone users as well. If you really felt the need to go further, well, we are also now in an era where £50 can buy you a pretty decent tablet.
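As a rough back-of-the-envelope sketch of that comparison (assuming an even student/staff split among the 200 attendees, which my poll didn’t actually break down for the conference audience), the numbers look something like this:

```python
# Rough comparison of how many of ~200 attendees each approach might exclude,
# using the figures above: 5-10% of handsets faulty at any given session vs.
# 2% of students / 4% of staff without a smartphone or tablet.
# The 50/50 student-staff split is purely an illustrative assumption.

attendees = 200
students = staff = attendees // 2  # assumed even split for illustration

# Handset route: a faulty unit effectively excludes its holder
low, high = round(attendees * 0.05), round(attendees * 0.10)

# Smartphone route: attendees without a smartphone or tablet
no_device = round(students * 0.02 + staff * 0.04)

print(f"Faulty handsets exclude roughly {low}-{high} people")   # ~10-20
print(f"No smartphone/tablet excludes roughly {no_device} people")  # ~6
```

Even before you factor in SMS responses, the software route comes out ahead on those numbers.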