Tuesday, 29 September 2015

TELFest 2015 Leaderboard

During September 2015's TELFest (a week-long festival consisting of workshops, discussions and drop-in sessions related to Technology Enhanced Learning) we introduced a leaderboard to enhance participation throughout the event, and to encourage the use of social media to share experiences with colleagues who were unable to attend. Having experienced a leaderboard at the UCISA Spotlight on Capabilities Conference in June, I was interested in using one to introduce ideas related to gamification, and to bring an extra element of fun to TELFest. The leaderboard is generated by a website called rise.global, which automatically calculates scores for tweets containing a specific hashtag, and, following some pointers from Fiona MacNeil, who had set it up for the UCISA event, I set up a leaderboard for TELFest. Given the aims behind using the leaderboard, I decided that points should primarily be awarded for tweeting with the #TELFest hashtag, with additional points for attending drop-in sessions and tweeting TELfies (TELFest selfies). Below is a breakdown of the points that could be earned:

  • Tweets with the #TELFest hashtag: 1 point
  • Being mentioned by someone else: 2 points
  • Having your #TELFest posts retweeted: 3 points
  • Tweeting a TELFie with the #TELFest hashtag: 3 points
  • Attending a drop-in session: 5 points
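rise.global tallied these scores automatically from the hashtag, but as an illustration of the rules above, a participant's total could be computed like this (the function and field names here are my own, not part of rise.global):

```python
# Points awarded per activity, as listed above
POINTS = {
    "tweet": 1,    # tweet containing #TELFest
    "mention": 2,  # being mentioned by someone else
    "retweet": 3,  # having a #TELFest post retweeted
    "telfie": 3,   # tweeting a TELFie with #TELFest
    "drop_in": 5,  # attending a drop-in session
}

def total_score(activity_counts):
    """Sum a participant's points from a dict of activity counts."""
    return sum(POINTS[activity] * count
               for activity, count in activity_counts.items())

# Example: 10 tweets, 2 mentions, 3 retweets, 1 TELFie, 2 drop-ins
print(total_score({"tweet": 10, "mention": 2, "retweet": 3,
                   "telfie": 1, "drop_in": 2}))  # → 36
```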

Each day we saw the top tweeters changing positions and there was healthy competition amongst TELFest participants.  

To keep tweeters motivated, automated tweets were sent out every evening, informing them of their position on the leaderboard.

Twitter activity increased significantly compared to September 2014: there was a tenfold increase in the overall number of tweets, a tripling of the number of tweeters and, on the Friday, TELFest trended in the Sheffield area, meaning that it was promoted to local users on the main Twitter interface.

An additional benefit of promoting the use of Twitter through the leaderboard was that it helped to capture the variety of views and opinions shared by participants during the event. We were then able to use the tweets to create daily blog posts summarising these discussions using Storify, allowing us to produce a record of the day’s events for participants to look back on and to give some insight into the discussions for those unable to attend.

While the leaderboard was highlighted during the Gamification session as an example of a method to encourage participation and motivate learners, it is hard to say whether, in this case, the leaderboard led to an objective increase in Twitter usage. Early feedback indicates that its introduction did motivate some people to tweet more than they ordinarily might, yet others stated that they were unaware of the board. Another reason the increase in Twitter use at TELFest this year cannot be solely attributed to the leaderboard is that we integrated Twitter directly into some of the workshops. It is clear, however, that the leaderboard did not influence the number of colleagues attending drop-in sessions.

We closed the board on Friday at 12pm and, as a token gesture, awarded chocolate medals to the colleagues at the top of the board - congratulations to Gary, Nik and Maria.

Final Leaderboard

Friday, 24 July 2015

Taming the grade centre with colour

One of those “why didn't I know you could do that” moments that I thought was worth sharing. This may be where I find out that I'm the only Blackboard user who didn't know about it, but here goes anyway.

At the recent North England Blackboard User Group (NEBUG) meeting there was a really useful presentation from Adam Elce (North Notts College) showcasing the Blackboard templates and VLE audit framework they use. His presentation may be worth another blog post in itself, but he casually threw in the fact that you can colour-code student scores in the Grade Centre.

Now, I've always found the Grade Centre in Blackboard to be an unwieldy beast at the best of times... very powerful, but unwieldy.

On our Legal Practice course we make extensive use of MCQs, but the default Grade Centre just displays numbers that could be out of any total; the quizzes vary from 3 questions up to 30. A quick visit to the “Manage” tab in the Grade Centre, drop down to “Grading Colour Code”, tick the box to enable colour coding, then just build your own criteria and colour scheme.

It just so happens that in this one we have gone for purple for very high scores, the reds are less than 50% (a fail), dark blues are in progress and various shades represent the grade ranges in between. This was a 30-second job to set up.
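Blackboard applies these rules for you through its Grading Colour Code screen, but the mapping itself is just a set of percentage thresholds. A rough sketch of the idea in Python (the green/yellow/orange bands here are illustrative assumptions, not our actual scheme):

```python
def grade_colour(score, max_score):
    """Map a quiz score to a display colour by percentage band.

    Thresholds are illustrative; in Blackboard you define your own
    criteria in the Grading Colour Code settings.
    """
    if score is None:
        return "dark blue"   # in progress / not yet attempted
    pct = 100 * score / max_score
    if pct >= 80:
        return "purple"      # very high score
    if pct >= 70:
        return "green"
    if pct >= 60:
        return "yellow"
    if pct >= 50:
        return "orange"
    return "red"             # below 50% is a fail

print(grade_colour(27, 30))  # → purple
print(grade_colour(12, 30))  # → red
```

The benefit is exactly as described above: the colour carries the percentage information, so a column of raw scores out of 3, 10 or 30 becomes readable at a glance.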

Now the whole Grade Centre becomes a lot more useful, as you can easily see a student’s progress across the quizzes without having to remember what each score was out of. You can, of course, toggle the colours on and off if it all gets a bit too much.

Gareth Bramley, University Teacher on our LPC course, writes: "the colour coding can usefully be adapted so that it highlights various grade boundaries, and the colours make it a lot easier to assess how the students enrolled on the module have performed in each quiz".

Hope this is useful if you didn't already know about it. Next time I’ll be delving more into reporting from the gradebook and quiz analysis tools.

Monday, 6 July 2015

Handy App: Post-it Plus

Here at the University of Sheffield my role often involves working with colleagues to brainstorm and discuss ideas. To help facilitate this, I often rely on post-it notes, enabling participants to share ideas and skim through the responses of others. If you use post-it notes in this way, like me you probably take a picture of them after the event so that they can be shared more widely. One of the problems with taking a picture is that it is difficult to add, edit, or reorder the post-its once you have your picture. So I was pleasantly surprised when I came across the 3M Post-it Plus app (only available in the App Store), which allows you to do just this.

The app allows you to take or upload a picture of a 'bunch' of post-its. It will then recognise each note in the picture (or let you outline where a note is if the app is struggling to do so). You can then:
  • Edit a note
  • Reposition each note
  • Add a new note
The picture below is from a TEL meets session that we host monthly at the University; the event brings together learning technologists from across the institution to network and share ideas. At a recent event we asked delegates to consider what discussion points they would like covered in forthcoming sessions. After uploading an initial picture of the post-it notes to the app, I was able to shade out the names of the people who suggested each idea, to anonymise them (in red).

You can move around each note individually or automatically align them. 
Finally, you can see that I have added some new post-it notes using the keyboard and my finger, for a freestyle look (see the light yellow post-its in the centre below). 
You can also export the changes that you have made (as an image, PowerPoint or Word file) or share your notes with other users. I stuck with the free version of the app, but for a fee you have the option to change the colour of different post-it notes. 

Would you find this app useful? Is this something you will use? Add a comment and share your thoughts! 

Wednesday, 1 July 2015

University of Sheffield International College and technology enhanced learning: Developing MOLE Courses and the working relationship

Over the past few months the Technology Enhanced Learning (TEL) team here at the University of Sheffield have been working in close partnership with the University of Sheffield International College (USIC), developing their courses on our Blackboard 9.1 VLE (Virtual Learning Environment). The VLE here at the University is branded My Online Learning Environment (MOLE). 

Previously, USIC used the Moodle platform and, at the point we started working with them, had limited or no experience of the Blackboard 9.1 environment. 

My main contacts for USIC were Barbara Gardener (Learning Technologies Manager), Tom Pyecroft (Learning Technologist) and Laura Murray (Academic Administrator). They are all employed by StudyGroup.

As there would be no direct migration of courses from Moodle to Blackboard, we found ourselves in the position of having to start course building from scratch. On the face of it, this sounds like a bad thing, but it represented a great opportunity for us and the module developers in USIC to rethink their curriculum delivery. 

At the start of discussions it was clear that USIC wanted a consistent approach to course design across their programmes. This consistency would a) help ensure students' experience of the VLE was uniform and of high value, and b) make course management more efficient. USIC had a number of module developers available to them who would be in charge of developing these courses. As mentioned above, these staff had limited or no experience of Blackboard.

So, the first question was... how would we develop this consistent approach to course building whilst giving the module developers some hands-on time with Blackboard? The answer came in the form of a full-day training session for the module developers, in the Corporate Information and Computing Services (CiCS) training room (and the Hicks Building, due to availability!). 

We split the day into two sections, morning and afternoon, with a much needed lunch break in the middle! 

The morning session was dedicated to the “nuts and bolts” of course building in MOLE. We demonstrated the basic elements of course navigation and structure, and of building and deploying content in the system. Nothing too advanced was attempted, and this was important, as the key to good course design is often keeping it simple.

Staff in the group session wrestling with course template design - Image courtesy BGardener - StudyGroup and USIC

The afternoon session was all about building USIC course templates in MOLE. Barbara had very kindly put together a cards activity that really helped invite discussion and debate. 

The cards activity involved attendees being given a set of 40 cards, each containing a single item relating to course design and delivery. For example: “All items include descriptive information”, “All grades available through grade centre”, “A class wiki”, “Formative tests”, “Adaptive release”, “Content collection”. Attendees were then asked to sort them into three separate piles:
  1.  Launch - These items should be available at the launch of the courses 
  2. Intermediate - These items could be delivered in the near future but after launch 
  3. Exemplary - These items would require more thought and investigation but are items that in an ideal world they would wish to have in the courses.

Cards used in the session - image courtesy BGardener StudyGroup and USIC
The activity generated some really useful discussion around the key elements that needed to be in the courses from the get-go, as well as the higher-level content that would take more time to implement. A couple of the key areas identified as essential (and therefore launch) were:

  •  Use of the content collection in managing overarching programme content 
  • Directing students' learning through effective use of adaptive release.

Importantly the theme of exemplary course design (something we are having a real push on in TEL at the University of Sheffield) was woven into both the morning and afternoon sessions. This theme was highlighted in another activity we devised, which involved attendees being enrolled in both an “exemplary course” and a “bad course”.

They were split into two groups and asked to do the following:
  1. Try and improve the bad course
  2. Provide feedback to the group as to what they would do to improve it, if they had the time 
The best thing about this activity was that both groups had some really good ideas about what a MOLE course should achieve. The main idea was to avoid it being a file repository and instead have it enhance learning and encourage collaboration. Both groups again agreed that consistency across courses was key: for example, clearly labelling content with descriptions, consistent formatting, chunking learning content into manageable sections and displaying it correctly through combined use of the navigation menu and content pages. 

We finished at 4pm tired, but with a sense that we had arrived at some clear ideas about what the USIC courses would look like, and of course it also helped forge a good working relationship between TEL and USIC. But we weren't finished there... Day 2 beckoned... 

The following day saw the TEL team train USIC staff in the use of PebblePad, with demonstrations of the versatile ways Pebble+ can create templates and workbooks, and of ATLAS, the institutional space where assignments and assessments can be managed. Some valuable discussions were had about how the system can be used for logging achievements and capabilities, and how workbooks and webfolios can aid student reflection and learning. 

The module development team also took part in a workshop introducing the Smart Notebook software, which will be available in all USIC classrooms and enables valuable collaboration opportunities. The team had seen the interactive whiteboards and software previously at a classroom technologies drop-in session run for all teaching and professional service staff in March. 

Many thanks to Barbara Gardener  and Pete Mella (University of Sheffield TEL team) for their contributions to this blog post. 

Stay tuned for more developments over the summer on this topic! 


Monday, 27 April 2015

EMA and the White Rose Learning Technologists' Forum 21 April 2015

I was very pleased to be invited along to the latest WRLT Forum on 21 April 2015. One of the main drivers for my attendance was my involvement with Electronic Management of Assessment (EMA) here at the University of Sheffield; this particular event was dedicated entirely to that theme.

It was held at York St John University’s Skell Building on a frankly rather beautiful sunny spring day, which made both the University and the City of York stand out even more - a very pretty location!

We were also extremely lucky to have with us Lisa Gray (Programme Manager) and Lynette Lall from JISC (the Joint Information Systems Committee), who would be conducting the EMA workshop with us. Lisa has completed a lot of extremely important work in the EMA arena, where institutions, including ours, are looking for ways forward in implementing electronic assessment at scale.

We kicked things off with a brief introduction to the Forum by Roisin Cassidy (TEL Adviser at York St John University), followed by the main presentation from Lisa Gray.

JISC - EMA Project Presentation


Lisa Gray Presenting at the EMA WRLT - image courtesy of Sarah Copeland, University of Bradford

Lisa gave the forum an overview of JISC’s EMA project. The presentation started with some background and context, including the completion of a three-year technology-enhanced assessment and feedback programme. EMA was one of four main themes within this programme, and this background work helped highlight that there were clear, tangible benefits to the use of EMA, but also some serious hurdles to jump in terms of its implementation. This essential background work led on to the EMA study itself.

We were presented with the study’s headline findings which included charts around levels of EMA system integration, variations in business processes and the variability in the take up of e-marking and e-feedback.

In particular, we were shown a chart that set out very clearly where the main “pain points” in the adoption of EMA were. The top spot went to (lack of) system integration, with staff resistance a close second. Perhaps not surprisingly, student resistance came last...

Figure 1 - Pain points in EMA

Image courtesy of  JISC. Taken from the "Electronic Management of Assessment (EMA): a landscape review" publication

The study then explored the reasons for the pain in EMA implementation, including the lack of a central joined up approach, and the fact that trying to implement EMA at scale exposes limitations in the technology.

Finally Lisa moved on to the excellent “assessment life cycle” model developed by Manchester Metropolitan University.  You can catch that, and their excellent suite of resources on assessment here.

Figure 2 -The assessment lifecycle


Image courtesy of Manchester Metropolitan University - reproduced under CC license (BY/SA)
Lisa explained how, by breaking down the components of assessment in this way, we can map the main challenges of EMA onto each area, which highlights where in the cycle our EMA issues lie. As we might imagine, a large portion of them reside in number 5, “Marking and production of feedback”.

Workshop activities


For our first workshop activity we were asked to work in pairs or small groups. Working from a “challenge sheet” that outlined the main EMA challenges, we were tasked with reviewing and ticking off the challenges that we thought were causing the most pain. We were then asked to join forces with a larger group to rank those challenges in order of importance (1 = most important, 5 = least important).

Having reviewed all the challenges, our group thought that, although there were no real level 1 (urgent and biggest-impact) challenges, a couple of level 2 challenges stood out as the biggest hurdles:

  • Ability to handle a variety of typical UK marking and moderation workflows: We felt that this challenge encompassed a lot of the pain points in the adoption of EMA. Certainly within our institution there are many local variations of workflow, and applying EMA to them highlights issues such as anonymous marking, double-blind marking, and the moderation of marks. 

  • Ability of systems to handle offline marking: Currently, only the Turnitin iPad app offers true offline marking. This is quite a limitation when we consider that offline marking may well offer the biggest step towards making EMA a solution that makes marking at least as easy as, if not easier than, its paper equivalent.

The resulting group feedback to the room generated some really useful discussion around each group's pain points, much of which was directed towards either the incompatibility between institutional student systems and marking systems or the current inflexibility of the technology.

EMA Solutions and workshop 2

As part of the final workshop of the day we were asked to move into four groups to look at each of the proposed projects, decide what detail we would like to see within them, and then provide feedback to the room. I chose group 4: the EMA toolkit. You can view the EMA toolkit description and the other projects here.

EMA toolkit

Our group discussed the idea that such a resource should work reciprocally with locally produced institutional resources. For example, at the University of Sheffield we have our recently devised TEL Hub, which is continually evolving. We are looking to build our own set of EMA resources within the Hub for our institution, and the content within this resource, whether policy, process guidance or case studies, should work in tandem with resources in a centrally devised hub. Anything we develop locally could potentially feed back into the central hub.

In addition, we felt it would be important for hub users to be able to apply different views or “lenses” to the toolkit, encompassing all the different stakeholders: academic staff, administrators, learning technologists (or equivalent), and students.

Feedback and Finish

The groups provided feedback to the room on the separate projects, and several institutions were given a brief moment (due to time constraints) to mention their own EMA-themed work.

Paul Dewsnap at Sheffield Hallam University: Great work on looking end-to-end at their assessment and feedback processes.

Joel Mills at the University of Hull: Their use of Sakai (ebridge) to ensure that e-submission from students is captured successfully even if Turnitin is encountering issues.

Phil Vincent from York St John University: Their EMA policy development, and in particular, describing how you can reduce staff resistance to EMA through the deployment of two monitors!

Sarah Copeland from the University of Bradford: The University policy that requires e-submission where practical and for electronic feedback to be given within 20 days. The Faculty of Health studies policy of anonymous e-marking and use of core technologies: Pebblepad, Blackboard and Turnitin for areas of EMA.

We would like to thank once again our hosts, York St John University, for providing us with a fantastic location, and Lisa and Lynette for a fab workshop that has helped provide some much-needed light at the end of a dark EMA tunnel!

If you wish to get involved with the White Rose Learning Technologists' Forum, you can subscribe to our mailing list:


Please also see our Google+ Community page at the link below:

Hope to see you at the next meeting!


