Tuesday, 14 September 2010

LMS data vs. staff surveys: tailoring staff development

Presentation by Karin Barac and Nicole Wall, Blended Learning Advisors at Griffith University, comparing data obtained from staff surveys with Blackboard (LMS) data to identify actual usage of Blackboard and inform staff development.

It was an interesting session that resonated with us. To put the session into context, Karin and Nicole have been employed alongside two other Blended Learning Advisors, each working in one of Griffith’s four faculties. There are also Curriculum Advisor roles in each faculty. Griffith are focussing on decentralised support from the Griffith Institute of Higher Education; it is seen as a sustainable hybrid model allowing for faculty-specific strategies, bridging the cultural gap (between ‘them and us’), and a holistic approach to blended learning. Griffith are also introducing a Blended Learning Implementation Plan.

This presentation focussed on the benchmarking aspect of the implementation plan. LMS data was provided by Learning@Griffith (i.e. Bb Admin), alongside a staff development survey in each faculty. The survey asked staff what tools and topics they were interested in and what timing suited them for staff development sessions (it was felt that centrally organised sessions were run at times that did not suit faculties). The survey questions were the same in each faculty, but the delivery method varied, which is interesting to note: Science, Environment, Engineering and Technology used Survey Monkey and got a low return; Arts, Education and Law used a paper survey with a ‘candy bribe’ and got a similarly low return; Health handed the survey out during a program retreat and got a complete return.

Headline findings

Preferred mode of training:

  • Demonstrations, show’n’tell, and group workshops rate highly in the survey, but attendance at actual sessions is low
  • One-to-one, just-in-time, and self-paced online training are rated very effective
  • Web seminars are rated ineffective

Interest in training area topic:

  • There is less interest in the introduction to e-learning session, even though the LMS data shows low uptake in a number of areas
  • However, all other areas (incl. advanced use, effective course design, student engagement through blogs and wikis, online assessment and feedback, using media, recording and broadcasting, and emerging technology) attracted ‘very interested’ or ‘somewhat interested’ responses

Tool based question:

  • With the exception of Adaptive Release, which showed a high rate of disinterest (possibly because staff do not understand its function), staff showed a high level of interest in all other tools (i.e. discussion board, blogs, Wimba, Content Collection, podcasting, lecture capture, SafeAssign, quizzes and surveys, and Grade Centre)

LMS usage data simply provides a count of content tools, communication tools, assessment tools, and learner support tools used in each course site. It can be used to identify courses that don’t use Blackboard, and sites with empty folders and no announcements.
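
This kind of tally is straightforward to reproduce. Below is a minimal sketch, assuming a hypothetical CSV export of per-course tool counts; the file name and column names are illustrative only and are not the actual Learning@Griffith report format.

```python
# A minimal sketch of the kind of per-course tally described above.
# Assumes a hypothetical CSV export of tool counts per course site;
# the file name and column names are illustrative, not the actual
# Learning@Griffith report format.
import csv

def summarise(path="lms_usage.csv"):
    totals = {"content_items": 0, "announcements": 0,
              "discussion_posts": 0, "assessments": 0}
    unused = []  # course sites with empty folders and no announcements

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts = {key: int(row[key]) for key in totals}
            for key, value in counts.items():
                totals[key] += value
            if counts["content_items"] == 0 and counts["announcements"] == 0:
                unused.append(row["course_id"])

    return totals, unused

if __name__ == "__main__":
    totals, unused = summarise()
    print("Counts across all course sites:", totals)
    print("Sites with no content and no announcements:", unused)
```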

I haven’t reported on the enrolment data, as the inconsistencies reported are due to Griffith setting up Blackboard courses for all modules automatically, so they may have sites with students enrolled but no staff. They did, however, find a direct correlation between enrolment count and the number of items/tools used in a site.
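
That ‘direct correlation’ claim is easy to sanity-check once both counts are in hand; here is a rough sketch using a hand-rolled Pearson correlation over made-up numbers (not Griffith’s data).

```python
# A rough check of the reported relationship between enrolment count and
# the number of items/tools used in a course site. The numbers below are
# made up purely for illustration; they are not Griffith's data.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

enrolments = [12, 45, 80, 150, 300]   # students enrolled per site (illustrative)
items_used = [3, 10, 14, 25, 40]      # items/tools used per site (illustrative)
print("Pearson r:", round(pearson(enrolments, items_used), 2))
```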

They claim the data provides stronger evidence for identifying the level of engagement and at-risk courses, and that its real value comes when it is correlated with other data (retention, fail rates, student evaluations). Further work includes formalising benchmarks for blended learning implementation plans and acknowledging the student perspective, which may include getting students to audit courses.

I have a copy of the presentation.
