JISC e-Pedagogy Experts Meeting: Birmingham, October 2010
I went off to the JISC e-pedagogy experts meeting wearing two hats: I am on the experts group, but was also presenting with the cluster group from our JISC curriculum design project.
For those of you who don’t know, the e-pedagogy experts are a JISC-run group who (according to today’s meeting) are important because they:
- Help shape direction and outputs of JISC e-learning programme
- Provide feedback on funding calls (e.g. the gaming call) and grants
- Enable JISC to consult on the design, delivery and innovation grants and programmes
- Provide opportunities for programmes to showcase work
I was thinking that it is a bit like a user/consultative group that JISC use to promote what they are up to, as well as to showcase the work of projects they have funded.
There was a round of current news; the items of main interest to me (prompts for further action) were: JISC are being reviewed by HEFCE, and experts can comment by November via jiscconsultation@hefce.ac.uk; there are new funding calls on the creation of community content, with a meeting on 28th October and the call closing on 10th December (those who have contributed work to the Design Studio might be interested in this call); a new innovation in L&T call will be out in 2011; and the online conference is at the end of November.
Gill Ferrell gave an overview of the curriculum design and delivery strand and the 27 projects funded under this activity. The original call was devised in spring 2008, but the political and financial climate is now very different. For the design projects in particular, this has meant that projects have had to change outcomes and be more agile in their approach. The challenges are still relevant, but the pressure to demonstrate impact has increased. Projects have also had to adapt to changes in senior management, engagement, organisational priorities and other contextual issues. Most projects have therefore reduced promotion of the project itself and sought to embed and connect it with other institutional initiatives. This is certainly true of the PREDICT project at City, which we have not promoted as a project per se. Gill talked about the “guerrilla” techniques/strategies that projects have used to achieve their outcomes, just in more roundabout ways. The delivery projects have been practitioner-focused and have looked at celebrating achievements.
Now halfway through, the design projects are considering different ways to evaluate impact or provide evidence. This prompted some (unresolved) questioning in the room about what evidence is and how we can measure it. Change management in large organisations is problematic, and it is hard to pin specific evidence to some activities. There was a view that innovation may be cut as budgets are reduced, and therefore needs to demonstrate value for money. I think we just need to be cannier about how we demonstrate impact, have broad definitions of what counts as “evidence” and resist the desire to reduce everything to a tangible measure. That is not to say that we shouldn’t measure what we do; on the contrary, we should ensure we are evaluating and recording so we can learn and change the future direction of education. However, this is not achieved (only?) by coloured, checked boxes on a spreadsheet.
It could be that in a climate of reduced funding learners become more like customers, with a producer-consumer vocabulary applied to all learning, which carries considerable risks for engagement and behaviour. I have a problem with this, as probably did most of the people there, because learning is more sophisticated than that.
After our cluster group session, which went well – we explored the mindmap we have created around the design process, and I spoke about the slipperiness of vocabulary and the importance of considering design in the broadest sense, not just around the approval event – I went to a session on the QA/QE SIG, an HEA-funded group who have created a toolkit and collected some case studies to facilitate the QA processes around blended and online learning. The toolkit raises a number of questions that are not normally asked but need to be considered for these “special” types of learning. It addresses planning and delivery as well as evaluation. I think, though, that we should now be thinking of all learning as blended learning and asking these kinds of questions of all programmes.
The session from Wolverhampton on the introduction of netbooks in schools was interesting. After being asked to do an evaluation, they decided to do some action research to ascertain the effect of the netbooks on teachers and on conceptions of independent learning. They did some research with children on how they used technology at home, as well as their learning preferences, and although most of the results were fairly standard (use of social networks etc.), they found that children were not engaged in much content production, so decided to explore this further with the teachers. The first phase involved the creation of vision and futures. In the second phase they focused on outputs and looked at how netbooks could change teachers’ pedagogies, along with issues around the implementation of the technology and its functionality. In the third phase they evaluated this by looking at three areas: capacity for innovation, alignment of needs and concerns, and the status of the technology (usability, coolness) – this last area connected to the needs and concerns. They found that teachers often had low capacity for innovation, and that introducing independent learning raised concerns around control. The fourth phase then looked at reconceptualising the models of independent learning and how to plan this differently. Mediating factors were that the implementation of the future vision was positive, and that independent learning enabled a more creative curriculum. A big moderating factor was that teachers’ roles changed with this new pedagogy; they became more uncertain of their role, and this led to concerns about how OFSTED would view independent learning in the curriculum. The children loved the netbooks and used them to create videos of how the netbooks benefited their learning. They also behaved differently – students who were reticent in traditional classroom settings were more engaged.
This raised some interesting questions and insights into how technology challenges teacher perceptions, as well as how strongly auditing functions, such as OFSTED, influence pedagogic methods.
There was a JISC-run session on their new guide on innovative practice, due out next year. It will focus on gaming, virtual worlds and social media. Obviously a lot has changed since the last guide in 2005, and there is a problem that if these guides are too technology-focused they will date very quickly. We discussed the problem of using the term “innovation”, as it seems to imply technology and something new, whereas it could be something old in a new setting and totally unrelated to technology. In my view it is much better to focus on the reasons and values driving the change.
The last interesting session I went to was on using Twitter with BSc and pre-reg nursing students at the University of Glamorgan to foster interactivity. The project started with four questions: what does good learning/teaching look like (does it include technology)? What have we been unable to do? What are the obstacles? They thought Twitter could replace the use of PRS and be more fun, as well as encouraging collaboration, which is an important skill for nursing students to have as they will need it in practice. There was some concern about Twitter due to negative perceptions around narcissism and information overload. They used Twitter in two ways:
- Videos of the mannequins used to practise procedures on, produced by experts and then tweeted. This solved issues around student access to these, as they only had five
- They asked short questions via Twitter and asked students to tweet their answers
They found that Twitter provided evidence of reflective learning via student tweets, of learning by doing through engagement with the videos, and of learning as conversing through engagement with others on Twitter. Students also began to explore the area by starting to follow other practitioners they had found on Twitter, as well as tweeting relevant news stories. The BSc students, who used it asynchronously, wanted it embedded with other social media and disliked the word limit. The pre-reg students liked the real-time interaction and have gone on to continue supporting each other in groups via Twitter.
Tried to write all my notes using SmartWisdom all day, then typed them up on the train. Realised that I write too many notes and could cut them to the salient points. Also wondering what the point of notes is?! I think they act as an aide-memoire for me; for example, the Twitter session will be useful for the session I am running with a colleague later in the year. Found it quite hard and time-consuming, but then I would normally never type my notes up. Got lots of questions from people, and some telling me to use the mind mapping tool on the iPad. Hmmm, I think I need to rehearse the line about why it isn’t like a mindmap more often. Will keep going though….
Just a quick plug – the QA/QE SIG is inviting further consultation on the toolkit; there’s information available at http://qaqe-sig.net/?p=176. Assuming our plans for the year are approved by the HE Academy, there may also be some tiny grants available for the production of case studies of its use. Watch this space…
“Tried to write all my notes using smartwisdom all day then typed them up on the train. […] Got lots of questions from people and some telling me to use the mind mapping tool via the ipad. Hmmm, think I need to rehearse that line why it isn’t like a mindmap more often. Will keep going though….”
…because it costs more…?
😉
ha ha, hmmm no comment – you are my best comment-er 🙂