Moving on from my musings on social networking, which raised questions about the virtual spaces in which we meet students, I’ve now been to some sessions on learning analytics and learning design that have raised further interesting and semi-related topics.
J. Egan’s session on learning analytics provided a critique of the use of LMS data to measure student engagement, or indeed anything meaningful. He argues that “real” engagement and learning do not take place in a PDF download but in the myriad interactions between academics and students in the classroom. We cannot reduce learning to datasets of accesses or downloads that take no account of the complexity of the learning process and these physical interactions. While in fully online courses there is a stronger argument that data can show what is happening with students, his basic premise is that the evangelical arguments for the benefits of learning analytics should be treated, at best, with caution.
Although I am supportive of learning analytics, I thought this paper raised some good points. Learning is not a process that can be reduced to the number of times someone logged into a VLE. Interestingly, Egan noted that many research-intensive universities are pursuing learning analytics because it promises a way of measuring learning comparable to the way we measure research, something that has hitherto eluded us. However, my interest in learning analytics has always been about how we can use such information to understand and improve learning design, not as a tool to measure learning engagement per se. We should be cautious of anything that sets itself up as a panacea, and there is a danger that we see learning analytics as the silver bullet that will solve our retention, progression and engagement challenges. And if, as I argued in my other post, students’ engagement with the VLE, LMS or other institutional media is only partial, because they use other tools to learn too, then analytics will only give us a partial understanding of what learning is taking place and where.
My interest in learning analytics is to uncover the complexity of learning and how we, as educators, can use this, in partnership with qualitative tools, to engage students in a dialogue about their learning experience. This is messy and hard, though, and not easily reduced to tidy data tables. I guess, as with my work on measuring impact and KPIs, although learning analytics might be able to give us a sense of “what” students are doing, it does not tell us “why”.
The challenge of good design for educational technologies was highlighted in J. Heppen’s session on designing educational tools. So often, he argued, designers do not work with educators, and so software falls short of engaging students. I am reminded of some of the educational tools my children are using at school. The only word to describe them is “lame”. Compared to the games my children might otherwise engage with, the tools the school is encouraging fall massively short: dull, unintuitive, patronising and lacking in creativity at best, and discouraging learning at worst. Yet it is not the school’s fault. On paper these tools look like they meet various learning objectives – students can track their work, activities are timed, and so on – but in reality they fail to engage. Thinking back to the critique of learning analytics, if we use data from these systems to try to ascertain what learning is happening, we will fail. Design needs to go hand in hand with academics so that we develop tools that will support our students and inspire them.
This leads neatly back once again to the spaces in which we are working with our students and learners. There has been much interest in the flipped classroom and how technology can enable change. BUT (and it is a big BUT), as Z. Charlesworth argued in her session on preparing next-generation educators, we are not currently leveraging the potential of technology. We have seen this in the examples I have given above. There are massive gaps between the potential of technology and the reality – in the actual systems on offer, in the skills of staff and students, and in the learning environments we are working in.
Charlesworth’s answer to this is to work with academics to make them change agents, using Lewin’s change model as a framework. Academics need to develop a viral approach to change and “infect” others with their enthusiasm for technology and learning. However, any change will be muted if there is not an institutional culture that actively supports and promotes the need to change and to develop new pedagogies. “Walking the talk” and giving real examples are vital here.
I like the segue here from Facebook to VLEs to analytics to design to leadership. Leadership in education is so important for encouraging staff and students to take risks and embrace new models of learning. Providing the “right” spaces – supportive, flexible and responsive physical and virtual environments – to enable these conversations and explorations is vital. And I’ll be talking about how to develop academics as leaders tomorrow. What beautiful synergy!