Key Performance Indicators for OER

One of the things I’ll be looking into as part of my new role is key performance indicators for open educational resources.  At the University of Edinburgh we have a Vision and Policy for OER that encourages staff and students to use, create and publish OERs to enhance the quality of the student experience, enrich the University and the sector, showcase the highest quality learning and teaching, and make a significant collection of unique learning materials available to Scotland and the world.

Staff and students at the university are already making open educational resources available through a range of channels including Open.Ed, Media Hopper, TES, SketchFab, Youtube, Wikimedia Commons and Wikipedia, and there are a number of initiatives ongoing that promote and support the creation of OER including 23Things, Board Game Jam, various MOOC projects, our Wikimedian in Residence programme and others.

So how do we develop meaningful key performance indicators to measure and assess the success of these initiatives?

Quantitative indicators are relatively simple to measure in terms of OER produced, and it’s not difficult to gather web stats for page views and downloads from the various platforms used to host and disseminate our OERs.  For example, our open educational resources on TES have been viewed over 2,000 times and downloaded 934 times; a Wikipedia article on Mary Susan McIntosh, created during a UoE editathon for International Women’s Day, has had 9,030 page views; and UoE MOOCs have reached two and a quarter million learners.

Measuring OER reuse, even within the institution, is much less straightforward.  To get an idea of where and how OERs are being reused you need to track the resources. This isn’t necessarily difficult to do; Cetis did some research on technical approaches to OER tracking during the UKOER Programme, but it does raise some interesting ethical issues.  We also discovered during our UKOER research that once authors create OER and release them into the wild, they tend not to be motivated to collect data on their reuse, even when actively encouraged to do so.

There is also the issue of what actually constitutes reuse.  Often reuse isn’t as straightforward as taking an OER, adapting it and incorporating it into your course materials.  Reuse is often more subtle than that.  For example, if you are inspired by an idea, a concept or an activity you come across in an OER, but you don’t actually download and use the resource itself, does that constitute reuse?  And if it does, how do we create KPIs to measure such reuse?  Can it even be measured in a meaningful way?

And then there’s the issue of qualitative indicators and measuring impact.  How do we assess whether our OERs really are enhancing the quality of the student experience and enriching the University and the sector?  One way to gather qualitative information is to go out and talk to people, and we already have some great testimonies from UoE students who have engaged with UoE OER internships and Wikimedia in the Classroom projects. Another way to measure impact is to look beyond the institution: for example, 23 Things was awarded the LILAC Credo Digital Literacy Award 2017 and has also been adapted and adopted by the Scottish Social Services Council, and the aforementioned article on Mary Susan McIntosh featured on the front page of English Wikipedia.

I know many other institutions and organisations have grappled with the issue of how to measure the impact of open education and OER.  In the US, where OER often equates to open textbooks, the focus tends to be on cost savings for students; however, this is not a particularly useful measure in UK HE, where courses are less reliant on astronomically priced textbooks.  So what indicators can we use to measure OER performance?  I’d be really interested to hear how other people have approached this challenge, so if you have any comments or suggestions please do let me know.  Thanks!

Standard Measures, CC BY SA 2.0, Neil Cummings, https://flic.kr/p/aH8CPV