This week we've been populating the impact map for the OER Research Hub. The impact map (http://oermap.org/) has been developed largely by Rob Farrow and Martin Hawksey, and features lots of Hawksey-goodness. You can do the following on the map:
- Look at evidence for any one of our 11 hypotheses (e.g. Hypothesis A, regarding performance)
- Look at the flow of evidence
- Examine evidence by country
- Filter evidence by sector, polarity, hypothesis, country
- Explore the map for OER policies (a work in progress)
So, as well as putting our own evidence in there, we have been trying to add the research of others that genuinely demonstrates evidence for one of the hypotheses. This has been an interesting exercise. I have been working through Rory McGreal's excellent resource, the OER Knowledge Cloud, reading papers and trying to add them in. The problem is that very few OER papers actually offer anything approaching proper evidence or research. Try it yourself: pick a few papers from the Knowledge Cloud at random. What you get are project reports about releasing OERs, lots of "lessons learned", a lot of beliefs stated as evidence (e.g. "this will improve retention"), and quite a lot of download stats, but very little hard evidence that you could point at and say to someone, "this supports (or negates) this hypothesis".
In some ways this is understandable - OERs had to be developed before anyone could do research on them, so the early phase of the field was always going to be partly driven by evangelism and implementation. But after more than 10 years of OERs, we've moved beyond that phase. The field really needs to up its game now in terms of research, demonstrating impact and gathering evidence. I think all OER projects should have a research strand built in that asks questions such as "what are the expected benefits of this work?", "how will we measure them?", "what happens if they aren't realised?" and so on. (Our 11 hypotheses would be a good start for anyone.)
I really believe in OERs, and I think in the early stage of their development you just needed to take a leap of faith and develop them. But they have reached a level of maturity now where we can ask tough questions of them without fear of undermining the whole enterprise. Indeed, I think having solid research to point to is essential if OERs are to make that next push through into mainstream practice.
So if you've got any of this evidence lying around (and I do mean evidence, not something a bloke down the pub told you), please let us have it.