
12/12/2013

Comments

Jenny Mackness

Fascinating post and data. Thanks for sharing. A couple of immediate thoughts from me:

1. In terms of MOOCs, I wonder if thinking about completing in terms of gaining the certificate is appropriate. I say this because for one Coursera MOOC, I feel that I definitely completed it on my own terms, but not in their terms of what I needed to do to get the certificate. This is probably splitting hairs and I know you have to draw the line somewhere. But does 'completion' mean something different in MOOCs?

2. > a significant negative correlation, i.e. the more people who enrol then the lower the percentage who complete.
But aren't the numbers who complete in the higher-enrolment MOOCs still higher than for smaller MOOCs, and is this important or not?

3. As with the point above about the meaning of 'completion' - is 'dropping out' of a MOOC a problem? I'm not sure that it is. I have signed up for a few MOOCs that I have either not started or just cursorily looked at before leaving - but I have also completed a number of MOOCs right down to the very last reading and video. So I suppose my question is - why should we think about MOOC completion rates at all? Presumably all this will be in your paper - which I look forward to reading.

Dougclow

Wow, top stuff! And a fun (fsvo 'fun') discussion on Twitter too.

I think this is learning analytics in a nutshell: the numbers matter, but they are not all that matters.

So, for instance, the correlation seen above between the size of MOOCs and the completion rate might suggest that bigger is worse, not better. But it could be some third factor - and looking at Katy's MOOC data in the live version (http://www.katyjordan.com/MOOCproject.html) you can see that most of the data comes from Coursera and Open2Study. Open2Study account for most of the points on the left-hand side of that graph: small enrolment, very high completion. That suggests to me it's worth looking more closely at Open2Study and why it has high completion. It could simply be that their courses have smaller numbers and so more people complete. Or it could be that their courses are better than Coursera's. Or it could be something else, like they are calculating the baseline differently.
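To make the 'third factor' worry concrete, here's a toy check (the numbers below are invented for illustration, not taken from Katy's dataset): compute the enrolment/completion correlation pooled across platforms and then within each platform, and a strong pooled correlation can vanish within the groups.

```python
# Toy illustration only - invented numbers, not Katy Jordan's data.
# A pooled negative correlation between enrolment and completion can appear
# even when there is no relationship within either platform.
import pandas as pd

df = pd.DataFrame({
    "platform":   ["Open2Study"] * 4 + ["Coursera"] * 4,
    "enrolled":   [2_000, 3_000, 4_000, 5_000, 40_000, 60_000, 80_000, 100_000],
    "completion": [0.28, 0.30, 0.27, 0.29, 0.07, 0.08, 0.08, 0.07],
})

# Pooled correlation mixes the two platforms and looks strongly negative...
print("pooled r:", round(df["enrolled"].corr(df["completion"]), 2))

# ...but within each platform the relationship is essentially flat, which
# would point at a platform-level explanation rather than size itself.
for name, group in df.groupby("platform"):
    print(name, "r:", round(group["enrolled"].corr(group["completion"]), 2))
```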

The other thing that has smacked me in the eyes is the course length vs completion graph. Now, the first thing to observe here is that you'd expect a clear negative correlation in this treatment of the data even if there were no real effect: if people drop out of a course at a certain rate over time, then the longer the course goes on, the more time they have to drop out, and so the lower the completion rate will be. (I think there are ways of analysing the data to control for this - will talk to you offline.)
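To see how strong that artefact can be on its own, here's a back-of-the-envelope sketch (the 15% weekly dropout figure is invented, not taken from the data):

```python
# If every course loses a constant 15% of its remaining learners each week
# (an invented figure), completion falls with length even though nothing
# about longer courses is worse week by week.
weekly_dropout = 0.15

for weeks in (4, 6, 8, 12):
    completion = (1 - weekly_dropout) ** weeks
    print(f"{weeks:2d}-week course: {completion:.1%} complete")
# ~52% for 4 weeks vs ~14% for 12 weeks - a strong negative relationship
# with course length that carries no causal information at all.
```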

But thinking about MOOCs and drop out vs climb out vs "I got what I wanted" reminded me of Clay Shirky on MP3s vs albums (http://www.shirky.com/writings/herecomeseverybody/napster_nyt.html):
"Most albums have only two or three songs that any given listener
likes, but the album format forces people to choose between paying for
a dozen mediocre songs to get those two or three, or not getting any
of the songs at all"

In his (in?)famous piece about MOOCs (http://www.shirky.com/weblog/2012/11/napster-udacity-and-the-academy/) he says "our MP3 is the massive open online course (or MOOC), and our Napster is Udacity, the education startup". I think he might be right about the second - particularly now Udacity's history has even stronger parallels to Napster's from a punter perspective. But I think he might be slightly wrong about the first: MOOCs are still very large chunks of learning. Reworking that album/song quote gives us:

"Most courses have only two or three activities that any given learner likes, but the course format forces people to choose between signing up for a dozen mediocre songs to get those two or three, or not getting any of the activities at all."

I have a horrible suspicion that the unit of learning that most people *want* is nearer the three minute single than the one-hour album. Never mind the 30-week course. (And I'm not meaning three minutes of learning metaphorically here.)

mweller

Hi Jenny, thanks for the comments. In terms of 1), completion and active users show the same pattern (active users being anyone who accesses a resource), so whether you define completion in terms of certificates or active users doesn't really make any difference (there are lots of definitions of completion, by the way, and there is a graph for that which I didn't include).
Point 2 is interesting. I don't know - if we assume for now that you want people to complete, is it better to have 2 x 10K enrolment courses or 1 x 20K? But if you do want people to complete, then it seems that more focused courses may be better.
3) Yes, this is causing lots of angst over on Twitter. I'm not saying completion is the only metric, or that people don't get what they want after 2 weeks. Just that here is the data as far as completion rates go, which I think helps inform the debate. As I say, if completion is important then MOOCs probably aren't the ideal solution, BUT if completion isn't such a factor, then they may be a good approach.

Dougclow

Bah, messed up the replace - I meant:

"Most courses have only two or three activities that any given learner likes, but the course format forces people to choose between signing up for a dozen mediocre activities to get those two or three, or not getting any of the activities at all."

And actually most courses include way more than a dozen mediocre activities. (Where mediocre is by learner perception, not teacher. Obviously every single one of *our* activities is outstanding.)

Reedyreedles

Hi Martin,
Great post, and I'm sure everyone reading would like to thank Katy (and you) for sharing this data.

Firstly, how did she get access to this data? Is it readily available?

Secondly, I'm thinking about length of course vs completion (or attrition).
I can't see why there would be a correlation between the two. Merely more numbers wouldn't cause someone to drop out, but it could be related to the messiness of forums... I'd suspect that if you were overwhelmed by the forums, though, you just wouldn't participate in them rather than dropping out - but hey, the data don't lie! It would be interesting to investigate this further.

When you question whether 2 x 10K enrolments may or may not be better than 1 x 20K, I wonder/question/doubt whether the pattern is continuous - by that I mean it won't hold going on into 100k-2m enrolments, etc. I suspect there'll be a plateau whereby the percentage of completions reaches a steady state regardless, so there'll be no significant difference in drop-out rates between a MOOC with 500k enrolments and one with 2m.

Thirdly, just an observation. Looking at Katy's blog, I don't think there is any correlation between assessment type and completion rates, which is quite surprising I think. I'd expect this to be a significant factor influencing attrition - http://www.katyjordan.com/MOOCproject.html

Anyway, thanks again.
Peter
@reedyreedles

Ghaff

"obviously the longer a course goes on, the more people will drop out"

I don't consider it THAT obvious although it's clearly the case based on the data. One could at least imagine that longer courses could be richer and offer a learning experience that better paralleled a university class. Or that some percentage of people will tend to stick with a class to completion up to some length beyond which it's just "too long." Instead the data suggests that it's more like a week-to-week thing (and makes a good case for shorter snippets--assuming that completion is a relevant metric).

mweller

@Peter - I take it you mean the enrolment figures, not length. My feeling on this is that when you get high enrolment numbers you get a lot of people signing up just for the sake of it. They're not very interested in the subject - this may have been an artifact of early MOOCs, where people were taking them just to experience a MOOC. With smaller enrolment numbers I'm guessing you're getting more focused people. But as you say - more work needed. Re the assessment data, I think there was significant variation between types (auto grading and peer, with auto being higher) but I didn't include that here, just to keep it to blog length.

@Ghaff - I think you're overthinking it :) The point is that if you have a general background noise of factors that can cause dropout, e.g. illness, family life, work commitments, then the longer a course goes on, the greater the chance these will have of impacting upon someone. So if you run a course for longer, you get a general dropout factor. Just as the longer people live, the more chance they have of dying! So we need to separate out this general time effect from a causal one.
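One rough way to separate them (a sketch with made-up numbers, not figures from Katy's dataset) is to back out an implied constant weekly retention rate from each course's overall completion and compare courses on that instead of on the raw completion rate:

```python
# Back out an implied constant weekly retention rate r from the overall
# completion rate c of a course that runs for w weeks, i.e. solve r ** w = c.
# All figures below are invented for illustration.
courses = [
    ("4-week course",   4, 0.30),   # (label, length in weeks, completion rate)
    ("8-week course",   8, 0.10),
    ("12-week course", 12, 0.03),
]

for label, weeks, completion in courses:
    weekly_retention = completion ** (1 / weeks)
    print(f"{label}: overall {completion:.0%}, "
          f"implied weekly retention {weekly_retention:.1%}")
# If implied weekly retention comes out roughly flat across lengths (as here,
# around 74-75%), the raw negative correlation is mostly the time-at-risk
# effect rather than longer courses being worse in themselves.
```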

JaySieling

Fascinating to see this data and begin to see some correlations and questions about completion rates in MOOCs. That is the current buzz. There are some interesting points of inquiry being revealed. The issue of course length is fascinating. I peeked into Open2Study (it seems to be a Coursera-down-under). It appears most courses are 4 weeks. I was wondering what else would set them apart and account for the higher completion rates. One thing I've thought about, and I'm sure there will be some data forthcoming to support this: is there a correlation between higher completion rates and the value of the course outcome? In other words, if the certificate granted at the end is eligible for credit (or if students pay for the credit), would completion rates be higher?

We are at the end of our brick and mortar semester, and it is crunch time stress for students realizing they are not going to pass, or have missed assignments and won't get a good grade. Some just disappear and drop out (just like MOOCs). It frustrates me because I know they've paid tuition to be here, or receive grants or loans that now seem wasted. For some, the cost of the credits keeps them enrolled to the end, working hard or begging for mercy to finish with a passing grade. When there is no consequence, dropping is easier. Others have pointed this out.

One of the other things I discovered at Open2Study was a bit of "gamification". They offer a series of badges, or rewards for participating. You can get badges for completing your profile, for participating in the forums, for watching the videos. You can earn silver, bronze and platinum levels. It is incentivized. I am wondering whether that contributes to the noted higher completion rates. I know some institutions have begun to offer credit for Coursera courses, and Udacity has some pay options for accredited certificates. But could these simple 'badges' be enough to make a difference? "Badges, we don't need no stinkin' badges..." Maybe we do?

Simonrae

Thanks for publishing all this MOOC completion data (quite takes me back to days working with the Courses Survey data!), the charts and the breakdowns.

I'd like to see a breakdown of this data by prior qualifications, whether the student had completed school, university etc and at what level. I understand from Twitter-based readings that a high percentage of people registering for MOOCs already have Masters or above ... ?

Developing on from Doug's comments, is this evidence of people looking for a bit of professional development and 'raiding' MOOCs for just the bits they want (creating their own versions of Father Guido Sarducci's Five Minute University http://www.youtube.com/watch?v=kO8x8eoU3L4 which teaches, in five minutes, what an average college graduate knows five years after graduation!)?

If this is the way students want to use MOOCs (rather than how providers think they should) will they need to be helped to do this by being provided with very precise indexes, content lists, timings etc (metadata?). (This is beginning to sound like Learning Objects all over again.)

mweller

@Jay - you make some good points. This data can't really answer them, it's more of a broad-brush picture, but I would like to see more on this. We know at the OU that the more people pay, the less likely they are to drop out, but of course that defeats the object of MOOCs. But if people were studying MOOCs for a real-life outcome (say to get a job) I'm sure you'd see bigger retention. My next post is about design responses to completion data, which covers some of the points you raise.

@Simon - yes, there is some evidence that way; I'm not sure we have it in this data though. I think there is a bit of raiding, but generally people drop out by week 3 - they're not getting to the later parts. I'll talk about this in the next post.

Luis Ordoñez

Excellent, thank you!!!
Some time ago I asked Stephen Downes if I could have access to data on his MOOCs regarding culturally derived performance - in other words, completion data according to countries (Catholic European, Latin American, Protestant European, English-speaking world, and so on, following the World Values Survey). He did not accept, so I got frustrated.
I wonder, now that you have such great data, whether you could provide that detailed analysis? It has to do with technology transfer and ways of handling information in different parts of the world (is mom's data more to be trusted than a MOOC's data?).
Most of my work is in Spanish, but I bet culture has something to do with how people handle the web. If you doubt it, ask the social psychologists!!!

CYPmedia

Fine post, Martin and Katy.
I'm pleased to see open2study doing well. I've done one of their courses (www.open2study.com/courses/early-childhood-education) and enjoyed their manageable class sizes, integration of social media and gamification - their badge awards are surprisingly motivating.

I posted a brief comparison of the open2study course and its equivalent at Coursera here: http://cyp-media.org/2013/10/01/two-free-early-childhood-moocs-starting-soon

Tony

mweller

@Luis - we got the data from what was publicly available. I'm not sure it does give that kind of breakdown - as you'll see only 13 listed gender, so 1st language or location of student might be even rarer data, but location of MOOC would be different. There is a MOOC map http://edutechnica.com/moocmap/# so one could presumably try and find the open data for MOOCs listed here from different countries? I think, as you suggest, it would provide another interesting angle.

@Tony - yes, open2study do stand out. I think their shortness and perhaps the badging may well be significant. See my next post about design responses - getting people through those first 2-3 weeks seems important and badges may help here. Thanks for the link.

Bali_Maha

Hi Martin - thanks for sharing Katy's work and yours (not sure which parts belong to whom) - it's very valuable and adds much-needed insight.

Your last question is of course very important - "does completion even matter?" - and I think Jenny Mackness gave a pretty good response to that. Since MOOCs are not formal courses leading to certification, individual learner goals are more important than completion. (I have the opposite experience to Jenny in that I got a certificate for completing a MOOC in which I did NOTHING but take the final exam; I was just trying it out and then was surprised to get a certificate.)

As someone who comes from a much more interpretive paradigm, though, I wonder why we are not asking the more detailed questions of: what aspects of a MOOC's pedagogical design enhance learner engagement (not completion rates), and what kind of learner characteristics/motivation/experience result in more sustained engagement with which type of MOOC? (Even within xMOOCs, each MOOC is different for many reasons.) I realize all of these questions are not within the scope of your study or any large-scale quantitative study... but it sort of seems like something that would have been worth studying?
For example, reflecting on my own experience, I discovered I finish MOOCs better when they're within my field of interest (not something I'm tangentially interested in) - which actually makes sense, because then the MOOC is a kind of free professional development.

On another note, the question of 2-week MOOCs being like OERs... I don't consider them that way, even the recent NWOER week - you're the OER expert, but for me an OER is not time-bound or interactive in the way a MOOC is... what remains afterwards is maybe an OER, but the week or two of interaction in it is not OER-like but more MOOC-like, if that makes sense?

