On 1 December, notifications of acceptance and rejection were sent to the authors. Getting there meant walking a long trajectory: opening a call for proposals, waiting for submissions, closing the call, dividing the proposals among reviewers, waiting for the reviews and, finally, reaching a decision on acceptance or rejection for each proposal based on those reviews.
After the call opened in July, the first submission came in as early as 20 July. We had set the submission deadline at 23 October. By that day we had received 82 proposals, whereas (based on previous OEGlobals) we had aimed for around 120. We therefore decided to extend the deadline by one week, to 30 October. The next graph shows what happened in that week.
On 30 October, 66 submissions came in, with an additional 19 on 31 October. The pool of reviewers, however, had been sized for 120-130 submissions, not the 202 we had received. This meant each reviewer would have to review 13-14 proposals instead of the 6-8 I had asked of them. It was heartwarming to find that the majority of reviewers did not mind the extra workload when I asked them. Even more heartwarming: after one tweet and some retweets, an extra 12 reviewers volunteered within 24 hours! This showed me the value of a warm and strong open community.
With some late submissions we ended up with 207 proposals. Once the reviews were in, the conference team discussed them, especially those proposals where the reviewers disagreed or where one or both reviewers had reservations about acceptance or rejection. We kept in mind that OEGlobal should reflect the Open Education Consortium: a global consortium of both experienced members and newcomers to open education, represented by policy makers, management and support staff, teachers, students, and researchers. So there should be room both for proposals based on scientific research and for proposals about policy or practices that are not necessarily grounded in scientific research.
In the end, 151 proposals were accepted (73%) and 34 were rejected (16%). A further 22 proposals (11%) were accepted conditionally: their authors have been asked to rework their proposal, taking the reviewers' remarks into account. Depending on the reworked proposals, we will then decide on their acceptance or rejection.
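For the curious, the split above is easy to verify. The counts (151 accepted, 34 rejected, 22 conditional) come from the figures mentioned in this post; this small sketch merely computes each category's share of the 207 total:

```python
# Proposal counts as reported above
accepted, rejected, conditional = 151, 34, 22
total = accepted + rejected + conditional  # 207 proposals

# Print each category with its rounded percentage of the total
for label, n in [("accepted", accepted),
                 ("rejected", rejected),
                 ("conditional", conditional)]:
    print(f"{label}: {n} ({n / total:.0%})")
```

The three rounded percentages (73%, 16%, and 11%) add up to 100%, so the conditionally accepted proposals account for the share that seems missing when only the acceptance and rejection rates are quoted.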
Here are some statistics about the 151 accepted proposals. First, the number of proposals per country of the corresponding author.
To give a better picture of the global spread, the same information is displayed on a world map. The green areas represent the origin of accepted proposals.
Proposals are connected to specific tracks of the conference:
This picture reflects the broad diversity of the proposals received.
And one last statistic: proposal authors could choose the type of session themselves:
What these statistics do not reflect is the high quality of many of the proposals. I am looking forward to a vibrant and interesting conference in April next year.