Special Issue on MMDL (Multimodal Data for Learning) Journal of Computer Assisted Learning (JCAL) Impact factor: 1.679







Submission deadline: 08 May 2017





The transition of our world from analog to digital affects all aspects of society, including the educational sector. In recent years, we have gained insights into learning behaviour by applying analytic methods to existing data sources such as learning management systems, mobile applications, and social media environments.


While these data sources can still provide rich ground for research, a new wave of technological innovation is taking place with the Internet of Things (IoT) and the maker movement. IoT devices provide new applications and affordances for everyday life. Wearables, eye-trackers and other camera systems, and self-programmable microcomputers such as the Raspberry Pi and Arduino create new data sources that can be used to investigate learning. These sources yield so-called multimodal datasets, as they combine data on physical activities and physiological responses with more traditional learning data. Unlike traditional learning data collection, multimodal datasets require manifold collection methods to combine the diverse data streams.


These new multimodal research approaches promise a more holistic picture of learners and of the factors that make learning successful. But multimodal data is far more diverse and heterogeneous than data from traditional learning environments. Combining data types such as text, assessments, activities, physiological data, and video for research purposes, and obtaining meaningful results from them, remains challenging.


In keeping with the nature of JCAL, we are interested in empirical studies that take advantage of multimodal data sources to enrich or investigate learning and teaching. We therefore explicitly seek research that demonstrates effects of multimodal data on the learning and teaching sciences. Literature reviews and reports on technology infrastructures that offer new insights are also invited.




Relevant topics include, but are not limited to:


*   multimodal representation of learning

*   multimodal learning behaviour modeling

*   real-time data collection

*   multimodal data mining technologies

*   multimodal data interpretation

*   open data sources

*   supporting feedback and reflection with multimodal data

*   multimodal data learning analytics

*   wearable computing for learning

*   new educational approaches with/for multimodal learning




Reviewers with appropriate expertise in learning analytics, multimodal data, technology-enhanced learning, and e-learning/computer science will be assigned according to the main subjects covered in each submission.





*   Authors are invited to submit original, unpublished research papers. All submissions will be peer-reviewed for originality, significance, clarity, and quality.

*   The journal operates double-blind peer review, so authors must provide their title page as a separate file from the main document. The title page should include the complete title of the paper and the affiliation and contact details of the corresponding author (both postal address and email address).

*   Papers should include a Practitioner Notes section outlining, in bullet-point form, what is currently known about the subject, what the paper adds, and the implications of the findings for practitioners. To maintain clarity, each section should contain no more than four bullet points of approximately 80 characters each.

*   Manuscripts should be submitted electronically via the online submission site http://mc.manuscriptcentral.com/jcal.

*   The maximum length is 8000 words; abstracts should not exceed 200 words.

*   A template can be found here: http://mc.manuscriptcentral.com/societyimages/jcal/APA6%20template.dotx

*   Further information about submission and layout can be found here: http://onlinelibrary.wiley.com/journal/10.1111/(ISSN)1365-2729/homepage/ForAuthors.html





*   Submission of manuscripts: 08 May 2017

*   Completion of first review: 30 June 2017

*   Submission of revised manuscripts: 30 July 2017

*   Final decision notification: 01 September 2017

*   Copy Editing Version: 30 September 2017

*   Publication date: 10 November 2017


Any questions should be sent to: hendrik.drachsler@ou.nl