A key component of DaMSSI-ABC is identifying an appropriate assessment framework for research data management training materials, so that courses can be effectively found and compared. Our first step in this area is to trial the Research Information and Digital Literacies Coalition (RIDLs) draft criteria for good practice in describing, reviewing and assessing information literacy training.
RIDLs was established in 2012 as a follow-on from the Research Information Network’s Working Group on Information Handling, and is a coalition of partners working together to promote the value of information and research data literacy for academic researchers. More information on the coalition can be found at http://www.researchinfonet.org/infolit/ridls/. One of RIDLs’ core activities focuses on testing, refining and disseminating the criteria for good practice in describing, reviewing and assessing training courses and other interventions in the area of information literacy. The criteria relate to all types of interventions aimed at developing researchers’ information-handling knowledge, skills and competencies, whether in the form of face-to-face sessions/courses or digital/online resources. They serve two broad purposes:
(i) Helping those who design and deliver courses and learning resources to ensure their materials are fit for purpose, as well as providing a means of describing them. The criteria might enable a structured and recognised way of presenting such interventions through online portals such as Jorum.
(ii) Providing a simple means of assessing whether a particular course or resource is the right option at the right time. The criteria may enable potential course participants, as well as other training providers who may be seeking to reuse the materials, to better assess the value of courses and resources.
DaMSSI-ABC, working alongside RIDLs, has developed the proposed criteria into an easy-to-complete checklist (this can be found at http://www.dcc.ac.uk/training/damssi-abc), and is currently trialling the checklist with the Jisc RDMTrain projects and a number of other curation training providers in order to gauge the usefulness of such a framework in a practical data management setting. RIDLs is also testing the criteria in their original format across a number of other information literacy projects and initiatives in over 20 institutions across the UK. Feedback so far has been very positive, and the checklist format has made applying the criteria a quicker and simpler process.
It is hoped that the end result for DaMSSI-ABC will be a set of criteria specifically adapted for the self-assessment, description and evaluation of data management training. We also envisage elements of the criteria being used as part of course description fields in the upcoming Jorum research data management ‘window’, which will be completed later this year.
Next up, DaMSSI-ABC will be looking into benchmarking training courses and learning resources to help identify possible pathways through the content. This will allow course participants and training providers to better plan for progressive skills development. Watch this space for more updates.