RDM Training session at the Jisc Managing Research Data Programme workshop, Birmingham, 25 March

The RDM Training session offered discussion of the findings and experiences of the Jisc RDM Training projects and the wider RDM training landscape in Europe.

The first talk was by Andrew Cox from the RDMRose project (http://rdmrose.group.shef.ac.uk/), which has produced Open Educational Resources for teaching RDM to students and information professionals. Librarians can potentially contribute to a number of areas in research data management, yet a survey run by the project identified a lack of skills in the workforce as a key theme. Librarians need more confidence in demystifying the topic. To illustrate the point, Andrew provided entertaining examples of icebreakers used in their training sessions, where participants were asked ‘What would RDM be if it were an animal or a movie?’, and shared some of the amusing, and occasionally terrified, responses they had received.

The RDMRose learning approach acts on the premise that many librarians do not have personal experience of research, so the courses help participants understand the researcher’s perspective alongside that of other relevant support services. Practical exercises include building an RDM guidance website, conducting researcher interviews and reflective writing. Version 2 of the training materials will be released in April 2013, and a workshop will follow in May.

The second presentation was delivered by the TraD project, whose aim is to embed good research data management practice at the University of East London, in the form of training for students and a course for liaison librarians. Two introductory RDM courses were delivered to professional doctoral psychology students, the first to students in their first year and the second to more advanced students already working in the field. Both courses struggled to get participants to carry out the work requested after the course, namely completing the Jisc Research Data MANTRA (http://datalib.edina.ac.uk/mantra/) modules and providing feedback on the course. Even the offer of a prize failed to gain compliance!

The project found that the cohort already working professionally had many practical examples they could provide of RDM issues, and this made for a more interested and engaged audience. The first year students showed less interest and may have benefitted from the topic being included within a larger research methods course instead. TraD also offers another course, aimed primarily at subject librarians but with the scope to raise awareness amongst other librarians and IT services. This uses a blended learning approach with an introductory meeting followed by Moodle modules.

Jo Goodger from the University of Hertfordshire delivered the next talk, looking at their project RDMTPA on RDM training for astronomy and physics postgraduates (http://research-data-toolkit.herts.ac.uk/category/training/). The initial aim of the project had been to produce DMP online templates, a website and face-to-face training, but after conducting interviews and running a training course the project decided to expand its approach into other departments. They found that although the types of data dealt with by other disciplines varied, many faced the same issues and would all benefit from generic RDM training with best practice guidance. Jo echoed the comments of some of the projects in the RDM Support and Guidance session earlier that day, noting that universities often have strict website templates in place which can restrict the development of guidance webpages. She also highlighted that scientists in physics and astronomy were particularly concerned about media manipulation of their data, resulting in some reluctance to share; good RDM practice supports this need to present data to the media accurately and well.

The next two talks offered international perspectives on the topic. Laurence Horton of the Archives and Data Management Training Center at GESIS, the Leibniz Institute for the Social Sciences in Germany, described their training resources and events, which bore many similarities to those being created in the UK. Laurence highlighted two of the big issues they face. Firstly, German funding bodies, whilst recommending RDM, currently have no mandate for projects to share data, which means that engaging researchers in RDM needs to be framed around good science practice. Secondly, European integration means that they have difficulty creating generic training courses in areas such as intellectual property and personal data, where each country has different legislation. In the questions that followed the talk it was highlighted that the good practice argument is still important in the UK, especially as institutions begin to produce data management policies for non-funded research.

The University of Amsterdam, the largest university in the Netherlands, now recognises the need to take action to improve RDM. Key challenges in training their librarians include identifying a suitable length for courses given staff workloads, engaging those librarians who work alone, and providing updates after the initial training has been delivered. Reflections after the talk noted that less attention is often paid to incentives for support staff than for researchers.

The final short talk was presented by Sarah Jones of the DCC on behalf of the University of Edinburgh, to describe their DIY Research Data Management Training Kit for Librarians (http://datalib.edina.ac.uk/mantra/libtraining.html). This five-module course aims to build knowledge and confidence amongst librarians, re-using MANTRA online training resources alongside face-to-face sessions with guest speakers and reflective writing. Again the themes of building confidence and putting librarians in researchers’ shoes were clearly identified as necessary and successful approaches.

The discussion which followed focused on a number of issues, centred on what more needs to be done in the area of RDM training. The projects all agreed that discipline-specific examples continued to be useful and that it was important to share these more widely to make courses relevant and encourage take-up. There was also consensus that the issue of sustainability needs further consideration. It was commented that the active, middle phase of the RDM cycle is not the role of the librarian, and this is an area where projects struggle to find suitable resources. There was also interest in RDM training for IT staff; whilst there is acceptance that this is necessary, there seems to be little focus on the design of tailored courses. It was agreed that less work has been done in this area and that more is needed to get these other services involved as RDM becomes more institutionalised. Finally, one attendee asked if there was any one place which brings together all these training resources so they can be compared and the most relevant courses selected. Chair Joy Davidson highlighted the current work of DaMSSI-ABC (http://www.dcc.ac.uk/training/damssi-abc), which is looking to build a Jorum ‘window’ for RDM and identify suitable ways to classify and benchmark materials.

What are your thoughts on the progress of RDM training? Are there areas or audiences which you think still need more attention? Do you have any suggestions for how courses should be classified so they can be compared effectively? Send us your comments!

Describing and assessing research data management training: DaMSSI-ABC and the RIDLs draft criteria checklist

A key component of DaMSSI-ABC is identifying an appropriate assessment framework for research data management training materials, so that courses can be effectively found and compared. Our first step in this area is to trial the Research Information and Digital Literacies Coalition (RIDLs) draft criteria for good practice in describing, reviewing and assessing practice in information literacy training.

RIDLs was established in 2012 as a follow-on from the Research Information Network’s Working Group on Information Handling, and is a coalition of partners working together to promote the value of information and research data literacy for academic researchers. More information on the coalition can be found at http://www.researchinfonet.org/infolit/ridls/. One of RIDLs’ core activities focuses on testing, refining and disseminating the criteria for good practice in describing, reviewing and assessing training courses and other interventions in the area of information literacy. The criteria relate to all types of interventions aimed at developing researchers’ information-handling knowledge, skills and competencies, whether in the form of face-to-face sessions/courses or digital/online resources. They serve two broad purposes:

(i)     Helping those who design and deliver courses and learning resources to ensure the materials they develop are fit for purpose, as well as providing a means of describing them. The criteria might also enable a structured and recognised way of presenting such interventions through online portals such as Jorum.

(ii)   Providing a simple means of assessing whether a particular course or resource is the right option at the right time. The criteria may enable the value of courses and resources to be better assessed by potential course participants, as well as by other training providers who may be seeking to reuse the materials.

DaMSSI-ABC, working alongside RIDLs, has developed the proposed criteria into an easy-to-complete checklist (this can be found at http://www.dcc.ac.uk/training/damssi-abc), and is currently trialling the checklist with the Jisc RDMTrain projects and a number of other curation training providers in order to identify the usefulness of such a framework in a practical data management setting. RIDLs is also testing the criteria in their original format with a number of other information literacy projects and initiatives at over 20 institutions across the UK. Feedback so far has been very positive, and the checklist format has made using the criteria quicker and simpler.

It is hoped that the end result for the purposes of DaMSSI-ABC will be a set of criteria specifically adapted for the self-assessment, description and evaluation of data management training. We also envisage elements of the criteria being used as part of course description fields in the upcoming Jorum research data management ‘window’, which will be completed later in the year.

Next up, DaMSSI-ABC will be looking into benchmarking training courses and learning resources to help identify possible pathways through the content. This will allow course participants and training providers to better plan for progressive skills development. Watch this space for more updates.