Monday, 24 November 2014

Conference Review: SEDA 19th Annual Conference, 13-14th November 2014


Last week I attended the 19th Annual SEDA Conference. SEDA is the Staff and Educational Development Association, and the attendees included staff working directly in educational development, learning technologists, enthusiastic teachers, and other support and academic-related staff.

The theme of this year's conference was "Opportunities and challenges for academic development in a post-digital age". From my discussions with other delegates I sense that this was perhaps the first time that this conference had pitched itself fairly and squarely at issues relating to educational technology.

The opening speaker, Grainne Conole, did a brilliant job of mapping the history of the development of educational technology, particularly innovations such as MOOCs and Open Educational Resources. It was reassuring to be in the presence of a speaker and an audience who feel positive about these topics and approach them with an optimistic mindset.

There was a fantastic choice of parallel sessions, and the first I chose to attend was by Kathryn James. The focus was on a topic related to her PhD work, "Rhetoric and reality: The drive of learning technology and its implications for academic development". Kathryn talked through the tensions and pressure points that surround academic use of learning technologies, and gave us time to chat at our tables. I was sat with a really nice group and we had a really interesting discussion about the tools we use and the challenges we face in engaging academic staff with them - 'a trouble shared...' and all that!

The next session I chose was by Rebecca Dearden and was entitled "Working in a 'third space' to create an institutional framework to underpin use of audio and video". Rebecca talked about a project she had led which looked at the legal and privacy issues related to lecture capture at Leeds University. She gave a really interesting presentation that showed how serendipity, willingness to work in all kinds of physical and virtual spaces, and teamwork can really pay off and deliver results in moving institutional policy forward.

After lunch, my third parallel session was entitled "A 'menu' of teaching approaches to transform engagement with technology-enhanced learning", by Stuart Hepplestone and Ian Glover of Sheffield Hallam University. They discussed how they had developed a menu tool to help academic staff identify appropriate technologies to embed in their learning and teaching, and to access case studies demonstrating how these technologies can be used effectively. The session also demonstrated a kind of "diagnostic" tool that allows academic staff to identify where within their existing learning and teaching activity the opportunities to use technology might lie.

Finally, after a networking break, it was time for my workshop on using audio to deliver feedback. I was worried that the topic might not appeal, but I needn't have worried, as around 20 people attended my session. I demonstrated how I have used audio and what I have found to be the benefits for both myself and my students. We also looked at the many free and low-cost tools available for delivering audio feedback, and the risks and challenges of maintaining security and privacy for students when sharing and storing it.

Some of the attendees had tried audio feedback themselves, and those that had were really enthusiastic. Those that hadn't seemed to leave full of enthusiasm for the subject, and three people approached me afterwards to say they would definitely try audio feedback in the future. My slideshow presentation is above.

Overall I really enjoyed the conference, and ended up wishing I had booked for both days. I will next year!

Posted by Claire


Wednesday, 19 November 2014

Spot On Conference - Altmetrics and other Fads (or how I learned to stop worrying about trains and love Skype)


Last week I was lucky enough to be invited to speak at the Nature Publishing-hosted Spot On Conference at The Wellcome Trust in London about my experience of using technologies to help academics communicate and build their scholarly profiles. I was delivering a short presentation as part of the parallel session ‘Measuring social impact - the tools available and whose responsibility is it?’.

I was looking forward to speaking, as it was an opportunity to meet in person my journalist collaborator on my recent piece in The Conversation on post-publication peer review, Akshat Rathi, as well as Charlie Rapple from Kudos and Jean Liu from Altmetric.com.

Sadly it was not to be. I left my home in Chesterfield at 7.30am, little knowing that I would be back in my conservatory three and a half hours later. In the end it turned out to be a good experiment in how technology can resolve such problems as a serious signal failure at Derby train station. As my train was cancelled at this important network junction, and with it every other train heading north and south, I made the executive decision to head back home and hopefully make my allotted slot at 11am.

Firstly, Twitter came to my aid: I tweeted that I was unlikely to make it to the conference and got an immediate reply from the organisers. Via direct message I requested the mobile number of my contact at Spot On, which I got within minutes, and from there I was able to put them in the picture as to my predicament. My only option was to head north-west to Matlock via a train that stopped at a multitude of small Derbyshire villages. I rang my wife to ask her to pick me up from Matlock and take me home - it all felt very Jack Bauer from the series 24 (well, almost). My decision to return home was justified when I later spoke to a colleague who had pushed on to London via Euston and arrived at 12, meaning I would have missed my presentation slot. Years of train commuting have taught me when to stick or twist.

I got back just after 11, hooked up with Spot On via Google Hangout, and was given time at the end to present. This brought another opportunity, as my slides were being delivered via Google Drive - I ditched PowerPoint and USB sticks for presentations about four years ago, and it paid off. Given that my talk was about technology and helping researchers go about their work more productively, it allowed me to update the slides on the fly. That morning I had received an email from the editor of a peer-reviewed journal asking me to write a paper for a special issue based on a blog post I had written, so this was last-minute proof of how the system can be flipped via the use of social technologies. It also highlighted how presentations can be amended to reflect last-minute changes without the good old USB stick.

It reminded me, too, that wifi can throw the occasional spanner into these techniques, but that should not always stop you. I remember presenting at CILIP back in 2012, using my Chromebook to present something from Google Drive, when there was a power cut and the wifi went down. This could have spelt the end of my presentation, but I remembered seeing someone set up a wifi hotspot using their phone and, hey presto, the show was back on the road.

So back to Spot On. Another glitch appeared when I was told via Hangout chat that the presenter’s laptop would not be able to host the Hangout on the screen, so we instantly switched to Skype. I was then able to appear on the screen and deliver my slides remotely. The presentation itself was titled ‘Altmetrics and Other Fads - Helping Researchers Through the Social, Technology and Innovation Maze’. Despite the rather flippant title, it was not intended as such; rather, it highlights the constant change in technology and how many in academia are wary of engaging with those at the cutting edge. This is understandable, as technologies come and go and all too often they don’t do what you want. The idea behind the presentation was that researchers and students need guidance in discovering, learning and using the many new technologies. If teachers have a pedagogy as their reason to engage with technology, what do researchers have?
It was a shame that I was unable to attend Spot On in person, as it looked a very engaging two-day conference. You can view the Twitter feed for more information: https://twitter.com/hashtag/solo14

Here are my slides from the conference.




Friday, 31 October 2014

I only come here for the comments - Websites that allow researchers to review and comment on others’ work

Image © CC BY NASA Goddard Space Flight Centre http://bit.ly/1wPSD0m


The idea of public post-publication peer review of journal articles would have been considered heresy just a couple of years ago, but in recent years there has been a growth in post-publication reviewing options. The following blog post looks at a few of them and discusses the idea of leaving post-publication reviews, and whether it is all as bad as some in the academic community seem to think it is.


The journal publishing model has been quite consistent for many years now: authors submit their paper to a peer-reviewed journal; reviewers read and analyse it and give feedback; and, depending on how good or bad that feedback is, the paper is eventually accepted or rejected, which may lead the authors to attempt a resubmission to another journal. For the papers that do get published, that often ends the cycle of conversation between peers, unless the research is discussed at an event such as a conference or appears in the media, where comments can easily be left - comments which are sometimes scathing, unjustified or plain unhelpful. And there is the rub: when does post-publication peer review become post-publication comment, and how different are they?


Unlike many other things that appear on the Web, such as music, film and art, where comments and critique are normal and help others make better-informed judgements on what to consume, academic articles only surface for review and critique when they are used as part of someone’s research or teaching. Often there is no critique, good or bad, of the research post-publication - just an important line or conclusion used to help build the hypothesis of new research, agreeing or disagreeing with previous works. To say there was no post-publication review system previously would be untrue, as the likes of the BMJ’s editors and others have accepted e-letters and Rapid Responses about research published in their journals, and blogging and social media have more recently offered platforms for researchers to discuss others’ work.


From my experience, many researchers feel uncomfortable speaking about other people’s research, which is understandable: nobody wants to hear ill of their own hard work, and that is what they are potentially opening themselves up to. Take, for example, the comments on sites such as YouTube or the Daily Mail website, which can often become personal, malicious and quite damning. The issue is that everyone has an opinion, on everything from Syria to fracking, and the Web has facilitated that opinion culture to the point where ‘trolling’ is now an acknowledged and serious problem. Academic publishing is different; certainly academics are more than capable of barbed comments, but making unjustified ones online will help no one, especially in the advancement of knowledge through discourse.


The journal publishing model has been criticised for being out of touch with modern publishing, and rightly so, in that a piece of research which can take years to complete can then take nearly as long to get published. By that time things may have moved on in that area of research: new methods, technologies and ideas may have surfaced. Post-publication reviews can help highlight this, and may also make researchers aware of potential future collaborators or of similar research being undertaken.


For post-publication review to really be productive it has to be open, unlike sites such as YouTube, which have allowed aliases, and therefore trolls, to flourish. Obviously not every piece of research commands a post-publication review, and given estimates that anywhere from about 12% to 90% of papers are never cited, it is pretty likely that not all papers will get reviews, or even have the mechanisms to be reviewed. We also have to remember that while some areas of research are less reliant on the journal publishing model, this does not mean post-publication review is not for them; in disciplines such as the humanities it may have just as much use.
Academic debate using the many Web 2.0 and social media tools freely available has only been embraced by a small percentage of academics. Interesting papers are more likely to be shared via the likes of Twitter, Google+ and LinkedIn than discussed, but considering that it is far easier and less time-consuming to share content on the web than to review it, that is understandable. Reviewing takes time, and unlike reviewing a film, which is foremost a subjective piece of writing focused on whether you enjoyed it and whether it was well made, peer review requires more considered thought. Research is measured on whether it was well designed and conducted, not on how well it was written (although that does come into the formal review process - but more in terms of whether it is understandable, not whether it uses long words to impress). That said, I will cover JOVE below, which helps with that second part of the review process.


The debate on what is the best way forward for post-publication review will continue, and as with other topics such as the measurement of research, there appears to be no ‘silver bullet’. Instead there is a collection of sites and tools operating in silos, all offering to solve one problem: the lack of post-publication discussion and assessment. Below is a list of some of the main tools and sites offering some kind of comment, discussion or review system. It is not exhaustive or comprehensive, but it will give you some idea as to what they are and do.


PLOS ONE
PLOS ONE states its mission as “accelerating the publication of peer-reviewed research”. First and foremost, PLOS ONE is an open access journal that, unlike many traditional journals, has sped up the publication process and ensures authors retain copyright. It is not a post-publication review site outright, but it does allow users to comment on published research, much as newspapers allow visitors to comment on their news articles. Commenting on research is in essence less formal than post-publication reviewing: the reader has the remit to post something as in-depth as they wish. They may want to write just a few lines about one part of the research, such as the methodology, results or conclusion, or a longer, more in-depth review of the whole paper. To comment on papers in PLOS ONE you must be a registered user and identify any competing interests. The rules are quite simple: anyone commenting on someone else’s research must not post the following:
  1. Remarks that could be interpreted as allegations of misconduct
  2. Unsupported assertions or statements
  3. Inflammatory or insulting language
Anyone breaking these rules will be removed and their account disabled. Obviously this does not stop them creating new accounts, but that will always be a problem for many interactive websites.
http://www.plosone.org/
PubMed Commons
PubMed is a huge, publicly accessible search engine that provides access to the MEDLINE database of references and abstracts on life sciences and biomedical topics. It recently launched PubMed Commons to enable authors to share opinions and information about scientific publications in PubMed.


To be eligible to use PubMed Commons you have to be an author of a publication in the database, preventing just anyone from going in and leaving comments. The email addresses of eligible authors have been collected from the NIH and the Wellcome Trust, and from author records within PubMed and PubMed Central. You can also ask an eligible colleague to invite you into the system.


© PubMed
The guidelines for PubMed Commons are more stringent than those of PLOS ONE and other such sites. Commenters must use their real names and, again, disclose any conflicts of interest. By contributing to Commons they grant other users a worldwide, royalty-free, non-exclusive, perpetual licence under the Creative Commons Attribution 3.0 United States License. Again, the usual rules of not posting inflammatory, offensive or spam comments apply. Full guidelines can be viewed on the PubMed Commons site.


Open Review
Open Review is a new feature within the popular academic social network and research sharing platform ResearchGate. Open Review allows users to publish an open and transparent review of any paper they have read, worked with, or cited. ResearchGate says it is: “Designed to approach the evaluation of research in a different way, Open Review encourages scientists and researchers to focus on one key question: Is this research reproducible?” Users can discuss the articles they click on, with a slant more towards asking questions about the publication than commenting on or reviewing it.




F1000 Research

© F1000


F1000 - standing for Faculty of 1000 - is made up of several services. F1000 Prime is a personalised recommendation system for biomedical research articles. F1000 Research is an open science journal offering post-publication peer review of research with underlying datasets. Finally, there is F1000 Posters, an open repository for conference posters and slide presentations.
F1000 Research has a system of open peer review which publishes referee responses and allows for replies by the authors and reader comments - so a bit of everything. In addition, they offer incentives for reviewers, including a 50% discount on article processing charges for the 12 months following the submission of a referee report, and a six-month free personal subscription to their sister service F1000 Prime. Users of F1000 can track the conversation and even discuss the article at the bottom of the page, so the entire process - paper, review and discussion - takes place on one page. Referees’ reports can even be cited in F1000 Research and are published under a CC BY licence. A DOI (digital object identifier) is assigned to every referee report, so it can be cited independently of the article.




PubPeer
PubPeer refers to itself as the online journal club, and allows users to search for papers via DOIs, PMIDs, arXiv IDs, keywords and authors, amongst other options. PubPeer’s goal is to create an online community that uses the publication of scientific results as an opening for discussion among scientists. Researchers can comment on almost any scientific article published with a DOI, or on preprints in the arXiv. You can also browse the list of journals with comments, although presently it is rare to find a journal with more than a couple of comments. First and last authors of published articles are invited to post comments - I’m presuming the authors in between also get a say. Unlike some of the other tools mentioned, PubPeer allows anonymous commenting, which could open the door to more trolling and abusive behaviour, as reviewers feel an extra level of protection in what they say; one researcher has filed a lawsuit over anonymous comments which they claim caused them to lose their job after accusations of misconduct in their research.




Publons
The primary aim of Publons is to help researchers get credit for peer review. Whilst writing for peer-reviewed journals has often been seen by many as handing their hard work over to someone else - the publishers - to benefit from financially, at least authors gain increased kudos, profile and knowledge, and the potential for promotion within their organisation. Peer reviewing can bring similar rewards for a researcher’s CV and promotion prospects, and reviewers get to see emerging research, but the anonymous nature of much of it means less opportunity for profile building, even though it is no less a part of the research publishing cycle. Publons sets out to work with reviewers, publishers, universities and funding agencies to turn peer review into a measurable research output. It collects peer review information from reviewers and from publishers, and uses the data to create reviewer profiles with publisher-verified peer review contributions that researchers can add to their CVs. Publons states that “reviewers control how each review is displayed on their profile (blind, open, or published), and can add both pre-publication reviews they do for journals and post-publication reviews of any article.”
https://publons.com/

The Winnower
The Winnower is possibly the least academic-looking post-publication platform of them all, but that should not put readers off; in fact it should have the opposite effect. It is an attractive site that sets out its stall on the homepage with the statement that “The Winnower is founded on the principle that all ideas should be openly discussed, debated and archived.” As with so many new academic tools and platforms, it began life thanks to a PhD student, namely Joshua Nicholson from Virginia Tech. It provides an interesting new angle, looking at research from both ends of the spectrum - work that has made a big impact and research that was retracted - with its own ‘Grain’ and ‘Chaff’ pages. The Grain features publications with more than 1,000 citations or an Altmetric score above 250, whilst the Chaff looks at papers that were pulled from publication and gives their authors an opportunity to talk about their research rather than just appearing on a ‘name and shame’ list. The Winnower is obviously still in its early stages, with only a handful of reviews and publications so far, but not every post-publication review site can begin from the position of PubMed. It is certainly one worth keeping an eye on.

https://thewinnower.com/
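To make the Grain and Chaff distinction concrete, here is a minimal sketch of that rule in Python. The thresholds are the ones described above; the function and field names are my own, purely illustrative, and not part of The Winnower's software.

```python
# Illustrative sketch only: the 'Grain'/'Chaff' split described above,
# expressed as a simple rule. Thresholds come from the post; names are
# hypothetical and not The Winnower's actual implementation.

def classify(citations: int, altmetric_score: float, retracted: bool) -> str:
    """Return which Winnower page a paper would notionally sit on."""
    if retracted:
        return "chaff"    # pulled from publication
    if citations > 1000 or altmetric_score > 250:
        return "grain"    # high-impact work
    return "neither"      # most papers fall outside both lists

print(classify(citations=1500, altmetric_score=12, retracted=False))  # grain
print(classify(citations=3, altmetric_score=5, retracted=True))       # chaff
```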


JOVE
The Journal of Visualized Experiments has now been around for almost a decade, but it is only in the last couple of years that it has really broken through, and it is now subscribed to by many university libraries. JOVE is a PubMed-indexed video journal with a mission to increase the productivity of scientific research. Although not at the forefront of JOVE's priorities, it does allow comments on the published research videos.




Peer J
Peer J is an open access, peer-reviewed scientific journal that focuses on publishing research in the biological and medical sciences. It received substantial backing of USD 950,000 from O'Reilly Media, whose founder Tim O'Reilly is famous for popularising the term Web 2.0. It is published by the company co-founded by publisher Peter Binfield (formerly of PLOS ONE) and CEO Jason Hoyt (formerly of Mendeley), who obviously have a lot of experience in scholarly communications.


© CC BY Peer J


Peer J has a points system for authors and commenters as an incentive to publish and comment on research. Anyone who has ever argued that citations, h-indexes and Twitter follower counts are just a multi-levelled multiplayer game will get this. The point values are listed below, followed by a small sketch of how such a tally might be computed:
  • Be an academic editor on a PeerJ article = 100 pts
  • Be an author on a published PeerJ article = 100 pts
  • Make your manuscript reviews public on a PeerJ article = 35 pts
  • Submit an "open review" as a reviewer on a PeerJ article = 35 pts
  • Be an author on a PeerJ PrePrint = 35 pts
  • Be an academic editor on a rejected PeerJ article without reviews = 35 pts
  • Have an answer on a question accepted = 15 pts
  • Have feedback deemed "very helpful" by an author of a PeerJ PrePrint = 15 pts
  • Receive an up vote for an answer = 10 pts
  • Receive an up vote for a question = 5 pts
  • Receive an up vote for feedback on a PeerJ PrePrint = 5 pts
  • Receive an up vote for reply to question or comment = 1 pt
  • Have first feedback approved in moderation on a PeerJ PrePrint = 1 pt
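To show how the points above might add up in practice, here is a minimal sketch in Python. It uses a subset of the values listed above; the contribution names and the tally function are hypothetical illustrations of my own, not part of PeerJ itself.

```python
# Illustrative sketch only: a hypothetical tally of PeerJ-style contribution
# points using a subset of the values listed above. The keys and this function
# are my own shorthand for demonstration, not PeerJ's actual system or API.

POINTS = {
    "academic_editor": 100,          # academic editor on a PeerJ article
    "published_author": 100,         # author on a published PeerJ article
    "public_manuscript_review": 35,  # manuscript reviews made public
    "open_review": 35,               # open review submitted as a reviewer
    "preprint_author": 35,           # author on a PeerJ PrePrint
    "answer_accepted": 15,           # answer to a question accepted
    "answer_upvote": 10,             # up vote for an answer
    "question_upvote": 5,            # up vote for a question
    "reply_upvote": 1,               # up vote for a reply to a question/comment
}

def tally(contributions):
    """Sum points for a list of (contribution_type, count) pairs."""
    return sum(POINTS[kind] * count for kind, count in contributions)

# Example: one published article, two open reviews and five answer up-votes
print(tally([("published_author", 1), ("open_review", 2), ("answer_upvote", 5)]))
# 100 + 70 + 50 = 220
```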


There are tables of the top authors and reviewers, which can be filtered by topic area and publication date, and of those who have asked the most questions and given the most answers. The questions-and-answers aspect offers a different angle on the commenting process, as it potentially opens up further dialogue between authors and commenters. At present, though, there does not seem to be much activity in this area.


Peer J state: “Our annotation system goes beyond just answering questions or finding answers. Everyone from authors, editors, reviewers, and visitors to PeerJ are contributing in some way. Often, these are "hidden" contributions to the body of science that can go unrecognized. The points that we are starting to show on profile pages are just a light way to surface this participation.”


As for the points ranking system, it will appeal to some researchers - those with a competitive edge - but on the flipside it will feel uncomfortable to others who do not want to see their work captured in numbers, and that applies to any kind of metric, not just Peer J's. Nevertheless, it is an interesting take on the publishing model and one that will continue to create interest and debate.


        
PaperCritic
One of the first proper research commenting tools, PaperCritic appears to have ceased operating, but it is still worthy of a brief mention. Built using the Mendeley API, PaperCritic connected to a user’s Mendeley account and allowed them to comment on research hosted within Mendeley’s huge database of references. Its blog, Facebook and Twitter accounts all fell silent in 2012, leading me to believe it is no longer running. The chances are that Mendeley will at some point create its own commenting and reviewing system, so it is still well worth the mention.




There seems to be some difference between the notions of reviewing, discussing and commenting, something Kent Anderson wrote about in The Scholarly Kitchen earlier this year. Anderson summarised it thus: “Today’s commentators seem to have many axes to grind. Far too often, commentary forums degrade into polemical attacks with win or lose dynamics at their heart. The pursuit of knowledge and science isn’t the goal. Capitulation of one combatant to another is.” Anderson questioned the validity of the comments being championed by publications and websites, arguing that they could never be considered in the same light as peer review.


There is a need for both: comments can be insightful and can highlight useful content for the original authors or other readers without going into in-depth reviews. On the flipside, they can be malicious and unfounded, and can clog up the whole knowledge process if left unmoderated and anonymous. Peer review may not be perfect, but as the social web becomes more useful as a platform for discussion and knowledge sharing, it makes sense that other options are explored, even if they are supplementary. This is increasingly the case for altmetrics, first seen as an alternative to the traditional measurement of citations and now argued to be more of an additional indicator than a measurement.

The real problem is that, as with many other technologies and platforms for communication, we run the risk of not being able to see the wood for the trees. Post-publication review platforms need to be explicit in their aims and explain them clearly to readers and reviewers. As with social media, it is unlikely that every researcher will use these tools unless they become standardised and part of the research cycle. They remain an option, as do academic discussion lists, where some of the most insightful and, on occasion, barbed communications take place. Post-publication commenting is happening right now, and someone out there may have already commented on your research - whether you respond remains your choice.


Recommended reading:





Tuesday, 28 October 2014

The Information Resources Academic Development Group

Within the Information Resources Group here at ScHARR we have an academic development group, cunningly titled the Information Resources Academic Development Group (IRADG). In line with the other academic disciplinary groups within our section of Health Economics and Decision Science, we meet every other month to discuss issues relating to our personal and professional development. Due to the diversity of roles that we have, we alternate the topics of our sessions. We have occasional visitors to give us an insight into areas that we feel we need to know more about, but the sessions are generally led by members of our team.

Our topics rotate between teaching and research. To give a flavour of what we do, in our last meeting we discussed our research plans for the next three years and how our applied project work, particularly around literature searching, could feed into more methodological research, such as the work undertaken by Ruth Wong on “Assessing searches in NICE Single Technology Appraisals: Practice and Checklist”. In IRADG we are keen to look at a wider variety of areas, such as grey literature searching and diagnostic searching, to name but two.

In our last teaching-focussed meeting, we discussed our many and varied teaching commitments and also discussed opportunities for further development. Within our team, we have people who have undertaken the Sheffield Teaching Assistant, become Fellows of the Higher Education Academy, undertaken Certificates in Learning and Teaching, and undertaken Masters-level qualifications in teaching, and our own Helen Buckley Woods is currently in the process of taking an EdD in Higher Education. Phew!

Within the group, we all take our development and progression seriously, and the IRADG provides an excellent opportunity to meet, discuss and plan how to take this forward. I have been chairing this group since its inception in 2009 and whilst I am on maternity leave I am handing over to Ruth Wong to continue the good work! 

Thursday, 23 October 2014

Workshop and presentation slides on Altmetrics from Internet Librarian International 2014

This week myself and Claire Beecroft made our yearly pilgrimage down to London to attend Internet Librarian International 2014. For me it was my fifth trip and for Claire her fourth, and as far as we can remember, another year where we gave a joint talk.
We delivered two pieces of work: firstly, a day-long workshop run by myself and Cat Chimes from Altmetric.com, with contributions from Claire and Dr Ehsan Mohammadi from the University of Wolverhampton. The slides and abstract can be viewed below.

Whilst Claire attended to help facilitate the Monday workshop on Altmetrics and co-deliver a parallel session on the same topic on Wednesday, I stayed down for the whole conference.
Metrics were very much a large part of day two, with talks ranging from measuring what students want to how libraries communicate with their users. There was a strong theme of creating positive change, whether through Rachel Neaman's plenary on digital inclusion or new ways libraries can work, with everything from 3D printing to gamification thrown into the melting pot.
You can view the entire programme here for more information. 
http://conferences.infotoday.com/documents/212/ILI2014_Programme.pdf

The 3,000-plus tweets from the conference can be viewed here: http://eventifier.com/event/ili2014/

The conference ended with a session on the ILI2014 app, developed over the course of the conference, which sadly I had to miss to catch my train back home. I did, however, stay around long enough to find out that I was one of the winners of the conference selfie competition, along with Bryan Kelly from CETIS and Toun Oyelude from Kenneth Dike Library, University of Ibadan. The winning image (brace yourself) is below. We all won a box of Green and Black's chocolates, so it was well worth the effort.



Here are the slides from our conference workshop - excluding Ehsan Mohammadi's, at his request, as they will be used at a future event.



Here are the slides from our conference presentation on Altmetrics



I also got to see the story of the Anonymous hacker group, which has been turned into a hilarious and at times scary musical, at the Royal Court Theatre in Sloane Square. It was quite fitting to find myself chatting before the show with a guy from the U.S. whose job it was to stop such groups gaining access to websites and databases for the likes of the U.S. government. I never really got to ask him what he thought of Anonymous and Lulzsec. I've posted a video of the show and cannot recommend it highly enough; even if you are not that interested in the Web, it has a real human interest angle to it and is very, very funny.







Monday, 20 October 2014

Nature journals - all titles now available via the University library

Last November the University Library conducted a survey for staff and students. Many of our customers commented that although we had access to some Nature journals, we did not subscribe to some important titles.
 
After negotiations with the publisher, we now have a subscription to the complete Nature Publishing Group (NPG) collection! This includes all ejournals published by Nature Publishing Group.
 
For a full list of titles and access to the Nature website, please see http://www.sheffield.ac.uk/library/cdfiles/nature.

Or to access individual titles, please search for them on StarPlus.

Image: Nature.com



Thanks very much to Anthea Tucker, Liaison Librarian (Faculty of Medicine, Dentistry and Health) at The University Library for the heads up. You can read their blog here.

Monday, 13 October 2014

Can cartoons teach medical students to be better doctors?


Still from one of my animations using GoAnimate!

Around three years ago, I took on a new teaching commitment: delivering a "Masterclass ILA" (Inquiry-Based Learning Activity) for our Year 3 and 4 students at the Medical School at the University of Sheffield.

The aim of this short course was to help students develop the skills to have difficult conversations with patients about how decisions are made on which drugs should be funded on the NHS and why some drugs which can be life-extending are not considered "cost-effective".

Given the overall aim of the course, I needed to find a method of assessment that would enable the students to demonstrate to me that they had developed their communication skills and their knowledge of health economic decision-making. The obvious route to go down here would have been some kind of role-play scenario, as medical students are often assessed in this way and it would be a familiar format to them. However, when attempting to book rooms for the course, it quickly became clear that a room with enough space for them to act out their role-plays wasn't available; and the more I thought about it, the more I wanted the students to be able to watch themselves and assess their own performance as well as that of their fellow students.

After seeing a demonstration of a free online animation tool (Extranormal.com), the germ of an idea began to form. I looked at various online animation tools, and eventually decided to use "GoAnimate", a free online animation tool that looked relatively easy to use and had an overall style I thought the students would enjoy.

Having spent a couple of days learning how to use "GoAnimate", I quickly realised that I wouldn't be able to ask the students to create their own animations, as the time they would need to invest in learning it was simply too great. So I decided instead to give each student a question, posed to them by a patient, based on a real-life scenario that we had been using throughout the course. They wrote a script for a consultation with the patient, during which they attempted to convey to the patient, in an empathetic style and using lay terminology, the answer to their question. I would then produce an animated version of each consultation myself, we would watch these as a group in the final session of the course, and the students would be asked to comment both on their own work and on that of their fellow students.

When I first broached this with the students, I will admit that they were rather taken aback, though intrigued. It took longer than I expected to create the animations (12 of them in all), but the final session was successful and fun. Understandably the students were rather amused by seeing themselves as animated characters, but they also enjoyed watching the scripts being read by the patient and doctor characters, and from the feedback they gave it was clear that once they heard their scripts being spoken by the characters, they could see both the strengths and weaknesses in what they had written - which was exactly what I had hoped for.

When I have spoken about this with various colleagues over the last couple of years, most people have been intrigued but rather surprised, and perhaps a little concerned that there could be any place for 'cartoons' in the medical curriculum. I was delighted, therefore, to read a story on the BBC news website around a year ago about a doctor who had used comic book techniques in their own teaching with medical students, to great success! You can read more about it here: http://www.bbc.co.uk/news/health-25112785

I've now run this course three times, and my animation is getting better... I also pair the students now, so there are only six animations to prepare, which makes it a little easier on me. The animations are uploaded to YouTube as "unlisted" videos, and the students are given the link so they can watch the video again later, show their friends, and so on.

It did require a small investment of time, around two days in total, to master the animation tool, but it has been very rewarding to use animation in an assessment context, and I would recommend it to anybody who might otherwise use role-play techniques as a form of assessment. Of course you can also use animation to give feedback, and I did exactly that, as I thought it was only fair that if I was animating the students, I should animate myself. So I'll give myself the final word, by showing you my animated self, feeding back to my most recent group of students:



Posted by Claire