
Thursday, 16 February 2017

Gone In 60 Seconds

Mark Clowes
Mark Clowes braves the tough crowd that is the HEDS Section Meeting to give a one-minute presentation about his work.

One of the stranger things about working in an academic school rather than a traditional library is the departmental meeting, where you can find yourself sitting alongside people whose jobs have very little in common with your own.

The ScHARR Information Resources team sit within the Health Economics and Decision Science section of ScHARR, surrounded by systematic reviewers, economists, modellers and statisticians.   To help these different professional groups understand each other better, section meetings begin with quickfire one-minute presentations from each group known as "Gone In 60 Seconds".

As a Nicolas Cage fan (I even watch his really bad films, and God knows there are plenty) - and, not least, because it was my turn - I agreed to take part. But what would be "gone" in those 60 seconds?  My career prospects?  My credibility with colleagues (if I ever had any)?  Would I be challenged for hesitation, repetition or deviation?

I enjoy giving presentations and don't usually get too nervous; but the audience for this one (and the timing, with the expiry date of my contract approaching) made me particularly keen to impress. 

I decided to give a first airing to a topic on which I hope to present at one or two conferences in the summer - using a text-mining and data visualisation app (VOSviewer) to deal with a large number of references retrieved by a systematic review. I chose this topic to demonstrate to colleagues that IR staff were continuously experimenting with new technology and ways of working, and - since it has the potential to influence the scope of future review projects - because it would have relevance to all the different groups in the room. An added bonus was that I could display some pretty images of the "heat maps" produced by VOSviewer on the screen, which would take the audience's eyes off me.

The short format required more preparation than usual - generally I don't like to work from a script, preferring to maintain a conversational tone and improvise around bullet points - but my initial attempts to do so on this topic ran significantly over time.

In the end, I realised I was going to have to write out what I wanted to say in full - initially using free writing with pen and paper, then gradually refining and paring it down until I could beat the kitchen timer countdown (this was one of those tasks I could only have done working at home - colleagues would think I had lost the plot walking around reciting the same presentation over and over again).

I didn't want it to be a dry, technical presentation (in any case, there wasn't enough time to explain in depth how the software worked), so instead I came at it from the angle of "why is this useful?" - i.e. for dealing with the common problem of a set of references too large to sift in the traditional way, but potentially too important to ignore.

On the day, I think it went pretty well - people seemed engaged with what I was saying, although a slight technical hitch with my slides meant that I didn't quite manage my closing sentence before I (5...) was (4....) ruthlessly (3...) cut (2...) off (1...)

Monday, 6 February 2017

The Systematic Review Toolbox

Image of Anthea Sutton
Anthea Sutton
Last week I was invited to demonstrate the Systematic Review Toolbox (SR Toolbox) at our in-house Systematic Reviews Issues and Updates Symposium (SYRIUS) at ScHARR (School of Health and Related Research) at The University of Sheffield.  The symposium provides an opportunity for researchers to get together and share updates of systematic review methodological work being undertaken in ScHARR, so it was a great opportunity to promote the SR Toolbox and the resources it contains.  The SR Toolbox slot on the symposium programme consisted of a short presentation introducing the toolbox with a potted history of its creation and development.  This was followed by a live demonstration, which included some tips on using the toolbox.  Here are a few of those tips:
1.  You can use “Quick Search” to search for more than just the tool name
You can use “Quick Search” to search for tools by name, but you can also use it to search the titles and descriptions of tools. For example, if you’re interested in finding tools for critical appraisal, type it into the “Quick Search” box and you will find tools that mention “critical appraisal” in either the title or description of the tool. However, be aware this is exactly what it says it is, a “Quick Search”, so if you want a comprehensive list of all the critical appraisal tools in the toolbox, be sure to use “Advanced Search”.

2.  In “Advanced Search” selecting more than one feature will search for the features using “AND”
When using “Advanced Search”, it is important to note that if you select more than one feature, the toolbox combines them with the Boolean operator “AND” and will only return tools that meet all the features you selected.

3.  If you want to browse all the tools in the toolbox…
For Software Tools, tick the “Any” box underneath where it says “Check ‘Any’ if not concerned about any specific features”.
For “Other Tools”, check all four “Find me” boxes.

4.  The toolbox provides references to tool-related journal articles where available
The tool records in the toolbox link to/reference journal articles where they are available.  This might be an article about the development of the tool or a review of using the tool by a systematic reviewer who’s tried it out. If you know of any articles relating to tools in the toolbox, please get in touch and we will update the tool record accordingly.
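The "AND" behaviour described in tip 2 can be sketched in a few lines (a minimal illustration only - the tool records and feature names below are invented, and this is not the toolbox's actual implementation):

```python
# Illustrative sketch of "AND" feature filtering, as in the Advanced Search.
# All tool records and feature names here are hypothetical examples.
tools = [
    {"name": "Tool A", "features": {"critical appraisal", "data extraction"}},
    {"name": "Tool B", "features": {"critical appraisal"}},
    {"name": "Tool C", "features": {"data extraction"}},
]

def advanced_search(selected_features):
    """Return names of tools offering ALL selected features (Boolean AND)."""
    wanted = set(selected_features)
    return [t["name"] for t in tools if wanted <= t["features"]]

# Selecting two features returns only tools that offer both:
print(advanced_search(["critical appraisal", "data extraction"]))  # ['Tool A']
```

Selecting a single feature would return every tool offering it; each extra feature you tick narrows the result set further.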
I concluded the session by discussing the “community-driven” aspect of the toolbox.  Systematic reviewers and tool developers are encouraged to submit tools to the toolbox via the “Add a New Tool” feature.  The remit of the toolbox is to catalogue both software and other types of tools/supporting mechanisms (such as checklists, guidelines and reporting standards).  So if you discover a new tool that meets these criteria, please share it with your systematic review peers and colleagues by submitting it to the toolbox, which will help to continue the development of this really useful resource.

This post originally appeared on the Systematic Reviews Toolbox website and has been reproduced with permission.

Friday, 3 February 2017

Disentangling the academic web: what might have been learnt from Discogs and IMDB

Image of Andy Tattersall
Andy Tattersall
In recent years there has been huge, rapid growth in the number of online platforms and tools made available to academics carrying out their research activities. However, for many, such choice can lead to decision fatigue or uncertainty as to what is most appropriate. Andy Tattersall reflects on the success of Discogs and IMDb and considers what problems a similar site dedicated to academic research might help to solve, from version control and unique identifiers to multiple, diverse research outputs and improved interactions with data.
Academia can always learn a lot from the rest of the world when it comes to working with the web. The project 101 Innovations in Scholarly Communications is a superb case study, highlighting the rapid growth in academic and associated web platforms. As a result there is an increasing problem for academics when they come to choose their platform or tool for carrying out their work on the web. Choice is good, but too much can lead to decision fatigue and anxiety over having to adapt to more and more new tools and make decisions as to their value. In the last decade various organisations, academics and start-ups have noticed gaps in the market and created tools and websites to help organise and communicate the work of academics. This is now arguably having the negative effect of researchers not knowing where to invest their time and energy in communicating, sharing and hosting their work, as no one can use every platform available. Even when many of them are linked together, there are still issues around their maintenance and use.
In hindsight, academia could have learned from two successes of the internet era. Discogs and the Internet Movie Database (IMDb) are two of the most popular websites on the planet. Each is authoritative and seen as the ‘go to’ platform for millions of users interested in music and film respectively. IMDb is ranked at #40 and Discogs at #799 in Alexa, a global internet ranking index of websites. IMDb was one of the original internet sites, launched in 1990, with Discogs arriving a decade later in 2000. Whilst there are other similar websites, few even come close to their user numbers and the huge amount of specialised content they host. By contrast, academia has tried desperately to place large swathes of information under umbrellas of knowledge, but it all feels a bit too much like herding cats.
Image credit: Tangled Weave by Gabriel. This work is licensed under a CC BY 2.0 license.
Academia has always made use of the web to have discussions and host research and institutional websites, but has failed to control the number of newer platforms that promise to be an essential tool for academics. Over the last decade – and notably in the last five years – hundreds of tools that aim to enhance a researcher’s workflow, visibility and networks have been created. Many of these do indeed offer a service: Figshare hosts research outputs; Mendeley manages references; other tools track attention. They are all superb and offer something befitting academia in the 21st century. The problem for many academics is that they struggle to engage with these tools due to the overwhelming number to choose from. If you want to manage references, do you use EndNote, Mendeley, Zotero, RefMe, ReadCube or Paperpile? If you wish to extend your research network do you sign up for ResearchGate, Google Scholar, Academia.edu, Piirus, LinkedIn or even Facebook? This is before we tap into the more niche academic social networks. Then there is the problem of visibility: how do you make sure your fellow academics, the media, fund holders or even members of the public can find you and your work? ORCID obviously solves some of this, but it can be seen as a chore and another profile that needs configuring and connecting.
As research in the 21st century continues on its current trajectory towards openness and impact, and as scholarly communications develop, there will no doubt be yet more tools and platforms to deal with all that content and communication. If we think about making data accessible and reusable, post-publication open peer review, as well as making other research outputs available online, we may see a more tangled web than ever before.

What Discogs could teach us

As with so many of the post-Web 2.0 academic interactive platforms, content is driven by the users - in this case academics and supporting professionals. Of course, a large number of formal research platforms have remained as they were, hosted by institutions, research bodies, funders and publishers. Yet more and more research outputs are being deposited elsewhere, such as GitHub (which has a comparable internet ranking to IMDb), Figshare, SlideShare, ResearchGate and Google Drive, to give just a few examples.

How can we compare the research world with Discogs?

In my mind, Discogs is not too dissimilar to the research world and all of its outputs. Listed below are some of the similarities between them; those who have used Discogs will hopefully make the connection more quickly than those who have not.
Image of a table comparing academia with Discogs
A comparison of academia and Discogs 
IMDb and Discogs can be searched in various ways, all of which allow a person to drill deeper into an area of the database or move around serendipitously using the hyperlinks. So with Discogs you may know the title of a song but not the artist, or you may know which label it was released on. You may also be keen to track down a particular version of a release based on geographical, chronological or label data. The search functions of Discogs may not be as complex as those of a research database such as Medline, but for the typical Discogs user this is not essential.
Image of Discogs webpage
What are the big problems a Discogs- or IMDb-type site could solve?
Version control
With growing interest in academic publishing platforms that capture the various stages of publishing research, there is a problem of ensuring those searching for that research find the version they really want. We have the final, peer reviewed, accepted and formatted version; the report the paper may have contributed to; the pre-print; the early draft; the research proposal; and the embryonic research idea. Research platforms such as ROI aim to capture as much of this research process as possible.
Unique identity
ORCID is a great tool for aligning the work of one person to their true identity (especially so for early career researchers or academics who change their name mid-career, for example). You do not have to have a common surname like Smith, Brown, Taylor or Jones to be mistaken for another researcher; academics with less common names have this problem too. If a researcher publishes using their middle initial and then without it, this can create multiple identities in some databases, and tying them all together is not always straightforward and can be time-consuming. In Discogs, an artist or band is listed with all name variations collected under the most commonly used title. ORCID allows this, but sadly the problem is already very extensive.
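The Discogs approach of collecting name variants under one canonical title can be sketched very simply (all names below are invented for illustration; this mirrors neither Discogs' nor ORCID's actual data model):

```python
# Collecting name variants under one canonical identity, Discogs-style.
# The researcher names used here are hypothetical examples.
name_index = {}  # name variant -> canonical identity

def register(canonical, variants):
    """Map the canonical name and every known variant to one identity."""
    for name in [canonical] + variants:
        name_index[name] = canonical

register("J. A. Smith", ["J. Smith", "Jane Smith", "Jane A. Smith"])

# Any variant now resolves to the same canonical record:
print(name_index["Jane Smith"])  # J. A. Smith
print(name_index["J. Smith"])    # J. A. Smith
```

A lookup on any variant lands on the same record, which is exactly what databases that treat "J. Smith" and "Jane A. Smith" as different people fail to do.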
Additional research outputs
The mainstay of academic output is the journal paper but that is not the case for some areas of research. There are artistic performances, computer code, software, patents, datasets, posters, conference proceedings, and books, among others. Some stand alone, whilst there are increasing numbers of satellite outputs tied to one piece of research. For example, in Discogs we might think of the LP album as the definitive item and of the single, EP or digital release as outputs resulting from that. For research this may be the report or journal paper with attached outputs including a poster, dataset and conference presentation.
Interaction with the research data
Both Discogs and IMDb allow users to interact with their huge databases of information. Users can set up accounts, add music and films to their personal collections, leave reviews and contribute knowledge. To flip that into an academic context, that might mean users saving research artefacts to a reference management package, leaving open peer review comments and contributing their own insights and useful resources.
Such a platform would not operate in isolation, as there would still be a need for other connected web presences to exist: social media, such as Twitter, to communicate with wider audiences; publication platforms to host all of the published research; tools to manage references and track scholarly attention. Other tools would also be needed to help conduct the research, analyse and present results and data, create infographics, take lab notes, collaborate on documents and create presentations. Then there is the issue of who would oversee such a huge database, manage it and ensure it is kept largely up to date. Of course, with something similar to Discogs and IMDb, anyone could enter and update the content, with proper accreditation, audit trails and moderation. Such a platform would be accessible to funders, charities and the public, with certain limitations on access to some content. Hindsight is a wonderful thing, and given how IMDb and Discogs have grown into such well-known and well-used platforms, it is a shame that the same did not happen in academia to help create such a central hub of knowledge and activity.
Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.
About the author
Andy Tattersall is an Information Specialist at the School of Health and Related Research (ScHARR) and writes, teaches and gives talks about digital academia, technology, scholarly communications, open research, web and information science, apps, altmetrics and social media - in particular, how these are applied to research, teaching, learning, knowledge management and collaboration. Andy is a member of the University of Sheffield’s Teaching Senate and a Senior Fellow of the Higher Education Academy. He was the person who sparked interest in running the first MOOCs at his institution in 2013. Andy is also Secretary for the Chartered Institute of Library and Information Professionals – Multi Media and Information Technology Committee. He has edited a book on Altmetrics for Facet Publishing which is aimed at researchers and librarians. He tweets @Andy_Tattersall and his ORCID iD is 0000-0002-2842-9576.
Creative Commons Licence
This work was originally published on the LSE Impact Blog and is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated.

Tuesday, 31 January 2017

Recent IR work - What Works Wellbeing Review on Housing and Wellbeing

Louise Preston (lead author), Anna Cantrell and Suzy Paisley, all from Information Resources, are authors of the recently published scoping review for the 'What Works Centre for Wellbeing'.

What Works Wellbeing Logo

As part of the Communities Evidence Programme, this ScHARR team, also including Professor John Brazier and Tessa Peasgood, looked at the evidence linking housing with wellbeing. We used a rapid scoping review methodology to look at what had already been published linking housing with wellbeing.

Our report has now been published online here, with a helpful Policy Briefing available here.

Policy Briefing Title Page

Following on from this, the team, along with Duncan Chambers from the Public Health section and Mark Clowes from IR, will be working on a systematic review looking at the evidence for the impact of housing interventions on the wellbeing of individuals and communities from vulnerable groups.

Andy Tattersall Takes over as MmIT Chair

Image of Andy Tattersall
Andy Tattersall
Information Resources' Andy Tattersall has taken over as Chair of the CILIP special interest group MmIT. MmIT aims to unite CILIP members engaged in, or interested in, multimedia information and technology developments in library and information science. The group is concerned with the organisation, delivery and exploration of information through modern media, including graphic forms, video and web-based applications. The Committee's remit is to support the 1500 members of the group by running regular events, a yearly conference and a quarterly journal. Andy has written about taking over as Chair of the committee and the group's plans for 2017 and beyond.

I want to say what an honour it is for me to formally take over as Chair of the CILIP special interest group for Multimedia and Information Technology (MmIT). Not only because of my passion for this area and the work this group does, but also to follow in the steps of my esteemed committee colleague and friend Leo Appleton. I cannot stress enough how Leo has been a very important part of MmIT over the past decade; he is now starting in an exciting new position as Director of Library Services at Goldsmiths. I want to publicly thank Leo for all of his energy and leadership in steering MmIT through many waters, sometimes choppy ones at that. He was a large reason for me joining the committee over six years ago and I am delighted that he is staying with us and taking on the challenge of taking our long-running journal into uncharted open access territory. Leo steps down leaving MmIT entering arguably its most positive and exciting period since I joined the committee.

As I step into the chair's role from that of secretary, many changes are afoot within our committee structure. Firstly, Catherine Dhanjal has stepped down from the committee as our journal editor. Catherine brought a tremendous amount of skills and contacts to the committee and ensured the very smooth running of the MmIT Journal over many years. She will be sorely missed by all of us who have served on the committee with her, and we wish her all the best as she continues to run her own successful consultancy. With Catherine stepping down, we were left with a decision as to the future of the journal. After much discussion we agreed that the journal should go fully open access on a quarterly basis. The content and quality of the journal will remain the same as it focuses on technology and libraries, and it will be edited by Leo.

I’m pleased to report that John Bottomley, who works for OCLC, will remain as our treasurer for the next year, giving us some degree of consistency on the committee. John has excelled in his honorary position and ensures that MmIT remains in a healthy financial state. Ruth Wilson from Edge Hill University has stepped into my old shoes as secretary, a role that I am certain will aid the smooth running of the committee and all of its ventures. I’m also very pleased that we have Nic Kerr from The University of Liverpool who has been invaluable in the smooth organising of our events, of which we plan many more over the next few years. Dia Mexi-Jones and Lizzie Sparrow continue to help guide and support the marketing and communication activities of the committee.
One of the best gifts Leo could give us before he stepped down as Chair was to be very proactive in recruiting new members to our committee. I am happy to report that we have five new committee members who bring together a superb collection of skills and insights that I am sure will drive MmIT forward. The addition of such experts in our field of work will no doubt make us more valuable to our 1500 members and everyone involved in the library and information sector. I’m pleased to announce the five new appointments.
Luke Burton - Digital Transformation Manager - Newcastle City Council
Antony Groves - Learning and Teaching Librarian - University of Sussex
Alison McNab - Academic Librarian - The University of Huddersfield
Virginia Power - Graduate Tutor - University of the West of England
Claire Nicholas-Walker - Electronic Resources Librarian - Lewisham Public Libraries

I have been aware of the work of some of the new committee members for some time and am very excited about the prospect of working with them to take MmIT to new audiences and deliver fresh ideas and content. In 2017 the group will launch many new initiatives that I am sure will be of interest to MmIT and CILIP members, as well as librarians and information and knowledge professionals across the UK. These changes will include the aforementioned new journal model, which everyone will be able to access without subscription. We will host our fifth national conference on the 14th September at The University of Sheffield on the topic of ‘Open’. We will be sharing details about conference submissions in the next few weeks, with themes ranging across open libraries, research, education, spaces and data, among other strands. We are also planning to deliver yearly half-day workshop events around the country, as well as a yearly webinar event. If any of this appeals to you, there are several ways you can keep up to date with MmIT. Firstly, by joining the group as a member: select it as one of your special interest groups if you are a CILIP member, or join us for a yearly fee of £40 without being a CILIP member. You can also follow our blog and Twitter accounts for regular updates.

I remember when I joined the committee in September 2010 there was much discussion about whether the group should continue. Given it had begun a few decades earlier with an original remit pre-dating the web, the committee questioned its relevance in a world that no longer worked in microfiche, video and CDs, or talked about ‘multimedia’. Back then I wondered why such a question should be asked, as more than ever there was a need for the profession to understand the ever-changing world of technology and media. I feel the committee does have a valuable remit, more important than ever given how technologies seep into every part of our personal and professional lives. There are a growing number of technologies and websites we can leverage for our organisations and our professional development. Our committee’s aim will be to explore as many of these as we can and share what we find with you. We will continue to work with external partners, experts, writers and speakers and help support the library and information community as we always have. Hopefully, through a new model, new committee members and new opportunities to impart knowledge, we will help support our community better than ever before.

You can find out more about the committee by going to the CILIP website.

Tuesday, 20 December 2016

Teaching Junior Doctors the Benefits and Barriers of the Social Web - Presentation from Social Media for Learning in Higher Education Conference

Image of Andy Tattersall
Andy Tattersall
Last week I attended the second Social Media for Learning in Higher Education Conference, held at Sheffield Hallam University. I'd been lucky enough to be accepted to speak at both conferences, with last year's talk covering my Research Hack videos. The conference is a lively and engaging event for anyone involved in teaching and learning with an interest in social media.

The day started with a 'key not' (not keynote), which was an introduction to the theme for this year's conference - The Empowered Learner?
Image of conference
Giving my presentation in my Christmas jumper
After that we were rushed off in several small teams to carry out our own hack projects, creating digital resources to support digital learners. Thankfully we had been given the full run of the new and impressive Charles Street Building, so we quickly located a desk with a large screen in one of the many open learning spaces. Along with four other team members, I set to work on creating an Adobe Spark video that explained how students and academics should search for online information in a 'Post-Truth' world. We had just short of an hour, but managed to create a three-minute video that explained the problem of poor quality information and how to critically appraise and avoid it when learning and conducting research. I'm pleased to say that our group was one of the winners, and as a result we each won a gold chocolate medal, which my daughter happily helped to eat as she shared in the celebrations.

My presentation was one of six Thunderstorm sessions and was about my Masterclass ILA which I have run for the last couple of years for fourth year medical students. The abstract is below.

Image of sketch note by Deb Baff
Sketchnote of my talk @debbaff
The purpose of this presentation will be to showcase the teaching I deliver to 4th year medical students at The University of Sheffield. The series of five two-hour classes, called Masterclass ILAs (Inquiry Based Learning Activity), focuses on the social and mobile web and how the students can gain a better understanding of it as junior doctors. The sessions are an opportunity for students to build upon their own experiences of social media in a personal and professional setting, and to consider how they can use these and other technologies to their advantage once they qualify as a medical professional. The sessions explore the problems large organisations such as the NHS have in staying up to date with technologies such as social media, and how they can mitigate the potential problems these can cause. The feedback from running these sessions has so far been excellent and more are planned for later this year.

The short set of slides from my talk can be viewed below.

Image of Conference delegates
One of the winning teams - me on the right
Delegates were encouraged to attend and wear their best Christmas jumpers, which I gladly did. Sadly, though, my DJ Santa top didn't sway the judges and I missed out on a second prize for the day; you win some and lose some. The conference was a great opportunity to see what colleagues from around the UK were doing with social media in Higher Education. I'm already looking forward to next year's conference, as it seems this popular event can only go from strength to strength as more educators discover the benefits of using social media as part of their teaching.

Thursday, 8 December 2016

Identifying evidence for economic models - notes from a workshop

Mark Clowes
Mark Clowes recently attended one of ScHARR's short courses, a one-day workshop entitled "Identification and review of evidence to inform cost-effectiveness models".

I've only been working in health technology assessment for 18 months, but the big difference from previous roles I've held is that I'm often working alongside colleagues from very different professional backgrounds. I hoped that this course would help me to understand their work better and see where my contribution fits in.

The participants came from a wide variety of backgrounds: technology assessment centres, universities, pharmaceutical companies and private consultancies.

Suzy Paisley explained how during her time at ScHARR she had progressed from searching for clearly focussed systematic review questions of clinical effectiveness to the infinitely more complicated universe of economic models.  Models typically involve a wide range of different parameters in an attempt to reflect the complexities of real life, and identifying evidence for them is therefore much less "black and white".  Rather than producing a single comprehensive search strategy to find all the evidence, a model is likely to draw on many different types of evidence from different perspectives; and while a systematic approach is still required, it would be impossible for an information specialist to find (or a modeller to use) ALL the evidence.  Instead, transparent judgments should be made about what is included or excluded; there is no perfect model, but a "good" model will be explicit about the choices and decisions and sources of evidence which have informed it.

Paul Tappenden gave us an introduction to modelling, beginning by quoting George Box's famous maxim that "all models are wrong... but some are useful".   He argued that any model should always begin with a conceptual stage, at which decisions are made about the disease logic model (considering the "natural history" of the disease and taking into account factors such as the likelihood of progression, different risk groups etc.); the service pathway model (the patient's journey through different stages of treatment - which may be subject to geographical variations) and the design-oriented model (what type of model will best address the decision problem?  This may also be influenced by the availability of evidence and the previous experience of the modeller).

A group exercise in which we attempted some conceptual modelling around a topic quickly made us realise the complexity involved: in order to calculate whether a fictional drug was cost effective, we would need a wealth of information - not only the obvious (evidence of its clinical effectiveness and cost), but information on its possible adverse effects to be weighed up against quality of life studies of patients living with the condition; information on resource use (the cost of administering comparator treatments / best supportive care); and mortality (indicating for how many years the treatment would likely have to be provided).

Some of these data are unlikely to be found in traditional trials, and so over lunchtime we were given worksheets to explore alternative sources (including disease registries, statistics and official publications, and our own ScHARR-HUD database for studies around health utilities).

The most challenging part of searching for evidence for economic models may be deciding when to stop. How much evidence is enough, and how comprehensive is it necessary to be when you may need to conduct multiple miniature reviews to answer one main question? I know from personal experience of critiquing the searches run to inform manufacturers' economic models submitted for NICE appraisal how contentious this topic can be, but in a recent paper for the journal PharmacoEconomics, Suzy has attempted to define a "minimum requirement" for this type of search.

The final session of the day came from Prof. Eva Kaltenthaler, who heads the Technology Assessment Group at ScHARR.   Eva helped us understand how reviewers make judgements about which identified studies to include.   Frequently there is a tension between researchers' desire to be thorough and comprehensive in their coverage, and the needs of the decision makers who commission the review for the results to be delivered in a short time-frame.  Where this is the case, rapid review methods may be called for.   This might mean prioritising certain selection criteria over others, although which are deemed most important will depend on the context.  In some cases the geographical setting of retrieved studies may determine how relevant they are; in others the study type, or the cohort size.

Overall this was a useful and thought-provoking day, although for any librarians/information specialists who thought they had already mastered comprehensive searching, there was some "troublesome knowledge" to take on board. As we work more closely alongside researchers we understand better that they don't want to be overwhelmed with mountains of evidence; they want to ensure all perspectives are covered, but to avoid wasting time on studies which do not make any difference to the final decision. How information specialists can best support this information need remains a challenging question. Will the boundaries become blurred between our role in finding information and that of reviewers in sifting and evaluating it? Are those of us without a previous background in medicine, economics or statistics (and let's be honest, very few of us are knowledgeable about all three) able to acquire sufficient skills in those disciplines to succeed in these shifting roles?

*NEWSFLASH* This course will be running again on 23rd March 2017 - find out more / book a place or see other short courses available from ScHARR.

Read Suzy Paisley's PharmacoEconomics article (2016): "Identification of Evidence for Key Parameters in Decision-Analytic Models of Cost Effectiveness: A Description of Sources and a Recommended Minimum Search Requirement"