Thursday, 17 May 2018

Andy Tattersall to deliver Keynote at the Business Librarians Association Conference

Information Specialist Andy Tattersall is one of three keynote speakers at this year's BLA Conference, taking place at Swansea University. The three-day conference runs from 27th-29th June with the theme 'Making Waves'. Andy will deliver his keynote on the 28th with a talk titled "Staying afloat in a sea of technological change".

The other two keynote speakers are Michael Draper, Associate Professor in Law, Swansea University, and Professor Sally Bradley, Academic Lead in Accreditation, Award and Recognition at the HEA and Professor at Sheffield Hallam University.
The conference web page and booking details can be viewed here.

Thursday, 19 April 2018

Andy Tattersall's talk on Altmetrics at The British Psychological Society Research Day

Andy Tattersall was invited to give a talk about Altmetrics at The British Psychological Society Research Day, held at the impressive Senate House Library in March. The recording of the talk and his participation in the final panel discussion can be viewed below. The slides are also at the end of this blog post.

Tuesday, 10 April 2018

Many a true word is spoken in jest, part two: more social media content that mocks, self-ridicules, and brings a smile to academia

Image of Andy Tattersall
Andy Tattersall
Two years ago, Andy Tattersall highlighted those Twitter accounts that offered some light relief from the often all-too-serious world of academia. This 2018 instalment includes an account "sadly" overlooked last time, as well as moving beyond the Twittersphere to share some of the best memes, videos, and more that provide sharp commentary on peer review, academic advisors, and altmetrics.

In April 2016 I wrote about the growing number of parody Twitter accounts that take the best and worst of academia and serve it up as a comedy dish. As the title suggests, many a true word is spoken in jest but we all know that just below the surface lie the real home truths of our industry. The problem, however, for many academics trying to be “witty”, is that they can fall flat on their face. I thought it would be good to visit some of the other tongue-in-cheek academic excursions that capture the weird and wonderful within academia.

When I wrote the first post it was solely focused on the Twitter community, and sadly neglected to include one of the scholarly Twitterati's most vocal protagonists - @ScientistTrump. When my post went live I was flattered to receive a tweet from Donald Trump, PhD calling my piece "biased" as it had not included him - he even concluded his tweet with one of the real Donald Trump's trademark sign-offs: "SAD". Thankfully the Trump obsession with fake news was not yet in full flow, but I am sure the post and the LSE Impact Blog would have been labelled as such. Whoever is behind this great account - and it is the greatest scientific Twitter account - has expanded to a full website and a forthcoming web store. Not wanting to inflate that already fully blown narcissistic ego any more, but the tweets are those of a very stable genius, reflecting the kind of communication that is typical of President Trump but with a scientific slant applied. For example, in December Donald Trump, PhD proudly reported:
His supporters will no doubt still be keen to see that wall built to ensure academic literature stays out of the public domain.

Given the daily communications coming out of the White House, it is not hard to satirise the 45th President of the United States. Putting an academic spin on The Donald is not so easy, but psychologist Matt Crawford made a good go of it with a fictional paper he published. The paper, titled "A title for a really great piece of research, just the best, really", is full of classic Trump boasts, so much so that you will hear Donald's voice inside your head as you read it.

Donald Trump’s tweets might make you feel outraged, but imagine how your social media stream would have looked with Hitler kicking and screaming across the web? Putting an academic slant on it, how would he have dealt with scientific peer review? Thankfully someone took that much-parodied scene in Hitler’s bunker from the film Downfall and re-subtitled it to show how Hitler would have responded to negative comments from the third reviewer. After a raging tirade, the Führer concedes that maybe he should just submit to one of those new “open access” journals.

Captioned images shared across the web, better known as memes, also offer much light-hearted humour that only those within academia will truly get. Some of the sharpest include the popular Boromir (Lord of the Rings) and Willy Wonka memes, alongside the tweets of Research Wahlberg and the "Hey Girl. I like the library too" blog.

A personal favourite comes from that most cosmic of sages, Yoda:

Some of the finest moments can be found by searching "academic memes" on Google Image Search or Pinterest.

Every institution has professors who are dapper in their fashion choices and those who look like they have crawled out of a hedge before heading into work. Prof or Hobo tests your ability to spot the professors from the tramps. I was made aware of the quiz by a professor in reference to one of his peers who proudly wears his dishevelled look as a badge of honour, actively trying his best to look like he lost a fight with a bear. The site features ten images and for each you simply have to choose whether the man in question is a professor or a hobo. Just remember that looks can be deceiving.

From professorial chairs to the kind you actually sit on to conduct your research: in case you wondered what happens to them after they are retired from duty, they appear on the Sad Chairs of Academia blog. Before they are dispatched to that great office in the sky, they are captured one last time for this most surreal of blogs. I'm waiting for the best images to be compiled into a 2019 calendar.

Metrics and social media are never far away from academic discussion, and both are valuable tools for communicating and gauging interest in a piece of research. The two are combined perfectly in the satirical Kardashian Index, where a scientist's citations are compared with their followers on Twitter. Of course, citations and Twitter followers are no true measure of a researcher's worth, but a high Kardashian Index score could indicate popularity over productivity. We are eagerly awaiting the Kanye West Index.
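The index itself is simple arithmetic. A minimal sketch, assuming the formula from the original satirical paper (the constants 43.3 and 0.32 are recalled from that paper, not given in this post, and the researcher's numbers are hypothetical):

```python
# Sketch of the Kardashian Index: actual Twitter followers divided by
# the followers "predicted" from citation count. Constants are an
# assumption based on the original satirical paper.

def expected_followers(citations: int) -> float:
    """Followers a researcher 'should' have, given their total citations."""
    return 43.3 * citations ** 0.32

def kardashian_index(followers: int, citations: int) -> float:
    """K well above 1 suggests a social media profile outstripping the scholarly record."""
    return followers / expected_followers(citations)

# Hypothetical researcher: 40,000 followers, 500 citations.
k = kardashian_index(followers=40_000, citations=500)
print(f"K-index: {k:.1f}")
```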

If you publish in the academic sphere, you will no doubt receive regular invitations to write for predatory journals. Whilst this issue becomes increasingly problematic, there are a few things you can do to tackle these charlatans whilst also having a bit of fun. One idea is to use the tool Re:Scam, which is part of the New Zealand online safety website Netsafe. This tool bounces replies back to scam emailers and keeps them tied up with computer-generated emails. Whether this will work with those actually sending out the phishing messages is hard to tell, but it's certainly worth a try in case any are bots. If that fails, you can do as I did (in my lunch break) after receiving several requests to publish in a dubious fisheries and agriculture journal: I sent them a PDF-formatted manuscript with the word "fish" repeated 6,000 times, with a few fishy references to Jacques Cousteau and Michael Fish thrown in, in addition to a table of different fish. For some reason they did not accept. Nor did they ever contact me again. Funny that.

The article originally appeared on the LSE Impact of Social Sciences Blog

Wednesday, 28 March 2018

Calling Australia!

Members of the Information Resources team recently hosted an online course for librarians based in Australia. Led by Anthea Sutton, the FOLIO programme has been delivering web-based CPD courses to library and information professionals for over a decade.

Recently, FOLIOz (see what we did there?) has been partnering with ALIA, the Australian Library and Information Association, to offer bespoke training catering for the needs identified by its members.

For the latest course, on Evidence-Based Library and Information Practice (EBLIP for short), Anthea was joined by a small team including Andrew Booth, Helen Buckley Woods and Mark Clowes to design and deliver the course content (which included video lectures, readings and assessed coursework), as well as to facilitate the group discussion boards and host two live webinars (a particular challenge given the time difference between ourselves in the UK and our participants "down under"). We were also delighted to welcome Professor Alison Brettle (from Salford University) to deliver a guest lecture on the future of EBLIP.

The course attracted participants from a range of sectors, including education and public libraries as well as from health - all keen to apply an evidence-based approach to solving problems and achieving best practice in the settings of their different services.

As one delegate commented: "This course is right on point as far as the skills I need to develop so our unit can reach its goals."

If you are interested in discussing how FOLIO could help with the training needs of your library/information team, please get in touch with us at

Wednesday, 21 March 2018

What Can Altmetric.com Tell Us About Policy Citations of Research? An Analysis of Altmetric.com Data for Research Articles from the University of Sheffield

Image of Andy Tattersall
Andy Tattersall
Image of Chris Carroll
Chris Carroll              
Andy Tattersall (ScHARR Information Resources) and Dr Chris Carroll (ScHARR Health Economics and Decision Science) have published a new paper in Frontiers in Research Metrics and Analytics. The paper looked at published University of Sheffield research and what the data says about the impact of its research on national and international policy. The percentage of outputs with at least one policy mention compares favourably with previous studies, while huge variations were found between the time of publication and the time of the first policy citation. However, some problems with the quality of the data were identified, highlighting the need for careful scrutiny and corroboration.
Altmetrics offers all kinds of insights into how a piece of research has been communicated and cited. In 2014 Altmetric.com added policy document tracking to its sources of attention, offering another valuable insight into how research outputs are used post-publication. At the University of Sheffield we thought it would be useful to explore the data for policy document citations to see what impact our work is having on national and international policy.

We analysed all published research from authors at the University of Sheffield indexed in the Altmetric.com database: a total of 96,550 research outputs, of which we were able to identify 1,463 pieces of published research cited between one and 13 times in policy. This represented 0.65% of our research outputs. Of these 1,463 artefacts, 21 were cited in five or more policy documents, with the vast majority - 1,185 documents - having been cited just once. Our sample compared very well with previous studies by Haunschild and Bornmann, who looked at papers indexed in Web of Science and found 0.5% were cited in policy, and Bornmann, Haunschild and Marx, who found 1.2% of climate change research publications had at least one policy mention. From our sample we found 92 research articles cited in three or more policy documents. Of those 92, medicine, dentistry, and health had the greatest policy impact, followed by social science and pure science.
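The shape of that distribution can be reproduced with a couple of lines of arithmetic (the figures are taken directly from the counts above):

```python
# Distribution of policy citations across the cited outputs,
# using the figures reported in the post.
cited = 1_463        # outputs with at least one policy citation
cited_once = 1_185   # cited in exactly one policy document
cited_5_plus = 21    # cited in five or more policy documents

print(f"cited once: {cited_once / cited:.0%}")        # the "vast majority"
print(f"cited 5+ times: {cited_5_plus / cited:.1%}")  # a very small head
```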

We also wanted to explore whether research published by the University of Sheffield had a limited time span between publication and policy citation. We looked at the time lag and found it ranged from just three months to 31 years. This highlighted a long tail of publications influencing policy, something we would have struggled to identify prior to Altmetric.com without manual trawling. The earliest piece of research from our sample to be cited in policy was published in 1979 and did not receive its first policy citation until 2010. We manually checked the records, as we found many publications dated pre-1979 had actually been published much later, often this century. This is likely due to misreported data in the institutional dataset giving a false date, highlighting the need to manually check such records for authenticity. The shortest time between research publication and policy citation was a mere three months: a paper published in November 2016 was first cited in National Institute for Health and Care Excellence (NICE) policy in January 2017.
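A lag calculation of this kind, with a sanity check for the misdated records just described, might look like the following sketch (the two records are the examples from the post; the exact dates within each month are illustrative):

```python
from datetime import date

# Illustrative records: (publication date, first policy citation date).
records = [
    (date(1979, 1, 1), date(2010, 1, 1)),   # long-tail example: 31 years
    (date(2016, 11, 1), date(2017, 1, 1)),  # shortest lag: three months
]

for published, first_cited in records:
    lag_days = (first_cited - published).days
    # A negative lag means the citation predates publication: a sign of
    # misreported dates in the institutional dataset, as described above.
    if lag_days < 0:
        print("suspect record: citation predates publication")
    else:
        print(f"lag: {lag_days / 365.25:.1f} years")
```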

The Altmetric.com reports are only as good as the data they analyse, and our research did uncover some errors. Looking at those 21 papers with more than five policy document citations, we found seven were not fit for inclusion. One such example was identified when we discovered research papers had been attributed to the University of Sheffield when the authors were not, in fact, affiliated to the university. As this data is sourced from our research publications system, we assume this was a mistake made by the author; this can happen when authors incorrectly accept as their own papers suggested to them by the system. While this was almost certainly a genuine error, and may have been rectified later, the system had not yet updated to take account of such corrections. Another of these papers was mistakenly attributed to an author who had no direct involvement in the paper but who was part of a related wider research project. Another of the publications was excluded because it had not, in fact, been cited in the relevant policy document. One of the papers that was included belonged to an author who was not at Sheffield at the time of publication but who has since joined the institution. This showed that Altmetric.com's regular updates were able to discover updated institutional information and realign authors with their current employer.

The two most cited papers came from our own department, the School of Health and Related Research (ScHARR), in the field of health economics. Only two of the 14 most cited publications were in a field other than health economics or pure economics, both of which were in environmental studies. In total, the 14 most cited research outputs were cited by 175 policy documents, but we identified 9% (16) of these as duplicates. Of those 175 citations, we found that 61% (107) were national, i.e. from the UK, and 39% (68) were international, i.e. from countries other than the UK or from international bodies such as the United Nations or World Health Organization.

Altmetric.com continues to add further policy sources to its database to trawl for citations. As a result, it should follow that our sample of 1,463 research outputs will grow, not only with fresh policy citations but also as older research citations are identified through new policy sources of attention. This work also highlights the importance of research outputs having unique identifiers so they can be tracked through altmetric platforms; it is certain that more of our research will be cited in policy, but if no unique identifier is attached, especially to older outputs, it is unlikely the system will pick it up.

Altmetric.com is a very useful indicator of interest in and influence of research within global policy. Yet there are clearly problems with the quality of the data and how it is attributed. We found one third of our sample of the 21 most cited research outputs had been erroneously attributed to an institution or author. Whether this is representative of the whole dataset, only further studies will find out. Therefore it is essential that any future explorations of research outputs and policy document citations be double-checked and not taken at face value.
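The deduplication and national/international split described above can be sketched in a few lines (the citation records here are illustrative stand-ins, not entries from the actual dataset):

```python
# Classifying policy citations as national (UK) or international
# after removing exact duplicate records, as in the analysis above.
# Each record is an illustrative (source, country-code) pair.
citations = [
    ("NICE guideline NG28", "GB"),
    ("NICE guideline NG28", "GB"),      # duplicate record
    ("WHO technical report", "INT"),
    ("Department of Health review", "GB"),
]

unique = list(dict.fromkeys(citations))  # drops duplicates, keeps order
national = [c for c in unique if c[1] == "GB"]
international = [c for c in unique if c[1] != "GB"]

print(f"{len(unique)} unique citations: "
      f"{len(national)} national, {len(international)} international")
```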

This blog post is based on the authors’ article, “What Can Altmetric.com Tell Us About Policy Citations of Research? An Analysis of Altmetric.com Data for Research Articles from the University of Sheffield”, published in Frontiers in Research Metrics and Analytics (DOI: 10.3389/frma.2017.00009).

The blog post was originally written for the LSE Impact Blog and is licensed under a Creative Commons Attribution 3.0 Unported License unless otherwise stated. The original article appears here 
Creative Commons Licence

Monday, 8 January 2018

New research must be better reported, the future of society depends on it

Understanding how and why things happen can help people make sense of the world. Pexels
Andy Tattersall, University of Sheffield

Newspaper articles, TV appearances and radio slots are increasingly important ways for academics to communicate their research to wider audiences, whether that be the latest health research findings or discoveries from the deepest, darkest parts of the universe.
The internet also helps facilitate these channels of communication, as well as discussions between academics, funders and publishers, citizen scientists and the general public.
Yet all too often research-led stories start with “researchers have found”, with little mention of their names, institution and who funded their work. The problem is that reporting new research in this way does nothing to break down the stereotypical image of the ivory tower. For all readers know, these “researchers” might as well be wearing white lab coats with the word “boffin” on their name badges.

Rolling news

News is now a 24-hour operation. Rolling coverage of stories means journalists have their work cut out in maintaining this cycle. But that is no excuse for missing out important pieces of information that underpin a story.
Take, for example, a story relating to health research that has wide-ranging societal impact. Supporting evidence, links and named academics help a story’s authenticity and credibility. And at a time when “fake news” is an increasingly sticky problem, it becomes essential to link to the actual research and therefore the facts.

Accurate reporting, it’s not rocket science. Pexels

This is important, because research goes through a peer review process where experts in the same field of research critically assess the work before it can be published. This is similar to news stories that are edited to ensure they are of good quality – although this process takes far less time.

Accurate reporting

In academia there has been a huge move to make research openly available and therefore accessible for the whole of society. While research institutions are making great strides in public engagement and the wider understanding of science, media organisations still remain instrumental in that process.
And while it’s been claimed that the public are tired of experts, the impact they have on society – from building skyscrapers to keeping us alive – is undoubtedly fundamental to our existence.

Science and technology have changed the way we work, communicate, and view the world. Shutterstock

But poor or incomplete reporting undermines respect for experts by misrepresenting the research, especially by trivialising or sensationalising it. So while academics from various disciplines are often willing to talk to the media – either as an author or from an independent expert viewpoint – misreporting of research and particularly data (whether intentional or unintentional) has a negative effect.
Academics are then vilified as having something to hide or accused of making up their research, while members of the public are exposed to unnecessary anxiety and stress by inappropriate headlines and cherry picked statistics that are reported in a biased way.

The public good

Of course, not everyone will want to check the citations and research outputs – and not everyone has the critical skills to assess a piece of specialised academic writing. Yet there are lots of people who, given the opportunity, would be interested in reading more about a research topic.
Media coverage opens up a democratic debate, allows people to explore the works of an accomplished researcher and helps the public understanding of science. And in this way, fair and accurate reporting of research encourages academics to be willing to work with the media more regularly and build good working relationships.
Not only that, but the proper and accurate communication of science is beneficial to the whole of society – from the government to its citizens. So in the age of “fake news” it is more important than ever to make sure that what’s being published is the truth, the whole truth and nothing but the truth.

Andy Tattersall, Information Specialist, University of Sheffield

This article was originally published on The Conversation. Read the original article.

Friday, 15 December 2017

Start Now and Make 2018 The Year of Hassle-Free Organisation

Image of Sheldon Korpet
Sheldon Korpet
Information Officer
I often want to try something new at work to see if I can improve on my previous efforts. However, the more routine (required) demands always compete for my time too 😒 Sound familiar?

Whether you’re a super busy student or a new professional, keep reading to learn a new way to organise your work tasks and make focused progress 👍

You Can't Do Everything at Once

At the start of the year I ran a “goodie bag promo” project, and as a result our small library is getting more footfall, more enquiries and more complex questions, which is really nice to see. However, it does mean further time constraints.

I got sick of 'To Do' lists: mine always looked messy, or I lost them, or I had seven going at once, so I inevitably forgot the things they were meant to remind me to do! 🙅📝

One of the things that has enabled me to up my capacity, without forgetting any important things, was starting to use a Personal Kanban board.

What is Kanban?

I first heard of this method through my Business Management degree. It's a system which aims to keep tasks moving through the workflow, and I’ve adapted it slightly to fit with recurring, never-ending tasks.

This is All Fabulous But How Does It Work?

Tasks are assigned to four categories and, as you progress, you can move them closer to completion:
1. Could do/should do – this is where you store ideas, assigned work tasks, upcoming projects or things you’re putting off. You haven’t started these yet but you might in the future.
2. Doing – these are your current tasks for the day/week. Do not put more than three things in here, to stay productive! You could also assign yourself a deadline for these tasks.
3. Ongoing – this is where I store all those never-ending tasks, like asking people to renew their books. For a student this might be "Weekly reading for HAR679". I can move it into “Doing” so I know what I’m focusing on, and this is also an area to store projects which you've had to put on hold while waiting on a response from someone else.
4. Done – this is without a doubt the best bit of the board for me. When it’s blank it motivates me to work hard and complete something so I can start to fill it. When it’s full I can bask in my own glory 👸
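For anyone preferring a digital version, the four-column board above can be modelled in a few lines. A minimal sketch (the task names are examples, and the three-task limit on “Doing” is enforced as described):

```python
# A minimal personal Kanban board: four columns, with tasks moving
# between them as they progress. Task names are illustrative.
board = {
    "could_do": ["Plan next goodie bag promo"],
    "doing": [],        # no more than three tasks here at once
    "ongoing": ["Weekly reading for HAR679"],
    "done": [],
}

def move(task: str, src: str, dst: str) -> None:
    """Move a task between columns, enforcing the 'Doing' limit."""
    if dst == "doing" and len(board["doing"]) >= 3:
        raise ValueError("No more than three tasks in 'Doing'!")
    board[src].remove(task)
    board[dst].append(task)

move("Weekly reading for HAR679", "ongoing", "doing")
# ...the reading gets done...
move("Weekly reading for HAR679", "doing", "done")
```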

For this project you're going to need a template and some small sticky notes.

Personal Kanban in the Library

As you can see, I’ve gone for an A4 piece of paper with sticky notes, but you could use larger paper or create a digital version in Trello or Padlet, which would let you access it anywhere. However, I leave this at my enquiry desk and I get some level of satisfaction from physically moving the post-its.

Either way, it’s a great method to track your progress and hold yourself accountable to get projects finished in good time; instead of taking on about ten things at once:

☑️ Stay focused 
☑️ Make progress 
☑️ Reduce the risk of non-completion

Having project ideas or tasks recorded in “Could do/should do” without rushing into them also has the added benefit of giving you time to reflect. This might be on the best way to go about them, or it might help you realise whether it’s even necessary to spend your time on them at all.

You Can Do It

The great thing about this method is that it’s cheap and easy. There’s nothing worse than procrastinating and wasting time getting organised – you can make your own template in a few minutes or download the one I made here.

If you give it a shot, I’d love to know! Feel free to tweet me a photo or let me know if you found it useful @SheldonKorpet