Truth and lies in academic publishing: distrust in research highlights the importance of education in critical thinking skills

‘Fake news’ is threatening public discourse and, by extension, undermining trust in academic research. But the current atmosphere of suspicion highlights the need for critical thinking and research evaluation skills. Siân Harris shares some of the things she spoke about at a recent panel discussion on ‘academic publishing in the era of fake news’ at the London Book Fair in March.

This article was first published in the e-Journal of CILIP’s Knowledge & Information Management Group.

Research is vital for innovation, governance, healthcare – ensuring people have the food they need, that medicines are developed appropriately and that policies are scrutinized properly. But, if we can’t trust the method by which research is disseminated, then we don’t know that we can trust the research itself.

Politics and public discourse are facing the challenge of pictures and statistics being misused or even fabricated on social media, leaving people unsure what to trust and blurring the boundaries between ‘spin’ and ‘fake news’. It is also a problem for trust in science. Part of this comes from viral pieces about vaccinations or climate change being given the same weight on social media as peer-reviewed journal articles. But the problem goes deeper; not even everything that looks like a journal can be trusted.

I’m talking here about what I call ‘dubious journals’ (sometimes referred to as ‘predatory’). These ‘journals’ are set up to capitalize on opportunities to make money online, often (but not always) from charging authors to publish their articles, but without the proper checks and processes we expect from academic journals. They may make false claims about metrics or say that they do editing and peer review when they don’t, for example.

Such journals can mean that poor research, unchecked, is given credibility that it doesn’t deserve. In addition, genuine, high-quality research from good researchers might be published in these journals; but because it appears in a dubious journal it can’t be trusted – and, by extension, the other research from that researcher may be mistrusted. Dubious publishing avenues also take money out of the system and away from where it could be meaningfully used in disseminating research – a significant problem, especially where money is tight.

Many journals that probably shouldn’t be trusted can superficially look very convincing. They may mimic the design approaches, claims and even the text of well-known and highly regarded journals. In contrast, not all journals have big budgets for design and marketing so there are many genuine journals, set up by small groups of scientists, that can be wrongly labelled as dubious because of websites that are less technically advanced or have less sophisticated design. This can particularly lead to general mistrust of journals from the Global South, which are more likely to be small, scholar-led titles.

Evaluating journals is a skill that needs to be learned

The skills to evaluate journals take time to develop. These skills are easier to acquire if you have the advantages and privileges that come from access to a wide range of trustworthy journals; being surrounded by well-published more senior academics to provide guidance; familiarity with journal publishing processes; and a good command of scientific English.

Researchers in the Global South, and especially early-career researchers, can be particularly vulnerable to dubious publishing practices, both as readers and authors. This is personally damaging to their careers. It also reinforces the biases that already exist in the global research communication system in favour of Western researchers, institutions and publications. And it can make it harder to use trustworthy research information in policy making.

I work for INASP, an international development organization dedicated to supporting individuals and institutions to produce, share and use research and knowledge. We work with journal editors and researchers in the Global South and see first-hand the challenges of navigating the complex landscape of scholarly journals, as an author, reader or editor of a journal from a low- or lower-middle-income country.

Supporting journal evaluation

Recognizing these challenges, INASP is pleased to be part of the team that set up and runs the Think. Check. Submit. initiative to help researchers identify trusted journals for their research. Think. Check. Submit. is developing a range of tools and practical resources to educate researchers, promote integrity, and build trust in credible research and publications. These resources include a checklist that is now available in nearly 40 languages.

This checklist is neither a whitelist (which lists ‘safe’ journals) nor a blacklist (which lists ‘unsafe’ ones). Both of those approaches have strengths and weaknesses, and many find them attractive because they are easy to use. However, both types of list essentially delegate journal evaluation to someone else, which fulfils a short-term need but is inevitably subjective and doesn’t build up researchers’ own evaluation skills in the long term.

In addition, journal evaluation is often a grey area. Publishing practices are only part of the picture; there are other criteria, such as relevance. Indeed, in a survey that Think. Check. Submit. ran towards the end of last year, respondents ranked both relevance and reputation higher than trustworthiness among their priorities when selecting a journal.

The Think. Check. Submit. checklist takes researchers through the steps they need to evaluate journals themselves. And, in response to the survey findings, we are looking at bringing more explanations and educational resources into the platform. In addition, all the partners in the initiative have different resources to help provide more information, including materials, online courses and mentoring from INASP’s AuthorAID project (www.authoraid.info).

Guiding authors and educating editors

An important part of the Think. Check. Submit. checklist is that it avoids focusing on the cosmetic aspects of journals and instead guides researchers to think more about the robustness of the processes and the integrity of the journals.

This coincides well with the thinking behind Journal Publishing Practices and Standards (JPPS, www.journalquality.info), a framework developed by African Journals Online (AJOL) and INASP and launched in 2017. AJOL and the several Journals Online platforms originally established by INASP in south Asia and Central America together host more than 900 journals published in their countries or regions. AJOL and INASP recognized that, although the platforms have always had criteria for inclusion, authors and readers still lacked familiarity with the journals. At the same time, we saw researchers voluntarily running journals in their spare time and being unfamiliar with the publishing processes and standards that are so well known to big publishing companies.

In response, JPPS provides a detailed assessment of publishing processes of all the journals on these platforms. Journals are awarded one of six badges – one, two or three stars, new journal, inactive or ‘working towards’ – depending on how many key publishing criteria are met. These badges are displayed on the journal’s pages on the Journals Online platforms to guide readers. Crucially, however, each assessment also comes with a report for the journal editors. This means that the process provides an educational tool for journals looking to improve their publishing processes and therefore help the research they publish to be disseminated better.

And this is key; the goal of all genuine journals is to share research, which can ultimately be used to enrich knowledge and improve our health, our societies and our planet. Helping researchers, policymakers and others to distinguish trustworthy research dissemination channels and the good research within them – and to avoid cosmetic, historical or geographical biases – is vital to ensure that academic publishing rises above the threat of ‘fake news’.

Cover photo by Chris Dobson

Siân Harris is a Communications Specialist at INASP
