Open Access 2: Predatory Publishers

What are predatory publishers?

  • Publishers that exploit the author-pays OA publishing model by charging publication fees while maintaining low editorial and academic standards.
  • These predatory titles may mimic the names of legitimate journals.
  • These predatory titles may use fake Impact Factors to attract authors.
  • They publish quickly, as long as authors pay the publishing fee.

The Rise of Predatory Publishing: A Timeline (generated using MyLens)

Some real examples:

  • In 2005, a paper containing duplicated seven-word phrases was accepted, and the authors were asked to pay a publishing fee of $150, illustrating the lack of peer review in predatory journals. Click to view more
  • In 2016, an author who unknowingly submitted a manuscript to a predatory journal struggled with article withdrawal and the potential threat of duplicate publication. Click to view more
  • Between January and August 2013, journalist John Bohannon submitted a fake paper with obvious scientific flaws to 304 fee-charging open access publishers; 60% of those journals accepted the paper. This demonstrated the prevalence of deceptive publishers. Click to view more
  • In 2022, an undergraduate submitted his paper to a potentially predatory journal and was asked for a withdrawal fee when he later tried to withdraw it. Click to view more
  • In 2023, Zhejiang Gongshang University in China announced that it would no longer include articles published in Hindawi, MDPI, and Frontiers journals when evaluating researcher performance. Click to view more
  • In 2023, Malaysia's Ministry of Education announced that it would no longer pay article processing charges for researchers at institutions in the country who publish in journals run by three publishers - MDPI, Frontiers, and Hindawi. Click to view more

To avoid leaving a poor record on your CV and to protect your reputation throughout your academic career, you are advised NOT to publish with predatory publishers. The following tips may help:

Warning Signs

When submitting your papers, do pay attention to the following warning signs:

  • Accepts articles too quickly, usually without critical review or quality control
  • Short peer-review process without constructive comments
  • Rapid publication
  • Actively sends unsolicited emails inviting article submissions, participation in editorial boards, or conference attendance
  • Presents a fake impact factor or metrics
  • Unreasonably large number of special issues
  • High publication fees, or fee information that is unclear or hard to find
  • Unclear procedures for handling manuscripts and journal workflows
  • No withdrawal/retraction policy
  • The journal website contains grammar and spelling errors
  • ISSN cannot be verified in the ISSN Portal
  • DOIs cannot be resolved (see the sketch after this list for quick checks of both)
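
The last two checks can be partly automated. Below is a minimal sketch in Python: the ISSN check-digit rule (weights 8 to 2, modulus 11) is standard, while the DOI lookup assumes the doi.org resolver's REST endpoint (https://doi.org/api/handles/). A DOI that cannot be resolved there is a warning sign, though a network check alone is not conclusive.

    import json
    import urllib.error
    import urllib.parse
    import urllib.request

    def issn_check_digit_ok(issn: str) -> bool:
        """Validate the ISSN check digit (weights 8..2, modulus 11, 'X' = 10)."""
        chars = issn.replace("-", "").upper()
        if len(chars) != 8 or not chars[:7].isdigit():
            return False
        total = sum(int(d) * w for d, w in zip(chars[:7], range(8, 1, -1)))
        expected = (11 - total % 11) % 11
        return chars[7] == ("X" if expected == 10 else str(expected))

    def doi_is_registered(doi: str) -> bool:
        """Ask the doi.org handle resolver whether a DOI exists (HTTP 404 = unregistered)."""
        url = "https://doi.org/api/handles/" + urllib.parse.quote(doi)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return json.load(resp).get("responseCode") == 1
        except urllib.error.HTTPError:
            return False

    print(issn_check_digit_ok("0317-8471"))   # True: valid check digit
    print(doi_is_registered("10.1000/182"))   # True: the DOI Handbook's own DOI
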
Suggested actions

Listed here are some suggested actions to protect yourself from predatory publishers:

  • Always cross-check the sources of information - e.g. double-check with indexing databases (Scopus, Web of Science, or DOAJ) or Journal Impact Factor platforms to verify information presented on the journal's website (see the sketch after this list).
  • Stay vigilant when you receive email invitations to submit papers - legitimate journals seldom send authors unsolicited invitations to submit.
  • Scrutinize the journal's website and evaluate its overall appearance and design. Predatory journals often have poorly designed websites with grammatical errors, spelling mistakes, and low-quality graphics.
  • Look for clear and transparent information about the journal's editorial board; check if the listed scholars or experts have legitimate affiliations and are actively involved in reputable research.
  • Browse and evaluate the quality of previous publications to get a general idea of the journal's quality.
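
To make the first cross-check concrete, here is a minimal sketch that queries two independent sources by ISSN instead of trusting the journal's own website. It is illustrative only: it assumes the public Crossref REST API (https://api.crossref.org/journals/{ISSN}) and the DOAJ search API path shown below, and the example ISSN is just a placeholder. Absence from one index is not conclusive, but a journal missing from every independent index deserves extra scrutiny.

    import json
    import urllib.error
    import urllib.request

    def crossref_journal(issn: str):
        """Look the ISSN up in Crossref; return journal metadata or None."""
        try:
            url = f"https://api.crossref.org/journals/{issn}"
            with urllib.request.urlopen(url, timeout=10) as resp:
                return json.load(resp)["message"]
        except urllib.error.HTTPError:
            return None  # Crossref does not know this ISSN

    def doaj_listed(issn: str) -> bool:
        """Check whether DOAJ's public search API returns the journal (path assumed)."""
        url = f"https://doaj.org/api/search/journals/issn:{issn}"
        with urllib.request.urlopen(url, timeout=10) as resp:
            return json.load(resp).get("total", 0) > 0

    issn = "2167-8359"  # placeholder ISSN - substitute the journal you are checking
    record = crossref_journal(issn)
    print(record["title"] if record else "Not found in Crossref")
    print("Listed in DOAJ" if doaj_listed(issn) else "Not listed in DOAJ")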

Note that there is no single, straightforward way to identify a predatory title. Fake journals can make their way into reputable databases, and sometimes journal websites can be hijacked by predatory publishers. It is therefore important for authors to stay alert and look at multiple factors when evaluating the integrity and practices of journals.

Common journal evaluation platforms

Listed here are some common journal evaluation platforms that may help in evaluating journals' performance:

Journal Citation Reports (JCR) is a resource provided by Clarivate that offers a comprehensive analysis of journal performance and citation metrics. It provides information and metrics based on the citations (data from Web of Science) received by journals and their articles. The most well-known metric provided by JCR is the Impact Factor (IF). The Impact Factor measures the average number of citations received in a given year by the articles a journal published in the two preceding years. It is widely used as a measure of a journal's influence and prestige within its field.
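
For example, a journal's 2024 Impact Factor is the number of citations received in 2024 by items it published in 2022 and 2023, divided by the number of citable items (articles and reviews) it published in those two years; with illustrative figures, 600 citations to 200 citable items gives an Impact Factor of 600 / 200 = 3.0.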

Researchers can make use of JCR as a tool for evaluating journal quality and making informed publication decisions.

Scopus is an abstract and citation database provided by Elsevier. It covers a wide range of academic disciplines and includes a vast collection of scholarly literature, including research articles, conference papers, books, and more. Scopus offers several features that can help evaluate journals' performance, such as citation metrics for journals, including the total number of citations received by a journal, the average number of citations per article, and the h-index of the journal. There are also some other journal-level metrics, like the CiteScore, SJR (SCImago Journal Rank), and SNIP (Source Normalized Impact per Paper).
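
As an illustration of one of these metrics: under the current CiteScore methodology (a four-year window), CiteScore 2023 counts the citations received in 2020-2023 by documents published in 2020-2023 and divides by the number of those documents, so, with illustrative figures, 1,000 citations to 250 documents would give a CiteScore of 4.0.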

By utilizing the various features and metrics provided by Scopus, researchers can gain insights into the performance of journals, which aids in making informed decisions about publication venues.

Web of Science is an abstract and citation database provided by Clarivate. It indexes a vast collection of scholarly literature, including research articles, conference proceedings, books, and more. Web of Science offers several features that can help evaluate journals' performance, such as citation metrics and Impact Factor (IF) metric for journals, journal rankings, highly cited papers associated with a journal, etc.

By utilizing the various features and metrics provided by Web of Science, researchers can gain insights into the performance of journals, which aids in making informed decisions about publication venues.

Directory of Open Access Journals (DOAJ) is a community-curated online directory that indexes and provides access to high-quality, open access, peer-reviewed journals. DOAJ has a strict set of inclusion criteria to keep predatory journals out (it also publishes lists of journals added and removed). Journals must undergo a rigorous evaluation process, which includes assessment of peer review practices, editorial policies, ethical standards, and overall academic quality. Inclusion in DOAJ indicates that a journal meets these criteria.

By utilizing DOAJ, researchers and evaluators can identify reputable open access journals and assess their performance based on quality criteria, transparency, and adherence to open access principles.

The Think.Check.Submit website is an international endeavor aimed at helping researchers identify trustworthy and reputable journals and publishers for publishing their research.

It provides researchers with a set of guidelines and tools to evaluate journals or publishers before submitting their manuscripts. It serves as a resource to assist researchers in making informed decisions and avoiding predatory or low-quality journals that may compromise the integrity and visibility of their work.

Blacklists of predatory titles

Listed here are some reputable blacklists that help researchers identify predatory publishers or problematic journals:

Predatory Journals is an organization made up of volunteer researchers who have been harmed by predatory publishers and want to help researchers identify trusted journals and publishers for their research. Its aim is to educate researchers and students, promote integrity, and build trust in scientific research and publications.

Note that it is an anonymously hosted website; you are advised to judge the published information and make a sensible decision for yourself.

Early Warning Journal List (《国际期刊预警名单》) is a warning list issued annually since 2020 by the National Science Library, Chinese Academy of Sciences. Its purpose is to remind researchers to carefully select platforms for publishing their results and to prompt publishers to strengthen journal quality management.

Beall's List was a widely referenced online resource created by librarian Jeffrey Beall. It aimed to identify and list potentially predatory publishers and standalone journals that exhibited deceptive or exploitative practices in scholarly publishing. The list gained significant attention within the academic community as a tool for identifying potentially predatory publishers and journals. However, it was also subject to criticism and controversy, with concerns raised about the inclusion of legitimate journals and potential biases in the listing process.

In 2017, Jeffrey Beall removed the list from his website, citing personal and professional reasons. Since then, Beall's List has been discontinued, and researchers now rely on alternative resources and initiatives to identify and evaluate predatory publishing practices, such as the Cabell's Blacklist and the Directory of Open Access Journals (DOAJ).

Here is an archived version of Beall's List, retrieved from the cached copy of 15 January 2017 and maintained by an anonymous researcher: https://beallslist.net/


Publish and Perish? How to Spot a Predatory Publisher (by the Office of Scholarly Communication, University of Cambridge)