Depending on your subject area, there may be some obvious journals that you might submit to. However, as new journals are launched and others change in popularity and impact, it may be worth checking out the alternatives when you are thinking about submitting your next paper.
One of the deciding factors on which journal to publish in may be how likely your work is to be cited by others. You can check the previous citation patterns for journals by using bibliometric indicators. Some of these are available for free (Eigenfactor, CiteScore, Google Scholar) but others require a subscription (Journal Citation Reports). Remember that previous journal performance may not predict future citations, and that even in journals with very high impact, some papers are never cited.
The metrics below should provide useful information about the journals in your shortlist to help choose between them.
The impact factor gives a measure of the frequency with which the average article in a journal has been cited in a particular year. Access impact factors on the JCR database:
For more information on how the Journal Impact Factor is calculated, visit the training guides produced by Clarivate (link below).
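As a rough illustration of how the Journal Impact Factor works, the sketch below uses invented counts; the real figures come from the JCR database, and Clarivate's guides describe the exact rules for which items count as "citable".

```python
# Illustrative sketch of the Journal Impact Factor calculation.
# All numbers below are invented for illustration only.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """A journal's impact factor for year Y is the number of citations
    received in Y to items published in Y-1 and Y-2, divided by the
    number of citable items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# A journal whose items from the previous two years attracted 900
# citations this year, from 300 citable items, scores 3.0:
print(impact_factor(900, 300))  # 3.0
```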
The Scimago site allows you to view journal ranks by subject area. You can also filter for only open access journals if you would like to adopt an open research policy or if you need to do this to satisfy your funder's mandate.
You can use the Scopus Journal Analyzer tool to compare some important metrics such as CiteScore, SJR, SNIP, and the number of documents that are not cited, between the journals on your shortlist. The tool is available as part of the University of Reading's subscription to Scopus.
You can compare up to 10 different sources and then view the CiteScore, SJR and SNIP metrics for each journal as a graph or as a table. The % not cited tab shows the proportion of papers in each source that are never cited; it is best to submit to a journal that others read and cite regularly.
Look at the mix of item types in your selected journals using the %reviews tab. Some journals have a clear mandate to publish a lot of review articles and may have this in their title. However, some new and some less reputable journals may publish lots of review articles in order to artificially boost their citations and other metrics.
There is an option to export the data in chart or table form using the export button at the top of the page.
Example plot of SJR over time for a group of journals in the same subject area. The names of the journals have been redacted to preserve the confidentiality of the data.
The CiteScore metric was introduced by Elsevier in 2016. The CiteScore divides the number of citations a journal receives in a calendar year by the number of items it published in the previous three years. All item types are included in the denominator, whereas the Journal Impact Factor excludes some document types. The CiteScore window is also three years, rather than the two used by the Journal Impact Factor.
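The three-year calculation described above can be sketched as follows; the figures are made up for illustration, and the real values are published on Scopus.

```python
# Illustrative sketch of the CiteScore calculation (three-year window).
# The counts below are invented; real values come from Scopus.

def citescore(citations_in_year, items_prev_three_years):
    """Citations received in one calendar year to items published in the
    previous three years, divided by the number of those items.
    Unlike the Journal Impact Factor, ALL item types count in the
    denominator."""
    return citations_in_year / items_prev_three_years

# A journal that published 400 items over three years and received
# 1200 citations to them this year scores 3.0:
print(citescore(1200, 400))  # 3.0
```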
The source normalised impact per paper (SNIP) value is calculated by Leiden University's Centre for Science and Technology Studies (CWTS) and is based on Scopus data. The SNIP value measures the average citation impact of a journal's publications and corrects for the differences in citation patterns between fields (which the Journal Impact Factor does not take into account). This makes comparisons between fields easier.
Google Scholar provides a journal ranking based on the number of citations that papers receive. The rankings are based on the h5-index and the h5-median. This is a free ranking that is generated automatically by Google. The citations may not be as well curated as in other metrics databases. The advantage of Google Scholar is that it will have data for journals that are not included in other databases.
The h5-index is the largest number h such that h articles published in the journal in the last five complete years (2012 to 2016 for the 2017 rankings) have at least h citations each. This means that, in the example below, 25 papers in the Journal of Vertebrate Paleontology received at least 25 citations each between 2012 and 2016. For Nature, 366 articles received at least 366 citations each over the same period.
The h5-median for a publication is the median number of citations for the articles that make up its h5-index.
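The two definitions above can be sketched in code. The helper names and the citation counts below are invented for illustration; Google computes the real values from its own citation data.

```python
# Illustrative sketch of the h5-index and h5-median, assuming we know
# the citation count of every article a journal published in the last
# five complete years. The counts below are made up.
from statistics import median

def h5_index(citation_counts):
    """Largest h such that h articles have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def h5_median(citation_counts):
    """Median citation count of the articles making up the h5-index."""
    counts = sorted(citation_counts, reverse=True)
    h = h5_index(citation_counts)
    return median(counts[:h])

counts = [48, 33, 30, 12, 9, 8, 7, 2, 1, 0]
print(h5_index(counts))   # 7: seven articles have at least 7 citations each
print(h5_median(counts))  # 12: median of those seven articles' counts
```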
Rethinking impact factors: better ways to judge a journal. Wouters et al. (2019), Nature 569, 621-623. doi:10.1038/d41586-019-01643-3