Index Copernicus Has No Value

[Screenshot of the Index Copernicus website: "IC Value?"]

Index Copernicus is a Poland-based company that offers several services related to scholarly publishing. One of their products is a journal rating system, and I am concerned about the methods they use for ranking journals and the high proportion of predatory journals in their database.

Looking at the IC Journals Master List 2012, one sees the list sorted by “ICV,” or Index Copernicus Value, with the highest-scoring journals at the top. The second-highest-scoring journal is Acta Myologica, with a score of 53.44. This journal beat out Science, Nature, and many other top science journals.

Looking at the information page for this journal at the Index Copernicus website, one sees many errors. First, there is a dead image link. Second, the journal deals with myology, the study of muscles, yet IC misclassifies it as a humanities journal.

The page shows the journal’s earlier scores from 2005-2009, increasing from 5.56 to 7.18. (I have observed that Index Copernicus values always go up, never down). Missing are the values for 2010-2011. The 2012 value is an astounding 53.44, and there’s no explanation why the value has increased so much. Also, the page mislabels Acta Myologica as an English Title (it’s really Latin).

[Screenshot of the Index Copernicus journal page, showing many errors]

Numerous predatory journals have IC values, and they often display their IC value prominently on their websites. They do this because they want to attract article submissions, along with the article processing charges those submissions bring in. The value may look like an authentic metric to potential authors. It bears a resemblance to the impact factor, so naïve authors may accept it as such.

[Screenshot: predatory publishers brag about their IC values]

The IC Evaluation Methodology is here. It seems heavily weighted towards publishing practice rather than scientific impact.

I find the Index Copernicus value (IC Value) highly questionable. In fact, I’d say it’s a pretty worthless measure, especially given how many predatory publishers have established values, values that increase every year.

17 Responses to Index Copernicus Has No Value

  1. JATdS says:

    I could consider this index useful if the following were true: 1) Can this company PROVE how it checks every single aspect on their long list of criteria? 2) Do they rely on the publisher to provide all the metadata used to establish the score, or do they conduct their own independent analysis? The page that lists the metrics assessed to produce an ICV is actually more about superficial issues of marketing and visual, functional, or technical polish than about basic scientific principles. Thus, the following aspects are not taken into consideration, but should be: 1) Is there fair peer review? 2) Is it single- or double-blind peer review? 3) Is there editorial bias? 4) Does the company use spam e-mails to advertise the journal or to attract editors? 5) Are there ethics, retraction, and duplication policies? None of these basic scholarly aspects is factored into the ICV, so in that sense I agree with the blog post that this is more of a pseudo-metric reflecting marketing value rather than academic value. I have not analyzed the ICV for any of the journals listed there, but it does indeed seem strange that the 2012 value for Acta Myologica has increased 10-fold in just a few years. This suggests that the ICV is being gamed just as the impact factor is gamed. Some of the negative points also make no sense to me; for example, what is “unethical advertising”? Some advertising, such as Google ads or even the adverts on the sidebars of Elsevier’s sciencedirect.com, is not intrusive, can actually be useful to scientists, and, even if a little irritating, is a valid revenue stream for publishers, especially those using the green publishing model that does not charge authors to publish. So, I agree that a lot is suspect, strange, or even incorrect about the metrics used to calculate the ICV. The ICV should be ignored as much as the IF.

  2. Samir Hachani says:

    I think they use the word Copernicus because it is the name of the publisher of Atmospheric Chemistry and Physics, an open access journal of the European Geosciences Union. They think it lends more credibility to their so-called impact.

    • Stan Ustymenko says:

      I’m pretty sure they use the word Copernicus because it is the name of the famous Polish astronomer. Sometimes a cigar is just a cigar.
      The value of IC exists; it just doesn’t travel well beyond Poland, I guess. Polish academics have yearly reports to fill with publication lists. If a journal is not in WoS or Scopus, being in IC gives publications there a certain quantum of value for bean-counting purposes (in Poland, and to a lesser degree in nearby countries).

  3. Alex SL says:

    “(I have observed that Index Copernicus values always go up, never down)”

    If it is literally always, that is a bad sign, but it has to be said that the tendency for all metrics appears to be upwards, simply because people cite each other more and more every year, partly because they are told to. It is a rat race; see the slowly but steadily increasing impact factors of journals across the board, with the exception of the few that suffer under an incompetent chief editor.

    “Also, the page mislabels Acta Myologica as an English Title (it’s really Latin).”

    Really Latin? The name of the journal, maybe, but wouldn’t the articles in it be in English or something? Surely nobody publishes in Latin any more?

    This screenshot with “impact factor: IC value” is really sneaky…

    • dzetland says:

      I disagree. Indexing metrics can — and should — have winners and losers, to maintain backward relevance to earlier observations as well as provide a decent signal of progress (or failure).

      On a semi-related note, I wonder if IC is so corrupt that they will raise their “IF” for publishers who pay them…

      • Alex SL says:

        I think you misunderstand. I did not say that there should be no losers. More importantly, whether there should be losers is beside the point, as it is a simple empirical observation that the Thomson Reuters IF goes up over the years.

        No, not for all journals, and even for the successful ones not uniformly every year. But the decent mid- to high-level journals in my field all have higher IFs now than five years ago, probably simply because people cite each other more often. (Plus perhaps a bit of gaming of the system by authors and editors alike.)

      • Alex SL says:

        It has just occurred to me that there is another important reason why the metrics go up: more journals are added to the list that did not previously have an impact factor. And of course only citations in journals examined by Thomson count towards the IF (which is one of its major weaknesses).

        Example: Assume every paper in the Journal of Scholarship (JoS) gets cited 5x on average in the first two years after publication, and one of those citations is always in papers published in the journal Scholarly Letters (SL). In 2008, SL was not yet included in the IF list but all the other journals citing JoS articles were, so JoS gets an IF (2008) of 4.

        Then in, say, 2010 the editors of SL decided to get themselves an impact factor, and so now, in 2013, the citations of JoS articles in SL are counted by the people who calculate impact factors. Et voilà, JoS now has an IF (2013) of 5, although nothing about its quality or impact has changed; it was all due to a decision made by the editors of SL.

        So much for backward comparability.
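        The hypothetical JoS/SL arithmetic above can be sketched in a few lines. All journal names and citation counts here are the commenter's invented example, not real data:

```python
# Hypothetical example from the comment: average citations per JoS paper
# in the two-year window, split by whether the citing journal is indexed.
citations_per_paper = {"indexed_journals": 4, "SL": 1}

def impact_factor(sl_is_indexed: bool) -> int:
    """The IF counts only citations from journals that are in the index."""
    counted = citations_per_paper["indexed_journals"]
    if sl_is_indexed:
        counted += citations_per_paper["SL"]
    return counted

print(impact_factor(sl_is_indexed=False))  # 2008: SL not indexed -> 4
print(impact_factor(sl_is_indexed=True))   # 2013: SL indexed -> 5
```

        The journal's "impact" rises from 4 to 5 with no change in how often it is actually cited, only in which citing journals are counted.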

  4. Jeroen Bosman says:

    No, they use the name because they are Polish, just as the famous astronomer/mathematician was.

    I still give them the benefit of the doubt, although they should become much more transparent (just as Thomson Reuters should make the Journal Impact Factor more transparent). Singling out a record with strange data is not a strong way to underpin the bold claim that it has no value, nor is the fact that the list includes predatory journals. These facts should be the starting point of an inquiry, not the final proof that something is worthless. It is easy to find similarly questionable data in the JCR as well, but we are not going to denounce the whole JCR just because of that.

    My main objection to Index Copernicus is the image with that woman ;-) It looks cheap and seems selected to give the site a credible scientific feel, but I suspect it will rather attract the interest of people who like to judge things by status instead of thinking for themselves. But even that image does not make me want to ditch Index Copernicus, yet…

  5. Shalini Mehta says:

    But what about their practice of asking publishers to BUY their ICV logo to show off on their websites? I think they should look for alternative sources of revenue.

  6. […] Of course, this way of stigmatizing publishers may be perceived by the scholarly community as something good or bad (many voices say that Jeffrey Beall acts like the last righteous man, holding the world to account for its wrongs). Beall tries to flag every initiative that, in his opinion, violates basic standards of good conduct in science. He takes on not only publishers and journals, but also the companies that rate periodicals (last year he wrote an article that may interest Polish readers, titled Index Copernicus Has No Value). […]

  7. Maciej says:

    The Polish Ministry of Science and Higher Education has its own list of “trusted” and “recommended” journals. And it is also full of predatory journals. That is why Poland is lagging behind in scientific research.

  8. Aachenac says:

    As a Pole, I have some insight into the workings of IC from browsing their webpage and reading many related blog posts in Polish:
    The reason the ICVs only go up is simple: a significant part of the current ICV is something called a “transfer constant.” This means a journal gets extra points simply for having been previously evaluated by IC. Since this is a feedback loop, it is no wonder a journal only accumulates more and more points there.
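    If the “transfer constant” works as described, the resulting ratchet is easy to simulate. The 0.2 weight and base score below are invented purely for illustration; IC does not publish its actual weights:

```python
# Toy model of the "transfer constant": part of each year's score is a
# bonus awarded simply for having been evaluated the year before.
def next_score(previous_score: float, base_quality: float = 5.0) -> float:
    transfer_bonus = 0.2 * previous_score  # reward for prior evaluation
    return base_quality + transfer_bonus

score = 5.0
history = []
for year in range(5):
    score = next_score(score)
    history.append(round(score, 4))

print(history)  # rises every year even though base_quality never changes
```

    With these made-up numbers the score creeps upward each year toward a ceiling, despite the journal's underlying quality staying constant, which matches the observation that ICVs never go down.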

    An important part of the ICV is being indexed by T-R (especially being assigned an IF). This of course skews the ranking, as the IF itself is measured only from citations in journals indexed by T-R, so it by no means reflects the true value of a journal (which, in turn, is no proxy whatsoever for the value of a given paper).

    We received a rather unpleasant wave of spam from IC in late 2013/early 2014. They offered a “hastened evaluation” of your journal if you paid them some $150. Add to that the fact that their ranking serves as a basis for Poland's perverted system of point values for publications (updated yearly; there is no sign yet that the 2014 edition will be issued, which means Polish scientists have no way of knowing how well they publish in the eyes of the Polish Ministry of Science and Higher Education until a full year after the fact), and the situation is dire. Or rather, it has been dire for years now.

    The IC is far from perfect; the evaluation system based on it has one inadvertent disadvantage: a journal only needs to fulfill the FORMAL criteria to be evaluated. There is no way to actually check how these work in any given case, as the IC relies on content (a questionnaire) DECLARED by the editorial board. The formal criteria include things like an international editorial board, abstracts in English, native-speaker copy editing, regularity of publication, publication of the reviewers’ list at least once a year… (https://pbn.nauka.gov.pl/help/pl/journals-and-journal-questionnaires/how-to-submit-the-journal-questionnaire-in-2013); again, all of it DECLARED by the journal. The funny thing is, no lay person may submit a recommendation for evaluation, no scientist may do so, and not even members of the Evaluation Committee themselves may do so. Only the journal or its representatives are permitted to submit it for evaluation, regardless of how good a journal we are talking about (again, a valid point only if your journal has no IF yet). The Evaluation Committee has no legal ground to punish a journal, or remove it from the list, for wrongly declared values if the formal criteria are met, so this is a paradise for predatory publishers. Forget citations, forget open access, forget responsible publishing. THIS IS POLAND.

  9. Aachenac says:

    Also,
    Details of the spam e-mail from 2013, presented as “information”.
    http://ewaluacja.indexcopernicus.com/index-en.html

  10. Atai Rabby says:

    One interesting point I would like to add here: although Index Copernicus has a high PageRank (PR 7) and Alexa rank (243,389), when I inspected it with the Chrome SEO extension “RDS Bar,” I found that they are using an SEO trick to get a fake PR (check here: http://www.seomastering.com/fake-pagerank-checker.php). See RDS Bar here: https://chrome.google.com/webstore/detail/rds-bar-seo-pagerank-dmoz/jlipcaflaocihnmlhnhcfombgmmfglho?hl=en-US
    [When you inspect their link http://journals.indexcopernicus.com, RDS will simply flag the Alexa rank as fake!]
    For those who don’t know how a high PR can be achieved with an SEO trick: the website simply points to a high-PR website when visited by the Google bot, and the Google bot then treats that pointed-to website as the original site; it’s like black magic.

  11. […] Beall has written critically about this operation as well. With their calculation method, Acta Myologica, unknown to most, can apparently be ranked ahead of Science […]

  12. Aditi says:

    Reblogged this on Aditi Sahu and commented:
    Vital information for new authors…
