Monday, November 15, 2010

Cite as you like - II

Staying on the topic of the previous post, I was thinking about how citations work in different fields of research.

The poor quality of citations in a field appears to be closely linked to how search-friendly the literature in that field is. If you have to put as much effort into collecting all the papers relevant to your work as you would into actually doing the work, you're unlikely to find many papers with excellent referencing in them. The problem with Google being your main search tool is that you can never claim to have the definitive collection of literature on any topic. For example, I've spent a good amount of time these past few months looking for literature on climate change in India. Every week I find something new in some obscure corner of the internet. What I do find usually lacks good references; the few references that do appear are either self-citations or incestuous in nature, and the papers end up containing more opinion and leaps of logic than real, hard science.

Who can blame them? No one really knows what the state of the art is, and the few who claim to are almost always liars and charlatans. 'Expert opinion' replaces good research. A collective big picture takes a back seat to an individual's myopic worldview.

As I've mentioned before, the bio and medical sciences have PubMed. The sciences and some of the social sciences have the ISI Web of Knowledge (not free), which is excellent for advanced searching and for checking citations. Then there are sites like arXiv and SciFinder, to name a few.

India-specific research has nothing. India-centric social sciences, even less. Compared to funding research, the money required for developing a good search tool is negligible. Consider the idea of a new repository website for India-specific research. Suppose all governmental and foreign funding agencies mandate that at least the abstract of every funded piece of work be put up on the search website, at least until it becomes a habit. We (hopefully) start building a one-stop shop for the field.
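To make the "negligible cost" point concrete, here is a minimal sketch of the core of such a search tool: a toy inverted index over abstracts, written in Python. The paper records and field names are made up for illustration; a real repository would need ranking, stemming, deduplication and a web front end, but the underlying machinery is roughly this simple.

# A toy inverted index over paper abstracts: tokenize each abstract,
# map every word to the set of papers containing it, and answer a
# query by intersecting those sets.
import re
from collections import defaultdict

# Hypothetical records; in practice these would come from the funding agencies.
papers = [
    {"id": 1, "title": "Monsoon variability and crop yields",
     "abstract": "We study rainfall trends and their effect on wheat yields."},
    {"id": 2, "title": "Glacier retreat in the Himalayas",
     "abstract": "Satellite data show accelerating glacier retreat since 1990."},
]

index = defaultdict(set)
for paper in papers:
    for word in re.findall(r"[a-z]+", paper["abstract"].lower()):
        index[word].add(paper["id"])

def search(query):
    """Return ids of papers whose abstracts contain every query word."""
    words = re.findall(r"[a-z]+", query.lower())
    if not words:
        return set()
    result = index.get(words[0], set()).copy()
    for word in words[1:]:
        result &= index.get(word, set())
    return result

print(search("glacier retreat"))  # -> {2}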

Following the train of thought is not hard: increase the ease of access to your work > more people read it > more people cite it > your work has higher impact > more people want to sleep with you. Flawless logic. Augment this approach with funds and a system to mine the interwebs and pull in a good fraction of the prior work as well, and we're in business.

Google Scholar is not quite the scholar's Google. Something else can be, though.

PS. I did a lousy job of citing my own sources in the last post. Mea culpa. Many thanks to KVM and Vatsa for sharing Lemire's posts on the peer review system and academic fraud on Google Reader.

PPS. I'm not very sure how things work in engineering. I remember hearing that a journal with an impact factor of around 3 in chemical engineering is usually a great one to publish in, while the good bioscience journals seem to have impact factors higher than 10. The research output by itself is far greater in the biosciences than in any field of engineering, but I wonder if a poor search system and dubious conference submissions are also factors in the lower citation counts, and hence the lower impact factors, in engineering.
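(For what it's worth, my understanding of the arithmetic: a journal's impact factor for a given year is the number of citations that year to the articles it published in the previous two years, divided by the number of articles it published in those two years. So a journal that published 200 articles over two years and picked up 600 citations to them this year sits at an impact factor of 3.)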
