The MP3 crisis (a crisis for the recording industry, that is) has been a crisis waiting to happen, and more fool the recording industry for not being ready. It is really just mirroring a crisis that the publishing industry has been in for several years now, and it is simply a function of increased network bandwidth. In case you don’t know, network bandwidth follows a law just like computing power follows Moore’s law, only better: it doubles (at constant cost) every 12 months, whereas Moore’s law says that computing power doubles every 18 months. So now that bandwidth has grown to allow it, the copying that started with text files a few years ago (try searching for the first line of The Hitchhiker’s Guide to the Galaxy and see how many copies you find: I just found 22) has extended to music files. And mark my words: in not too many years the film and video industry will be having just the same crisis the recording industry is having now.
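To make those two growth rates concrete, here is a rough back-of-the-envelope sketch in Python (the time spans are just illustrative assumptions): after six years, bandwidth at constant cost would be up 64-fold, while computing power would be up only 16-fold.

# A rough illustration of the two doubling periods mentioned above;
# the chosen time spans are assumptions, picked only to make the comparison concrete.
for years in (3, 6, 9):
    bandwidth = 2 ** (years * 12 / 12)    # doubles every 12 months
    computing = 2 ** (years * 12 / 18)    # doubles every 18 months
    print(f"after {years} years: bandwidth x{bandwidth:.0f}, computing power x{computing:.0f}")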
The publishing industry largely exists for one reason: infrastructure. They could afford an infrastructure that individuals couldn’t, and so individuals were willing to let the publishing industry print and distribute their work in exchange for a share (often a small share, I might say) of the profit. So the publishing industry is in a crisis because now all of a sudden individuals can afford infrastructure that allows them to publish.
“Aha!” say the publishers, “but we have a certain value-add: we guarantee quality!” But I believe that (peer) review is an artefact of the current publishing model, and not a reason for its continued existence: you can’t afford to publish everything that gets sent in, so you need to filter some of it out.
However, that filter is not a perfect one. Pre-publication review has structural problems that have been analysed before; let me just mention two. Zen and the Art of Motorcycle Maintenance by Robert Pirsig, which was a bestseller as soon as it came out, was rejected by one hundred and twenty-one publishers before it was accepted. And if you want a depressing read about scientific peer review, read J. Scott Armstrong, “Research on Scientific Journals: Implications for Editors and Authors”, Journal of Forecasting, Vol. 1, 83–104 (1982).
But even the scientific world doesn’t take peer review entirely seriously: academics are largely judged not by getting articles into print, but by how many other articles refer to theirs. And I suspect that in the future, evaluation and quality control will happen autonomously in a similar fashion.
If you don’t know it, Google is a web search site that ranks pages in an interesting manner: the number of external pages that point to a page gives the page an importance rank. If a page is pointed to by important pages, its own importance increases correspondingly.
This turns out to be an exceedingly good way of evaluating search hits. Google returns the matching pages with the highest importance first, and they are so confident that the first page returned is the one you will want that they even offer the option of going to the first in the list instead of displaying the list.
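As an illustration, here is a minimal sketch of that kind of link-based ranking, written in Python using the power-iteration method; the damping factor, iteration count, and the tiny three-page example are assumptions of mine for the sake of the example, not figures Google has published.

# A minimal, illustrative sketch of link-based ranking in the spirit of Google's
# approach: a page's importance is fed by the importance of the pages linking to it.
# The damping factor and example graph below are assumptions for illustration only.

def rank_pages(links, iterations=50, damping=0.85):
    """links: dict mapping each page to the list of pages it points to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}              # start with equal importance
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                         # a page with no links shares its rank equally
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share        # importance flows along each link
        rank = new_rank
    return rank

# Three pages: A and B both point to C, and C points back to A.
links = {"A": ["C"], "B": ["C"], "C": ["A"]}
for page, score in sorted(rank_pages(links).items(), key=lambda item: -item[1]):
    print(page, round(score, 3))

On that toy example the page pointed to by the other two comes out most important, and the page it points back to gains importance in turn, which is exactly the self-reinforcing behaviour described above.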
Now, you might object that there needs to be some way to start off the cycle: just posting something to the web will not guarantee that it gets found and evaluated, and therefore acquires a Google rank. However, I am not so pessimistic. To give you one data point, I wrote an article about Dutch spelling and simply put it on my home page: I didn’t tell anyone it was there. Now, when I use Google to search for “Dutch Spelling” (which, by the way, wasn’t even in the title), it comes in at number 5 out of 14,000 matches.
I believe that in some not-too-far future this will be the way of evaluating the quality of publications. In a sense, true peer review. And not only of textual publications, but of music (and video) too.
© Copyright Steven Pemberton, Amsterdam, 2000. All rights reserved.
First published in ACM/Interactions, July 2000