The search engines must be the brains of the Internet, just as the browsers must be its heart. Without searching, the Internet would have no future.
I believe Yahoo deserves the award of Internet search pioneer. They put together a list of web pages organized by category. That was a start: the Net surfers had places to go. Yahoo was called a directory or portal. The portals are man-made, and the directories are still useful, but they have serious limitations. First, it is impossible to categorize all web pages, given the extraordinarily fast pace of web changes. There are billions of web pages, and counting! No group of human editors can keep up with the immensity of the Internet. Another serious limitation of the portals is bias. Selection and inclusion in a directory are very subjective decisions, made much worse by the pay-for-inclusion models. Money has become a substitute for quality. That would kill the Internet, as in Kaput Internet!
Next, a huge search engine took over the Internet: Alta Vista. It was a powerful search engine, capable of indexing and listing millions of web pages. Alta Vista shed light on huge dark areas of the Net, invisible before the search engine came to life. Alta Vista was a quantitative powerhouse, but a poor performer when it came to the quality of the search results. I remember searches where Alta Vista listed one URL over and over again! I remember one web page listed more than twenty times; the listing included every modification of the page! The relevancy of the search was also very poor in the old Alta Vista. (By the way, Alta Vista is now part of Yahoo search and uses the same search technology: Inktomi.)
Then, a smaller search engine defined the concept of relevancy: Hotbot. I remember Hotbot winning all the awards from a notable computing publication, PC Magazine. It was still the pioneering era of searching on the Internet.
The Internet was taken by storm with the introduction of Google and its famous Beta. Google picked up where Hotbot left off. Relevancy became the major focus of the search technology. Only Google insiders know the exact algorithm. It is assumed that the keystone is back-linking. That is, if a web page (or its parent site) is linked from many other pages, that page must have merit: if it is good, other people refer other Net surfers to the same resource. The concept does have a logical foundation.
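Google's actual algorithm is secret, as noted above, but the back-linking idea has a well-known textbook form: PageRank. The following is a minimal power-iteration sketch of that idea, purely illustrative and in no way Google's real code:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns a rough importance score per page (textbook power iteration)."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small baseline score...
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outs in links.items():
            if outs:
                # ...and passes the rest of its rank to the pages it links to
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:
                # dangling page: spread its rank evenly over all pages
                for target in pages:
                    new[target] += damping * rank[page] / len(pages)
        rank = new
    return rank
```

A page linked from many others ends up with a higher score, which is exactly the "merit by referral" logic the paragraph above describes.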
One of my web pages ranks very high on the keyword deviation standard. Yet it ranks very low on the keyword standard deviation! Isn't that the very same concept? It sure is. In both cases, my page offers the most comprehensive treatise on the subject, including pertinent (free) software. The page is also naturally integrated into a web site of related materials.
The above anomaly is also caused by the scholastic syndrome. The search engines assume that the established educational outlets must be the best at treating a subject. Problem is, the schools tend to be overly conservative, even mummified. Most advancements in knowledge have occurred outside traditional institutions. As a matter of fact, mummified education makes advancement in knowledge impossible. As in that Pink Floyd song:
We don't need no education,
We don't need no thought control!
There are several problems with back-linking, especially after many web authors "discovered" the Google algorithm. First, the back-linking can be the result of pay-for-inclusion. Again, money speaks, not quality; therefore the relevancy can be a moot point. Second, more and more web authors exchange links. One thousand good friends but lousy web authors can beat, at any time, one genius web author. The back-linking has seriously damaged the relevancy of Internet searching. I have stumbled upon miserable web pages with high rankings in all major search engines. Such pages of misery succeed in ranking high because of keyword manipulation and back-linking.
Further improvements of the moment
Heuristics: I wrote previously about a serious weakness of the search engines. For example, they treated "search" and "searches" as totally different concepts. I tried the search engines on keywords that my web site deals with. I noticed that lexicographic and lexicographical were treated as totally different concepts. Logically, such terms should have been treated as the same logical entity. Thus, heuristic (logical) grouping of concepts (keywords) should be an urgent priority. I have noticed, lately, that Google has improved on that key paradigm.
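The grouping asked for above is what information retrieval calls stemming. A real engine would use a proper algorithm such as Porter's stemmer; the tiny suffix list below is invented purely for illustration:

```python
def crude_stem(word):
    """Very crude suffix-stripping stemmer (illustration only).
    Maps "searches" -> "search" and "lexicographical" -> "lexicographic"."""
    for suffix in ("es", "s", "al"):
        # keep at least a 3-letter stem so short words are left alone
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word
```

With even this toy rule, "search" and "searches" land on the same index entry, so a query for one would find pages containing the other.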
Integration: the book paradigm. Apparently, the search engines favor pages with very short content: one page with the keywords repeated many times, instead of the book paradigm. The book paradigm is a collection of related pages: several pages at the same site dealing with the keywords. More pages dedicated to the subject mean a more thorough analysis of the respective topic. The book model could also seriously impede spamming. Laziness creates short web pages with the keywords repeated again and again, without any meaning. Writing several pages dedicated to the same topic and closely related topics indicates seriousness. It is a good indicator that the keywords are treated in a more thorough manner. The integration should be counted only within the same web site. Otherwise, it would be very easy for any sucker to write down a few lines of keywords and then offer hundreds of links to external web pages of high quality! Every web idiot would rank higher than geniuses who don't link to any external websites! It takes more quality effort to write a book than to scribble a sketchy page. The effort is even higher if the book is accompanied by a CD (e.g. a site dedicated to software downloads). Complexity should be valued, not punished.
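As a sketch of the book paradigm: group keyword-matching pages by host, so a site that treats a topic across several of its own pages outscores a lone keyword-stuffed page. The function and its scoring are my own invention, not any engine's actual ranking:

```python
from collections import defaultdict
from urllib.parse import urlparse

def book_score(matching_urls):
    """Count keyword-matching pages per site (the "book" signal).
    matching_urls: URLs of pages that match the query.
    Returns host -> number of matching pages at that host."""
    pages_per_site = defaultdict(int)
    for url in matching_urls:
        # only same-site pages count, so outbound link farms gain nothing
        pages_per_site[urlparse(url).netloc] += 1
    return dict(pages_per_site)
```

A ranking could then boost each page by its site's count, rewarding the "book" and ignoring links pointing off-site.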
The future of search engines
I do not believe that the search engines have a long-lived future, kind of like the portals (directories). The portals made a lot of noise for a couple of years. Not any longer: every Internet service provider is a portal nowadays. In the near term, I foresee high-quality search engines installed by just about every Internet service provider. It is not hard at all to create unbiased, relevant, high-quality search technology. Searching has long been a major function of databases. The word processors, too, rely heavily on searching. There is a lot of search knowledge, and experience, out there.
Look at the search facility of my web site. I still don't call it a search engine; it only indexes my own web pages. I agree with close to 100% of the relevancy of searches of my web site. If such technology could be expanded to index the entire Internet, it could become a high-quality Internet search engine. Perhaps some programmer of Internet searching will decide to open-source his or her script or program. Hundreds, if not thousands, of Internet programmers could chip in and improve the original to the highest quality of web searching. I name that level:
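A site-wide search facility of the kind described can be sketched as a tiny inverted index. This is a generic illustration, not the actual script running on the site:

```python
import re
from collections import defaultdict

def build_index(pages):
    """pages: dict mapping URL -> page text.
    Returns an inverted index: word -> set of URLs containing it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in re.findall(r"[a-z]+", text.lower()):
            index[word].add(url)
    return index

def search(index, query):
    """Return the URLs containing every query word (simple AND search)."""
    word_sets = [index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*word_sets) if word_sets else set()
```

Open-sourced, a core this small would indeed be easy for many programmers to extend: add stemming, ranking, a crawler, and it starts to look like a real engine.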
The Super Search Engine of the Internet
At this point of Internet infancy - the year of grace 2006 - the three major Internet search engines look very much alike. It looks like Google, Yahoo, and MSN apply the same search algorithm to the same search index. It also looks like they are now applying my concept of the book paradigm - to some extent. I can teach them even better tricks...
Read more related pages:
Doctor in Occult Science of Searching (OssD)