Search engine challenges

The Next Google

I came across an article by a writer named DKB called The Next Google. It describes how some innovators, aiming to build the next great search engine, are doing more than serving a list of links from an indexed database. The first search engine discussed, Kagi, offers toggle switches that activate (or deactivate) searches of videos, news sites, discussion boards, Wikipedia, and listicle-type articles. There is also a feature called Kagi Instant Answers which, apparently driven by Kagi AI, helps you "find exactly what you are looking for". A lens filter can direct searches to academic domains (with a .edu TLD), discussion sites, or non-commercial sites. The concept is still very much a work in progress, but the goal is to add some structure to search results.

Then there's Neeva, which calls itself the Everything Engine. The project was started by a former Google employee and a couple of others, with the aim of being ad-free and private. Of course, those qualities are of little value if the search results are crummy. In an interview, the founders describe how they gauge authority by whether Reddit posts or (for tech topics) Stack Exchange answers point to the article in question. In medicine this would be more challenging: one could design a system based on journal impact, such as the SJR ranking, but sometimes the most important data are so fresh that they exist only in abstract form until something more definitive is available. What determines impact and importance is often the opinion of thought leaders on the research result. How will that be indexed?

How can we identify quality items?

Other up-and-coming search engines are also discussed, each trying to improve on Google by promising better privacy and customization and by emphasizing content from sources considered more "trusted"; these strategies still have to play out before we know whether they succeed. But how do we tease out what is important in a clinical context? I don't see how to search for and assign an impact score to an abstract based on opinion pieces that appear after major conferences. Maybe some enterprising search engine will be able to tag these opinion pieces with something like Google's Tag My Knowledge so that they show up in organic searches.

Getting notified of new developments

This developer wanted to be notified whenever a particular app had a new release, much like how, as a physician, I might want to be notified of something new and important in cancer management. It turns out the solution for him was simple: it was built into GitHub. I wish there were something similar for medicine (especially oncology).

Science is hard - or the important stuff is getting diluted with non-important stuff

And finally, an article that argues that Science (with a capital S) is getting harder, in the sense that we're seeing fewer "big" discoveries and more little ones. We're not seeing as many research papers describing Nobel prize-winning work as before. As the literature fills with lower-impact articles, a decreasing percentage of papers end up among the most cited. Reasons for this are not given, but the article notes that scientists "all seem to have an increasing preference for the work of the past, relative to the present". There is also the "burden of knowledge": new discoveries require ever more prior knowledge. If there is more chaff than wheat, we really need something that helps us home in on quality research data.