This slideshow – A History of Google Algorithm Updates – reminds us of the ways Google search has changed over nearly 15 years. It’s quite stunning and we can only wonder – what next? The timeline was created by DPFOC, an online marketing agency offering SEO services.
It’s not clear what this piece intends – to show how search engines process links, or to tell people how to block the crawlers.
How Search Engines Process Links, Jenny Halasz, Search Engine Land (Apr 13)
Personalizing results is all about context – where you are, what you looked at before, and probably several other factors, which Google Now is determined to identify. This is especially true for activity on smartphones. And it’s not just Google – context is gathered in apps, in the Fitbit you wear, on Facebook, everywhere – as you will read in this article.
The Future Of Search Engines Is Context, Aaron Friedman, Search Engine Land (Apr 6)
In this patent, Google describes a prototype for surfacing content from structured databases.
How Google May Index Deep Web Entities, by Bill Slawski, SEO by the Sea (Apr 5)
Bill Slawski presents this takeaway:
If you’ve been looking for a connection between the SEO of web-page crawling and the use of data from sources like knowledge bases, this paper describes such a connection – using data from a knowledge base such as Freebase to query the content of a deep-web database, such as an e-commerce site whose content doesn’t surface to be crawled unless it is queried first.
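The idea can be sketched in a few lines: take entity names from a knowledge base and submit them as queries against a site whose content only appears in response to a search. The entity list and product catalog below are invented stand-ins – real crawling would issue HTTP requests against the site’s search form.

```python
# Toy sketch of knowledge-base-driven deep-web crawling.
# The entities and catalog are hypothetical examples.
KNOWLEDGE_BASE_ENTITIES = ["acoustic guitar", "electric guitar", "ukulele"]

# Stand-in for an e-commerce site's search endpoint, whose pages
# are hidden until someone queries for them.
PRODUCT_CATALOG = {
    "acoustic guitar": ["Yamaha FG800", "Fender CD-60S"],
    "ukulele": ["Kala KA-15S"],
}

def query_deep_web(entity):
    """Stand-in for submitting a query to the site's search form."""
    return PRODUCT_CATALOG.get(entity, [])

# "Crawl" by querying: each entity name surfaces otherwise-hidden pages.
index = {}
for entity in KNOWLEDGE_BASE_ENTITIES:
    for product in query_deep_web(entity):
        index[product] = entity

print(index)
```

Entities with no matching content (here, “electric guitar”) simply surface nothing, which is exactly the situation a conventional crawler faces for the whole site.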
Microsoft identifies entities and expands on them to improve Bing search results – or so it seems from this patent – “Query Expansion, Filtering and Ranking for Improved Semantic Search Results Utilizing Knowledge Graphs”.
How Bing May Expand Queries Based Upon Finding Entities Within Them, Bill Slawski, SEO by the Sea (Apr 3)
“The patent is telling us that it might provide improved search results by expanding queries using information about entities involved.”
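In rough outline, entity-based query expansion looks something like the sketch below. The mini knowledge graph and its entries are invented for illustration; Bing’s actual data and ranking pipeline are not public.

```python
# Toy sketch of entity-based query expansion: spot a known entity in
# the query and append terms related to it. The graph is hypothetical.
KNOWLEDGE_GRAPH = {
    "mozart": {"type": "composer",
               "related": ["classical music", "symphony", "Salzburg"]},
    "python": {"type": "programming language",
               "related": ["scripting", "CPython"]},
}

def expand_query(query):
    """Find known entities in the query and append their related terms."""
    expansions = []
    for term in query.lower().split():
        entity = KNOWLEDGE_GRAPH.get(term)
        if entity:
            expansions.extend(entity["related"])
    return query if not expansions else query + " " + " ".join(expansions)

print(expand_query("mozart biography"))
# mozart biography classical music symphony Salzburg
```

The expanded query can then be run against the index as usual, letting documents that never mention the literal query terms still rank for the entity.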
Blekko, a modest-sized search engine with a distinctive approach to indexing curated sites, has been acquired by IBM for its Watson team. Blekko’s volunteers identified quality sites (thereby keeping out spam) and classified them with #slashtags, which searchers could use for more precise searches. There was more to it – and it is likely the “more” that IBM Watson wanted for its work in cognitive computing.
“Blekko brings advanced Web-crawling, categorization and intelligent filtering technology. Its technology crawls the Web continually and gathers information from the most highly relevant and most credible Web pages. It uses classification techniques to create thousands of topical categories, making that data more useful and insightful.”
[From Data, Data Everywhere – Now a better way to understand it, Building a smarter planet, Mar 27]
Matt McGee at Search Engine Land recaps Blekko’s short life, from 2008 to the present: Goodbye Blekko: Search Engine Joins IBM’s Watson Team
Google had developed the means to identify and employ entity analysis at least as far back as 2013, as this posting by Bill Slawski shows. The purpose of the patent he describes is “to provide a factual response to a query showing different aspects related to a ‘single conceptual entity.’”
Google’s Knowledge Cards by Bill Slawski, SEO by the Sea (Mar 18)
Knowledge cards assemble “name, description, image, facts and related searches.”
A fascinating examination of a Google patent on determining facts about a topic from patterns on Web pages.
Google On Crawling The Web Of Data by Bill Slawski, SEO by the Sea (Feb 22)
This type of pattern-matching and fact extraction is part of how Google uses the Web as a database of information. By extracting facts and storing them in a data repository like Google’s Knowledge Graph, Google can serve those facts as direct answers.
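A minimal sketch of that kind of pattern-based extraction: match recurring textual patterns on pages and store the resulting (entity, attribute, value) triples. The patterns and sample text below are invented for illustration – the patent’s actual machinery is far richer.

```python
import re

# Hypothetical extraction patterns: each pairs a regular expression
# with the attribute name the match supplies.
PATTERNS = [
    (re.compile(r"(?P<entity>[A-Z][\w ]+?) was born in (?P<value>[A-Z][\w ]+)"),
     "birthplace"),
    (re.compile(r"(?P<entity>[A-Z][\w ]+?) is the capital of (?P<value>[A-Z][\w ]+)"),
     "capital_of"),
]

def extract_facts(page_text):
    """Return (entity, attribute, value) triples found in the text."""
    facts = []
    for pattern, attribute in PATTERNS:
        for match in pattern.finditer(page_text):
            facts.append((match.group("entity"), attribute, match.group("value")))
    return facts

page = "Ada Lovelace was born in London. Helsinki is the capital of Finland."
for triple in extract_facts(page):
    print(triple)
```

Once stored in a repository, a triple like (“Ada Lovelace”, birthplace, “London”) can be returned directly in answer to a query, with no page visit required.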
DARPA, part of the US Department of Defense, has launched a new search engine named Memex, which is intended to expose the “dark” web of hidden content.
Darpa Is Developing a Search Engine for the Dark Web by Kim Zetter, Wired (Feb 10)
The search engine was described and demoed on 60 Minutes. The segment runs only five minutes; you can view it from CBS – DARPA: Nobody’s safe on the Internet. Mind – the objective is to help law enforcement track down crime, and to do so through data mining.
Developers haven’t given up on data visualization for the search interface. Etsimo in Finland is a new contender with its SciNet interface, which is available as a demo. This TechCrunch article includes a short video.
This Search Engine Wants More Human Input, TechCrunch (Feb 2)
The SciNet approach to the increasingly hard problem of effective search is to involve the human user more by having them steer the algorithmic results — by signaling multiple intents as the process progresses. This generates a dynamic and visible spectrum of results — depending on what they are looking for, or interested in — and allows them to selectively drill down into complex queries in an informed, and self-guided way. The basic idea being that human-steered results are better than algorithms alone.
Visual interfaces never seem to grab users. It will be interesting to see whether this company succeeds.