Scott Stouffer offers a great read on the Search Engine Journal website about "Using Artificial Intelligence to Solve #SEO". While it is fairly technical, it gives us a glimpse, at the algorithmic level, of why SEO is so difficult and what can be done on the search engine side to resolve this.
Mr. Stouffer talks about Particle Swarm Optimization, an algorithm inspired by the flocking behavior of birds, which is an interesting concept.
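To make the flocking analogy concrete, here is a minimal sketch of Particle Swarm Optimization (this is an illustration of the general technique, not Mr. Stouffer's implementation): a "flock" of candidate solutions moves through the search space, each particle drawn toward its own best-known position and toward the best position found by the whole swarm.

```python
import random

def pso(objective, dim=2, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal Particle Swarm Optimization sketch. Each particle remembers
    its personal best, and the swarm shares a global best, mimicking how
    birds in a flock follow both their own instincts and their neighbors."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends inertia (w), pull toward the particle's
                # personal best (c1), and pull toward the swarm's best (c2).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Example: minimize the sphere function f(x) = sum(x_i^2), whose optimum is 0.
best, best_val = pso(lambda x: sum(v * v for v in x))
```

The swarm converges on the minimum without any central coordinator, which is the same emergent behavior Mr. Stouffer draws on when comparing search optimization to birds flocking.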
According to Mr. Stouffer, one of the biggest reasons SEO is so difficult is the layered path every optimization must travel: "In fact, typical optimizations have to go through four layers: crawling, indexing, scoring, and finally the real-time query layer. Trying to correlate this way is fool's gold.
“In fact, Google actually introduces a significant noise factor, similar to how the U.S. government introduced noise to its GPS constellation, so civilians would not be able to get military-grade accuracy. It’s called the real-time query layer. The query layer is currently acting as a major deterrent for SEO correlation tactics …
“…The query layer is the user’s view of what is going on, not the brand’s. Therefore, correlations found this way will very rarely mean causation. And this is assuming that you are using one tool to source and model your data. Typically, SEOs will use a number of data inputs for their modeling, which only increases this noise and decreases the chances of finding causation.”
Mr. Stouffer offers a solution that search engines could adopt to more closely match queries to results (you'll have to read the article to find this nugget). This may well be the future of search, which will positively impact users, so it will be worthwhile for SEOs to be familiar with these concepts.