Saturday, December 13, 2014

How will Google be able to index the internet of the future?

Google is probably the smartest company in the world. It has indexed more than 60 trillion pages, and the number grows every second. It is estimated that in 2019, 4 years from now, the internet will double in size every three days. Hard disks don't double in size every three days. They double every 2 years. Google can't build more and more data centers every day to keep up with all this growth. What will happen? I think that Google will become even more selective about which sites or blogs get indexed and which remain out of sight and out of the index. It will be impossible, even for Google, to handle trillions or maybe quadrillions of pages of content.
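To see why those two doubling rates are so mismatched, here is a minimal back-of-the-envelope sketch (the rates are the ones quoted above; the one-year horizon is just for illustration):

```python
# Compare the two doubling rates mentioned above over one year:
# data doubling every 3 days vs. disk capacity doubling every 2 years.
days_per_year = 365

data_doublings_per_year = days_per_year / 3   # ~121.7 doublings
disk_doublings_per_year = 1 / 2               # 0.5 doublings

data_growth = 2 ** data_doublings_per_year    # astronomically large
disk_growth = 2 ** disk_doublings_per_year    # about 1.41x

print(f"Data grows by a factor of 2^{data_doublings_per_year:.1f} per year")
print(f"Disks grow by a factor of {disk_growth:.2f} per year")
```

In one year the data would multiply by roughly 2^122 while disk capacity grows by about 41%, which is why storage could never keep up at those rates.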

Google has its limits.

I think the internet of the future will be even more chaotic than it is today. The competition for web traffic will go from extreme to insane. Most blogs will simply get no traffic. Human writers will be replaced by machine-written articles. I read about one company called Wordsmith that says its goal is to get one single visitor to each page it writes. In 2014 it published 300 million unique pages of content, written by robots, in order to get 300 million visitors to its articles. In 2015 it wants to write one billion unique pages.

Can you see what the future is going to be like on the internet and for search engines? Machines will be able to write interesting stories based on all the numbers published across the internet. New statistics released? The robots are ready to interpret those numbers and express in words what they mean.
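The simplest form of this idea is template-based generation: plug numbers into sentence templates. Here is a minimal sketch in that spirit (the metric names, figures, and template are invented for illustration; real systems like the one described above are far more sophisticated):

```python
# Minimal sketch of turning raw statistics into readable sentences,
# the basic idea behind machine-written data stories.

def describe_change(metric, old, new):
    """Turn a pair of numbers into a one-sentence summary."""
    change = (new - old) / old * 100
    direction = "rose" if change > 0 else "fell"
    return f"{metric} {direction} {abs(change):.1f}% from {old} to {new}."

# Hypothetical input data: (metric name, previous value, current value)
stats = [("Quarterly revenue", 120, 150), ("Page views", 900, 720)]

for metric, old, new in stats:
    print(describe_change(metric, old, new))
```

Generate one such page per data set and you can see how a system could produce millions of unique articles with no human writer involved.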

Let’s wait and see what happens.
