How can we measure the speed of the Internet?
All of the above criteria measure the speed of delivery of information that has already been selected for delivery from A to B.
But it has been said (the claim is sometimes called Metcalfe's law) that the power of a network is N squared, i.e. that its value is proportional to the number of potential senders (N) multiplied by the number of potential receivers (also N).
For the network that we call the Internet, N is now a very large number, in the hundreds of millions. No one person is ever going to connect to all the other people on the Internet, which is what "N squared" implies.
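The scale of the "N squared" claim is easy to put in concrete terms. A minimal sketch, where the network sizes are assumptions chosen purely for illustration:

```python
# Illustration of the "N squared" claim: the number of potential
# sender-receiver pairs grows with the square of the number of users.
# The network sizes below are illustrative assumptions, not measurements.
for n_users in (1_000, 1_000_000, 500_000_000):
    potential_pairs = n_users * n_users  # every sender times every receiver
    print(f"{n_users:>11,} users -> {potential_pairs:.1e} potential sender-receiver pairs")
```

At hundreds of millions of users, the number of potential connections runs to the hundreds of quadrillions, which is why no individual can ever realise more than a tiny fraction of them.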
Sometimes on the Internet you think that you are very close to unlimited amounts of data, because all you need to do to retrieve it is type in a short URL of perhaps a few dozen characters. A URL seems short if you measure how long it takes to type it. But if you measured how long it would take to guess an unknown URL, you realise that a typical URL is very long. Even a URL consisting of a single English word is very long, in this sense, if we consider how long it would take if we didn't know what the word was and we had to try each word in the dictionary until we found the one we were looking for.
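The asymmetry between typing a known URL and guessing an unknown one can be made concrete. A rough sketch, in which the URL length, typing speed, dictionary size, and guessing rate are all assumptions for illustration:

```python
# Rough comparison: typing a known URL vs. guessing an unknown one-word URL.
# All the numbers below are illustrative assumptions, not measurements.

url_length = 30            # characters in a typical URL
typing_speed = 4           # characters typed per second
time_to_type = url_length / typing_speed  # seconds, if you know the URL

dictionary_size = 100_000  # English words you might have to try
guesses_per_second = 1     # trying one candidate word at a time
# On average you find the right word halfway through the dictionary.
time_to_guess = (dictionary_size / 2) / guesses_per_second  # seconds

print(f"typing a known URL:      ~{time_to_type:.0f} seconds")
print(f"guessing a one-word URL: ~{time_to_guess / 3600:.0f} hours on average")
```

Measured by typing time the URL is a few seconds long; measured by guessing time, even a single dictionary word is many hours long.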
There are two standard methods for solving the long URL problem: you can search for pages by typing keywords into a search engine, or you can follow links from pages that you have already found.
You can also mix these two methods together: you might search for links pages for a given topic, and then follow up recommended links. But none of these methods is guaranteed to find all documents that are of potential interest to you.
Next time you are web surfing, and you come across a new interesting web page, try to find out, if you can, how old that web page is. The page itself might give a "last modified date", or the "Page Info" option in your browser might tell you. If you can find out the date at which the page was published, think about whether the page would have been interesting to you if you had been informed of its existence as soon as it was published.
Suppose, for example, that an interesting web page that you find is one year old, and you would have been interested in it one year ago. Then there is a sense in which that web page has taken one year to travel from the web server it is posted on to the web browser that you are reading it on.
Now consider that there might be a larger set of similar pages that you have not read at all yet. If you have read only five per cent of the web pages in a certain set, and the average age of the ones you have read is one year, then the average delivery time for that type of web page is one year divided by five per cent, i.e. twenty years (assuming a simple extrapolation of your current rate of page discovery into the future).
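The extrapolation above can be sketched as a calculation, using the figures from the example in the text:

```python
# Extrapolating average delivery time from a partially-read set of pages,
# using the figures from the example above.
fraction_read = 0.05      # you have read five per cent of the set
avg_age_read_years = 1.0  # average age of the pages you have read

# If your discovery rate stays constant, the average page in the whole
# set will take proportionally longer to reach you.
avg_delivery_years = avg_age_read_years / fraction_read
print(f"average delivery time: {avg_delivery_years:.0f} years")
```

The same division applies whatever figures you plug in: the smaller the fraction of the set you have managed to discover, the longer the effective delivery time for that kind of page.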
So sometimes the Internet is very, very slow.
There is an upside to these dismal figures. If the Internet really is that slow, then there remains an enormous potential for speeding it up. If new technologies can be invented that reduce twenty years down to a figure closer to the limits currently imposed by hardware capabilities, e.g. a few minutes, then the final effect on human thought and human achievement is potentially mind-boggling.