predictions: What's the future of wireless?


admin

With all the amazing new WiFi gadgets coming out, like the Nokia N810, Apple iPhone, iPod Touch, and Archos 605 WiFi, you may still feel underwhelmed by your WiFi options. The infrastructure hasn't caught up to the hardware.

What do you think of the future of wireless? When will we have ubiquitous, affordable wireless broadband?

This post was edited by Erik on 10/22/2007 1:03 PM
moderator
If you look at what the Freifunk guys are doing in Germany, and the way distributed networking is taking off (BitTorrent, Joost, etc.), I wouldn't be surprised if web browsing looked very different within a few years.

To give a rough outline, imagine if all web content were not only distributed but aware of its location. If I wanted to grab a copy of the NYtimes website, I would grab it from the nearest source, and everything you download would become a cache for everyone else. Instead of taking 15 hops around the globe to get a PDF, I might be able to take two hops and get it from a PC/mobile/laptop in my town. This would certainly speed up content delivery for popular websites and would almost certainly render the Digg/Slashdot effect meaningless (as with BitTorrent, the more popular a file is, the easier it is to find).
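
Just to make that concrete, here's a rough Python sketch of the sort of lookup I have in mind. The names and the candidate list are all made up, and this isn't how BitTorrent or Freifunk actually work; it's just the idea of preferring the lowest-latency copy of a page.

    # Toy sketch: pick the closest available copy of a page. The list of
    # candidate peers and their latencies is hypothetical -- in practice
    # you'd need some kind of lookup/announce protocol to discover them.
    def pick_source(candidates, origin, origin_latency):
        """Return the address with the lowest estimated latency.

        candidates: list of (latency_seconds, peer_address) tuples for
                    peers that claim to hold a cached copy.
        origin:     the server that originally hosts the page.
        """
        best_latency, best_addr = origin_latency, origin
        for latency, addr in candidates:
            if latency < best_latency:
                best_latency, best_addr = latency, addr
        return best_addr

    # Two hops to a PC in my town beats fifteen hops to the origin.
    source = pick_source(
        candidates=[(0.01, "pc-down-the-street"), (0.3, "laptop-next-suburb")],
        origin="nytimes-origin-server",
        origin_latency=5.0,
    )
    print(source)  # -> pc-down-the-street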

Another advantage of this system is that once demand for the content declined, requests would route back to the server that originally hosted it, so there would always be a copy available, and access would be very fast while the content was popular.

There would, of course, need to be protection built into the system against man-in-the-middle attacks and faked content. But if the P2P networks can solve this problem, I think the solution can be applied on a wider scale to web browsing.
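
As a rough idea of what that protection could look like (this is just my assumption, not how any existing P2P network does it), the origin server could publish a small hash for each page, and clients would check a peer's copy against it before trusting it:

    import hashlib

    # Toy sketch: only accept a peer's copy of a page if its SHA-256 digest
    # matches a digest published by the origin server. The fetch_* callables
    # are hypothetical placeholders, not a real API.
    def verify_peer_copy(peer_bytes, trusted_digest_hex):
        """True if the peer's copy matches the origin's published digest."""
        return hashlib.sha256(peer_bytes).hexdigest() == trusted_digest_hex

    def fetch_page(url, peers, fetch_digest_from_origin, fetch_from_peer,
                   fetch_from_origin):
        trusted = fetch_digest_from_origin(url)   # tiny request to the origin
        for peer in peers:
            copy = fetch_from_peer(peer, url)     # big transfer, but nearby
            if copy is not None and verify_peer_copy(copy, trusted):
                return copy
        return fetch_from_origin(url)             # fall back to the origin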

One problem I'm not sure how to solve is the age of cached copies. I remember hearing that the Wikipedia guys had a problem because their webservers were located in the US and their cache servers were in Europe (maybe it was the other way around). The reason they did this was to cut the latency of sending packets across the Atlantic, but having two copies of the same data meant one could become out of date. I can't remember how the issue was solved...

If I (A) post this message now, someone (B, who is in, say, Australia) logging into this server will see it. Now say you (C) update this page with your content after B has downloaded the page with my post. A little while later, D (who is in B's town) wants to grab this page. He sees that it will take 5 seconds to grab the page from the original server, but there is a copy only 0.01 seconds away in B's cache, and that copy is, say, 30 seconds old. Does he grab the nearby old copy or the fresh one from the server? If he grabs the fresh copy, that's no different from how things work now. If he grabs the old copy, he has missed out on the post you (C) made.
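
To put D's choice in code, here's a toy rule with a made-up max_staleness threshold; it doesn't solve anything, it just makes the tradeoff explicit:

    # Toy sketch of D's decision: take the nearby cached copy only if it is
    # "recent enough", otherwise pay the latency and go to the origin. The
    # 30-second threshold is arbitrary -- picking it well is exactly the
    # part I don't know how to do.
    def choose_copy(cache_age_s, cache_latency_s, origin_latency_s,
                    max_staleness_s=30):
        """Return 'cache' or 'origin' for a single page request."""
        if cache_age_s <= max_staleness_s and cache_latency_s < origin_latency_s:
            return "cache"   # fast, but may be missing C's newer post
        return "origin"      # always current, but 5 seconds away

    print(choose_copy(cache_age_s=30, cache_latency_s=0.01, origin_latency_s=5.0))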

I'm not sure how this can be solved: either we accept stale content or we go back to overloading the original webservers, and neither is a good option.
This post was edited by tyrion on 10/22/2007 1:35 PM
