
Re: Broadband Reports: Tor Network Bogged Down by P2P



Roger Dingledine wrote:

On Mon, Oct 17, 2005 at 11:15:49PM -0400, Dustin Eward wrote:


It's time for Tor to expand, not regulate. And if expansion isn't possible, just let it suck! I can't imagine many fileshare people will cleave unto dial-up speeds with their broadband... Once they learn that it sucks to use Tor, they'll stop. We need knee-jerk decisions in this project like we need knee-jerk political actions...



I like this approach -- let Tor regulate itself: when it gets too overloaded, people will leave until it works again.

There's a potential problem with it, though, when different users place
different value on their time and on their security.

So one failure mode would be if the filesharers care more about being
anonymous than the web users. Then they'd just leave it running in the
background, and not mind that it takes 6 days to download a file rather
than 6 hours, because eventually it arrives.

The current Tor network clearly can't sustain very many of these people,
but if all the web users give up first, then "let it suck" means most
of our users go away.

It remains to be seen whether this will happen, of course, but we're
trying to come up with some useful technical approaches too. :)

--Roger




I see your point, but you miss something important. Most file-share people don't leave their computers on 24x7. When downloads take that long, they usually NEVER finish. Most are also Windows users. Their systems are flooded with spyware and general bad design; they just don't have that kind of uptime.

Even bandwidth regulation is a research concept that can have effects on other aspects of Tor. I'm not against the idea. The greatest scientific discoveries are followed by "Hey, that's weird, whoda thunk it?" They aren't planned.

Trying to gestapo Tor into submission is kinda defeating the concept, I think... But giving it some thought could teach us all how our hidden services could be attacked. Which raises the question: a hidden service obviously pulls more pipe than a client, at least if there were no fileshare people. Couldn't you just put 10 computers on a Linksys, have them all keep hitting that hidden service, and make exposure by traffic analysis pretty easy? How would bandwidth limits affect hidden services?
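For reference, Tor's torrc already exposes the kind of bandwidth knobs this thread is talking about; a minimal sketch of the rate-limiting options (the values here are just illustrative, not recommendations):

```
# torrc fragment: token-bucket rate limiting on a Tor node
# (example values only -- pick numbers appropriate for your link)
BandwidthRate 100 KB    # long-term average bandwidth allowed
BandwidthBurst 200 KB   # short-term burst allowance above the average
```

These limits apply to the node as a whole, so a hidden service running behind such a node would be throttled along with everything else it relays.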

I talk too much....