
Re: [tor-talk] Scaling Tor



On 8/19/14, isis <isis@xxxxxxxxxxxxxx> wrote:
> Mike Fikuart transcribed 4.8K bytes:
>> A question raised in Tor-Design (section 9) is, "if clients can no longer
>> have a complete picture of the network, how can they perform discovery
>> while preventing attackers from manipulating or exploiting gaps in their
>> knowledge?"  If the network were to scale up to a significant fraction
>> of all Internet users, could the Directory Authority(ies) release (to
>> Directory Caches and clients) a uniform random sample of relays/nodes
>> from the FULL set of nodes, such that the randomness of path selection
>> is still maintained?  The random selection could be sampled on a
>> per-client basis, with a sample about the size of what is currently
>> downloaded (~6000 relays).  This means each client (or possibly each
>> grouping of clients) gets a different "view" of the network, but there
>> would need to be a scaling down from the full set to the sample set at
>> some point before the client.  Any thoughts on the idea?
>
> This is an interesting idea. Variants using random walks through nodes
> which only know a random subset of other nodes have been proposed
> before, e.g. MorphMix. [0]
>
> However, it is impossible to verify that a given sequence is, in fact,
> random, rather than a sequence seeded in such a way that it is
> predictable, or an encrypted sequence, etc. The biggest concern with
> improving Tor's scalability by handing out random samples of nodes from
> the consensus is that malicious Directories (whether Authorities or
> simply mirrors) could collude to hand out predictable subsets of relays
> to some/all clients.
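To make the concern concrete, here is a minimal sketch (all names hypothetical) of handing each client a seeded "random" sample of the relay list. It also illustrates the problem isis describes: whoever chooses the seed chooses the view, and the client cannot tell a uniform draw from an adversarial one.

```python
import random

def sample_view(all_relays, sample_size, client_seed):
    """Return one client's 'view' of the network: a subset of
    relays drawn from the full list using the given seed.

    Hypothetical sketch only.  If a malicious Directory controls
    client_seed, it controls exactly which relays the client sees,
    and the output is indistinguishable from a fair sample.
    """
    rng = random.Random(client_seed)
    return rng.sample(all_relays, sample_size)

# A full network far larger than any single client's view.
relays = ["relay%d" % i for i in range(100000)]

# Two clients, two different seeds, two different views.
view_a = sample_view(relays, 6000, client_seed=1)
view_b = sample_view(relays, 6000, client_seed=2)
```

Each view is the size of today's full download (~6000 relays), but nothing in the sample itself proves it was drawn uniformly.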

What is the churn rate of nodes entering and exiting the network?
(This would need to be expressed as a rate relative to the total node
count.)

I'm wondering whether a git-like protocol might make sense: one or more
trusted centralised Directories would publish something like a "signed
git tag", i.e. a signature over the directory contents at specific
points in time (say, each hour).

The bulk of the directory data could then 'simply' be spread P2P, with
clients verifying whatever they receive against the latest signed tag.
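As a rough sketch of that split (all names hypothetical, and with an HMAC standing in for a real public-key signature such as Ed25519): the trusted Directory publishes only a small signed digest each hour, while the relay list itself can come from any untrusted peer.

```python
import hashlib
import hmac

# Stand-in for the Directory's signing key; a real design would use
# an asymmetric signature so clients hold only the public half.
DIRECTORY_KEY = b"hypothetical directory signing key"

def snapshot_digest(relay_list):
    # Hash the full directory contents, git-style: the bulk data can
    # be fetched from any peer, since any tampering changes the hash.
    h = hashlib.sha256()
    for relay in sorted(relay_list):
        h.update(relay.encode() + b"\n")
    return h.hexdigest()

def sign_snapshot(digest, timestamp):
    # The hourly "signed tag": a small signature over (time, digest)
    # that the trusted Directory publishes.
    msg = ("%s:%s" % (timestamp, digest)).encode()
    return hmac.new(DIRECTORY_KEY, msg, hashlib.sha256).hexdigest()

def verify_snapshot(relay_list, timestamp, tag):
    # A client that fetched the bulk relay list P2P checks it
    # against the Directory's signed tag for that hour.
    expected = sign_snapshot(snapshot_digest(relay_list), timestamp)
    return hmac.compare_digest(expected, tag)

# Directory side: publish one small tag per hour.
relays = ["relayA", "relayB", "relayC"]
tag = sign_snapshot(snapshot_digest(relays), "2014-08-20T00:00Z")
```

A client would then accept a P2P-fetched list only if `verify_snapshot(list, timestamp, tag)` succeeds, so the centralised Directories carry only the signatures, not the bulk data.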

Sorry if this doesn't make sense - I have not read the papers you
linked yet.

Regards
Zenaan
-- 
tor-talk mailing list - tor-talk@xxxxxxxxxxxxxxxxxxxx
To unsubscribe or change other settings go to
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-talk