Re: [tor-bugs] #8244 [Tor]: The HSDirs for a hidden service should not be predictable indefinitely into the future
#8244: The HSDirs for a hidden service should not be predictable indefinitely into
the future
-----------------------------+-----------------------------------
     Reporter:  arma         |      Owner:
         Type:  enhancement  |     Status:  new
     Priority:  normal       |  Milestone:  Tor: 0.2.5.x-final
    Component:  Tor          |    Version:
   Resolution:               |   Keywords:  tor-hs needs-proposal
Actual Points:               |  Parent ID:
       Points:               |
-----------------------------+-----------------------------------
Comment (by nickm):
I'm posting some comments here from an email thread I've had with Ian
Goldberg and Aniket Kate. I'll excerpt the technical material that's most
relevant to this ticket.
Nick
>Do you have any quick insights on the correct "let's all securely come up
with a random number" scheme? I'd imagine you know a good bit about the
field.
Ian:
>It very strongly depends on your network model. What can you assume
about connectivity, for example? What can you assume about reachability
of the participants? (Are the hosts up? Is the network up?) What is the
adversary model? (Are some of the participants controlled by the
adversary? How many?)
>
>Depending on the answers, it can be as simple as "in each timeslot,
everyone broadcasts a random string (which are XORed) and a hash of their
upcoming random string for the next timeslot", or as complicated as
Aniket's thesis.
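[''For illustration only, a minimal Python sketch of the simple per-timeslot
scheme Ian describes; the 32-byte sizes and function names are assumptions,
not anything Tor implements:'']

    import os, hashlib

    def new_slot_value():
        # Each participant picks a fresh 32-byte random string per timeslot.
        return os.urandom(32)

    def commitment(value):
        # Hash broadcast in slot t, committing to the value that will be
        # revealed in slot t+1.
        return hashlib.sha256(value).hexdigest()

    def combine(revealed, prior_commitments):
        # XOR together every reveal that matches its commitment from the
        # previous timeslot; the result is the shared random string.
        out = bytes(32)
        for who, value in revealed.items():
            if commitment(value) != prior_commitments.get(who):
                continue  # ignore reveals that don't match their commitment
            out = bytes(a ^ b for a, b in zip(out, value))
        return out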
Nick:
>Hm. I think that for Tor directory authorities, our model is something
like, "We assume that some fraction of them might be down; we want the
algorithm to still work even if some fraction of them are compromised; we
don't want compromised directories to be able to choose the output."
>
> In general, we tolerate situations where a rogue authority can make the
algorithm fail to produce a result, if we can point to which authority was
failing.
>
> Implementation complexity is a killer here, too.
Ian:
> I seem to remember that the hourly consensus voting protocol has a
couple of phases that execute synchronously, in ~5-minute windows just
before the hour. Can you remind me of those phases? How suspicious is it
if an authority participates in one phase, but not the last phase? Is it
acceptable for a malicious authority to be able to choose from one of two
output values: one if he participates honestly, and one if he drops out
at the last minute?
Nick:
> [''summary of algorithm phases deleted'']
>
>>How suspicious is it if an authority participates in one phase, but not
the last phase?
>
>Not horribly so if it happens once in a while. If it happened with any
regularity, it would probably be suspicious. [''though we'd probably
suspect a bug rather than malice'']
>
>> Is it acceptable for a malicious authority to be able to choose from
one of two output values: one if he participates honestly, and one if he
drops out at the last minute?
>
>Hm. I don't think it's ideal, but it's certainly better than the status
quo. I'd like to do a little analysis of how much this ability helps an
attacker carry out the various censorship attacks we're trying to prevent
here.
>
>It would be proportionally more troublesome if (say) three colluding
authorities could pick any one of eight different values (each of the three
can reveal or drop out independently, giving 2^3 choices). The same analysis
would be needed there.
>
>I assume you're thinking of a system where honest authorities do a
commitment in phase 1, reveal their secrets in some new phase 1C, and the
shared secret is just (say) the hash of all the revealed secrets?
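[''Another hedged sketch, this time of the commit-in-phase-1, reveal-in-
"phase 1C" shape Nick describes, with the shared secret taken as a hash of
the valid reveals; the phase names and helpers here are hypothetical:'']

    import os, hashlib

    def phase1_commit():
        # Phase 1: each authority picks a secret and publishes H(secret)
        # in its vote.
        secret = os.urandom(32)
        return secret, hashlib.sha256(secret).digest()

    def phase1c_combine(revealed, commitments):
        # "Phase 1C": keep only the reveals that match their phase-1
        # commitments, then hash them together in a fixed order.
        h = hashlib.sha256()
        for authority in sorted(revealed):
            reveal = revealed[authority]
            if hashlib.sha256(reveal).digest() == commitments.get(authority):
                h.update(reveal)
        return h.digest()  # shared random value for this period

[''Note that in this shape a dishonest authority still sees the other
reveals before deciding whether to publish its own, which is exactly the
choice between two outputs that Ian asks about above.'']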
Ian:
> That indeed would be the implementation-simple approach.
>
>There's also the possibility of just using the authorities' signatures on
the consensus *as* their values. But then you'd need to be using a unique
signature scheme (there's only one valid signature for any (key,message)
pair); is that the case now?
Nick:
> We're using RSA2048-PKCS1 signatures, which are deterministic. We're
hoping to add Ed25519, which is deterministic, but not verifiably so by a
third party.
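[''A last illustrative sketch of the "signatures as values" idea: hash the
authorities' deterministic signatures over the same consensus together. This
relies on the signature scheme being unique per (key, message) pair, as Ian
notes; it is not actual Tor code:'']

    import hashlib

    def shared_rand_from_signatures(consensus_signatures):
        # consensus_signatures: authority identity -> signature bytes over
        # the same consensus document. With a unique signature scheme, each
        # input is fixed once the consensus text is fixed, so a signer
        # can't grind the output; it can at most withhold its signature.
        h = hashlib.sha256()
        for identity in sorted(consensus_signatures):
            h.update(consensus_signatures[identity])
        return h.digest()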
(To be continued...)
--
Ticket URL: <https://trac.torproject.org/projects/tor/ticket/8244#comment:8>
Tor Bug Tracker & Wiki <https://trac.torproject.org/>
The Tor Project: anonymity online
_______________________________________________
tor-bugs mailing list
tor-bugs@xxxxxxxxxxxxxxxxxxxx
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-bugs