
Re: [tor-talk] trusting .onion services



On Sun, Feb 28, 2016 at 10:53:13PM +0100, Guido Witmond wrote:
> On 01/16/16 22:22, Rejo Zenger wrote:
> > Hi!
> > 
> > I'm wondering... 
> > 
> >  - How can a user reliably determine some .onion address actually
> >    belongs to intended owner?
> 
> Hi Rejo,
> 
> I think that in general, .onion addresses are unauthenticated. That is,
> there is no way of determining who an address belongs to.
> 
> All we know of a .onion address is that it's tied to whoever holds the
> private key. And given the risk of disclosure of the private key, all
> bets are off. This is also true of GPG, where an adversary can create a
> clone key bearing my name but containing their key material. Erinn
> Clark of the Tor Project has been a victim of such an attack.

But the whole point of GPG is that there is a web of trust. Yes, anyone
can sign something and say that they're you. But only people who have
met you face-to-face and confirmed your key, e.g., using the
Zimmermann-Sassaman key-signing protocol, should be signing your key.
Someone can then trust that your key is bound to you to the extent that
they trust the keys of the people who vouch for you.

This is why we suggest this approach for an at-the-moment solution
https://github.com/saint/w2sp-2015/blob/master/SP_SPSI-2015-09-0170.R1_Syverson.pdf

(Note: the final edited version, coming out soon in IEEE Security &
Privacy, is a little different, but the content is basically the same.)

Besides the PGP approach we present, we also give an X.509-style
solution that requires some policy changes but will work with the
usual browser certificate semantics for TLS.
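
To make the X.509 idea concrete: the Facebook-style certificate lists
both the regular hostname and the .onion hostname as Subject Alternative
Names, so the usual browser TLS checks cover the onion address too. A
minimal sketch (all names here are placeholders, and a real deployment
would use a CA-issued certificate, not a self-signed one; `-addext`
requires OpenSSL >= 1.1.1):

```shell
# Issue a self-signed cert whose SubjectAltName binds a regular
# hostname and a (made-up) .onion hostname together.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout key.pem -out cert.pem -subj "/CN=example.com" \
  -addext "subjectAltName=DNS:example.com,DNS:exampleabcdefghi.onion"

# Inspect the SAN entries a browser would check:
openssl x509 -in cert.pem -noout -ext subjectAltName
```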

For another example of the PGP signature approach not mentioned in
the paper see
https://blog.patternsinthevoid.net/isis.txt

aloha,
Paul

> 
> In my pet project I'm using Hidden Services as a means for people to
> connect to each other. One person opens a hidden service and sends the
> onion address to another in an encrypted message. The other connects
> to the service. Then BOTH people authenticate to each other with their
> already-exchanged keys before their software lets the data flow
> commence. Knowledge of the onion address, or even having a copy of the
> private key, won't get the connection started.
> 
> In short, I built an authentication layer on top of hidden services.
> 
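The layer Guido describes could look something like a challenge-response
over the already-exchanged keys. This is a hypothetical sketch, not his
actual protocol, using an HMAC over a pre-shared key so that neither
side reveals the key itself:

```python
import hashlib
import hmac
import os

def respond(key: bytes, challenge: bytes) -> bytes:
    # Prove knowledge of the pre-shared key without revealing it.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mutual_auth(key_alice: bytes, key_bob: bytes) -> bool:
    """Each side challenges the other; data flow starts only if BOTH
    responses verify, so knowing the onion address alone (or even the
    hidden service's private key) is not enough to get a connection."""
    chal_a, chal_b = os.urandom(32), os.urandom(32)   # fresh nonces
    resp_from_bob = respond(key_bob, chal_a)          # Bob answers Alice
    resp_from_alice = respond(key_alice, chal_b)      # Alice answers Bob
    return (hmac.compare_digest(resp_from_bob, respond(key_alice, chal_a))
            and hmac.compare_digest(resp_from_alice, respond(key_bob, chal_b)))
```

A connection succeeds only when both sides hold the same pre-shared
secret; a party with the wrong key fails either verification.
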
> That authentication layer uses PKI certificates and such to distribute
> public keys to each other. And ultimately, the same issue reappears:
> whom am I talking to? And with the risk of disclosure of the private
> key, all bets are off.
> 
> I believe this to be a fundamental property of cryptography. The eternal
> uncertainty of the identity of the other party. The more anonymous the
> key exchange the higher the uncertainty. In other words: the higher the
> need for secrecy and anonymity, the greater the uncertainty.
> 
> The answer you are looking for is to determine how much of a risk
> there is with plain onion addresses, or what extra authentication and
> repudiation you need to build on top. And how much deanonymisation you
> are willing to accept.
> 
> I believe it's ultimately a design trade off.
> 
> 
> With regards, Guido Witmond.
> 
> 
> >  - How is the provider of a .onion service supposed to deal with a lost
> >    or compromised private key, especially from the point of view of the
> >    user of this service? How does the user know a .onion address has
> >    had its key revoked?
> > 
> > Let me explain...
> > 
> > 
> > One of the advantages of using a .onion address to identify the service
> > you are connecting to is that you don't have to rely on a third party
> > as you would in a system with Certificate Authorities. By relying on
> > the certificate signed by a trusted CA, the user can be sure the site he
> > is connecting to actually belongs to a particular entity. With a
> > .onion address that is no longer needed, since those addresses are
> > self-authenticating. Sounds good.
> > 
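For concreteness: "self-authenticating" means the hostname itself
commits to the service's public key, so no third party has to vouch for
the binding. A rough sketch of the v2 derivation in use at the time of
this thread (base32 of the first 80 bits of the SHA-1 of the
DER-encoded RSA public key); the key bytes below are a stand-in, not a
real key:

```python
import base64
import hashlib

def v2_onion_hostname(der_public_key: bytes) -> str:
    # v2 scheme: base32-encode the first 80 bits (10 bytes) of the
    # SHA-1 digest of the DER-encoded RSA public key, lowercased.
    digest = hashlib.sha1(der_public_key).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"

# Stand-in bytes; a real service uses its actual RSA public key:
print(v2_onion_hostname(b"not a real DER key"))
```

Anyone holding the matching private key can prove they own the address,
which is exactly why key loss or theft is so consequential below.
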
> > Now, the problem I have is that the user doesn't have a reliable way to
> > determine whether a given address actually belongs to the site he wants
> > to visit. As far as I can tell, Facebook has two solutions to this: it
> > mentions the correct address in presentations, blogs and press coverage
> > wherever it can, and its TLS certificate mentions both the .onion
> > address and its regular address (as Subject Alt Names).
> > 
> > So, the first solution can't be used by everyone: not everyone has that
> > much coverage. The second solution is nice, but falls back on the CA
> > system. Ironic, isn't it? [1]
> > 
> > Or, to rephrase it: how can a user reliably determine the .onion address
> > for a given entity without relying on the flawed CA system and without
> > the entity having a lot of visibility?
> > 
> > 
> > Given the fact that the hostname is a derivative of the private key
> > used to authenticate the connection to that hostname, there is a bigger
> > issue when the private key is stolen or lost (or in any other case
> > where the key needs to be replaced).
> > 
> > When the key is lost (yes, it shouldn't happen, but shit happens), the
> > hostname changes. There is no reliable way for a user to learn what the
> > new key, and therefore the hostname, is.
> > 
> > When the key is stolen (or compromised in any other way), the key
> > should be replaced. This may be even more problematic than the case
> > where the key is lost, which would merely render the site unreachable.
> > When the key is stolen, the key may be used by a perpetrator. The
> > problem: there is no way to tell the world that a particular key is
> > compromised. [2] The administrator is able to make the site accessible
> > via a new key and new hostname, but the attacker may keep running a
> > modified copy of the site using the stolen key.
> > 
> > 
> > 
> > [1] Ironic, as Roger's blog on this topic makes clear there are all
> > kinds of reasons why we do not want to reinforce this system: partly
> > because it is flawed, partly because it costs money, partly because it
> > undoes the anonymity that some hidden sites need, partly because...
> > 
> > https://blog.torproject.org/blog/facebook-hidden-services-and-https-certs
> > 
> > [2] OK, not entirely true, maybe. It may be possible to include those
> > keys in some listing of the directory authorities marking them as bad
> > nodes. But this is a manual process.
> > 




-- 
tor-talk mailing list - tor-talk@xxxxxxxxxxxxxxxxxxxx
To unsubscribe or change other settings go to
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-talk