Re: [tor-bugs] #23126 [Core Tor/Tor]: HSDirs should publish some count about new-style onion addresses
#23126: HSDirs should publish some count about new-style onion addresses
-------------------------------------------------+-------------------------
Reporter: arma | Owner: (none)
Type: enhancement | Status: new
Priority: Medium | Milestone: Tor:
| 0.3.2.x-final
Component: Core Tor/Tor | Version:
Severity: Normal | Resolution:
Keywords: prop224, tor-hs, prop224-extra, | Actual Points:
research, privcount |
Parent ID: | Points:
Reviewer: | Sponsor:
-------------------------------------------------+-------------------------
Changes (by teor):
* keywords: prop224, tor-hs, prop224-extra, research => prop224, tor-hs,
prop224-extra, research, privcount
Comment:
Replying to [comment:3 asn]:
> So a very very basic statistic here that would give us an idea of the
> adoption of HSv3 services could be:
>
> a) When a time period completes, every relay '''publishes the number of
> HSv3 blinded keys''' it saw during the previous time period in its
> extra-info desc. Relays also add some Laplace noise to obfuscate the
> original number.
There are several bugs in the HS v2 Laplace noise implementation; see
#23061 and its children. In particular, we need to make sure we don't
re-implement bug #23414 for HS v3.
> Time periods start at 12:00 UTC and last 24 hours, so relays can publish
> this statistic once per day.
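As a rough illustration of step (a), here is how a relay might obfuscate
its blinded-key count with Laplace noise. This is only a sketch: the
epsilon and sensitivity parameters are illustrative, not Tor's actual
values, and Tor's real statistics code also rounds counts into bins before
adding noise (which is where the #23061 family of bugs lives).

```python
import math
import random

def laplace_noise(scale, rng):
    # Inverse-CDF sampling: for u uniform in [-0.5, 0.5),
    # x = -scale * sign(u) * ln(1 - 2|u|) is Laplace(0, scale).
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def obfuscated_count(true_count, epsilon=0.3, sensitivity=1.0, rng=random):
    # Differential-privacy-style obfuscation: the noise scale is
    # sensitivity / epsilon (both defaults here are made up, not Tor's).
    return true_count + laplace_noise(sensitivity / epsilon, rng)
```

A relay would publish something like obfuscated_count(n_blinded_keys) in
its extra-info descriptor: because the noise is zero-mean, any single
noisy value hides the exact count, while averages over many reports stay
close to the truth.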
>
> b) After we have received all descriptors containing stats from a
> specific time period, we add all the ''unique blinded key counts''
> together, and publish the aggregate count. We add everything together to
> remove the Laplace noise, and also to get a final graphable number.
> Unfortunately, that final number is not the number of unique HSv3
> services, since HSes publish their descriptor on multiple HSDirs under
> the same blinded key. However, this number is definitely related to the
> number of unique HSes, and by noticing how this number moves over time,
> we can certainly spot adoption rates of HSv3 services.
>
> This is a very basic stat that could help us here. Furthermore, we can
> then deploy similar analysis to what we did for the unique v2 .onion
> counts, to weed out the duplicate HSes so that we get a more accurate
> number.
I think there are some bugs in the v2 analysis; see #23367.
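The aggregation in (b) works because the per-relay noise is zero-mean:
summed over n relays, the aggregate noise grows only like sqrt(n) times
the noise scale, while the signal grows linearly in n. A quick sketch with
entirely made-up per-HSDir counts and noise scale:

```python
import math
import random

random.seed(1234)  # fixed seed so the illustration is reproducible

def laplace_noise(scale):
    # Laplace(0, scale) via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# Hypothetical per-HSDir blinded-key counts (illustrative numbers only).
true_counts = [random.randint(50, 500) for _ in range(3000)]
noisy_counts = [c + laplace_noise(10.0) for c in true_counts]

total_true = sum(true_counts)
total_noisy = sum(noisy_counts)
# n independent zero-mean noise terms: aggregate noise stddev is
# sqrt(n) * scale * sqrt(2), which is tiny relative to the n-term total.
relative_error = abs(total_noisy - total_true) / total_true
```

Note that, as the quoted comment says, even the exact aggregate would
still over-count unique services, because each service uploads the same
blinded key to several HSDirs.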
For v3, here's the analysis and implementation I did for experimental
privcount:
https://github.com/privcount/privcount/blob/master/privcount/tools/compute_fractional_position_weights#L26
(I left out the HS v3 hash ring, because it needed extra crypto, and
imports of ed25519 ids in descriptors. I'll implement it in
https://github.com/privcount/privcount/issues/422 )
> And I guess we can use privcount etc. to get an even more accurate
> number.
No, you can't use PrivCount to get unique totals. (Aaron is working on a
separate project that uniquely aggregates addresses, but the current
design takes too much CPU and RAM to be run daily on relays.)
But you can use PrivCount to get a safe, noisy total from the individual
relay counts. (Otherwise, to get a total, you have to publish the number
of addresses seen at each HSDir, which is less safe.)
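The "safe, noisy total" idea can be illustrated with the additive secret
sharing that PrivCount-style aggregation builds on: each relay splits its
(already noise-protected) count into random shares sent to different tally
servers, so no single server ever sees an individual relay's value. This
is only a sketch of the general technique, not PrivCount's actual
protocol; all names and numbers below are made up.

```python
import random

PRIME = 2**61 - 1  # modulus for additive secret sharing (illustrative)

def share_count(count, n_shares, rng):
    # Split `count` into n additive shares modulo PRIME; any subset of
    # fewer than n shares is uniformly random and reveals nothing.
    shares = [rng.randrange(PRIME) for _ in range(n_shares - 1)]
    shares.append((count - sum(shares)) % PRIME)
    return shares

rng = random.Random(7)
relay_counts = [120, 45, 300]  # hypothetical noisy per-relay counts

# Each relay sends one share to each of three tally servers:
per_server = [[], [], []]
for c in relay_counts:
    for server, share in zip(per_server, share_count(c, 3, rng)):
        server.append(share)

# Each server publishes only the sum of its own shares; adding the three
# server sums modulo PRIME recovers the aggregate, never any one relay's
# individual count.
aggregate = sum(sum(s) % PRIME for s in per_server) % PRIME
```

In the real system the relays would also add distributed noise before
sharing, so even the exact aggregate is never revealed.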
--
Ticket URL: <https://trac.torproject.org/projects/tor/ticket/23126#comment:4>
Tor Bug Tracker & Wiki <https://trac.torproject.org/>
The Tor Project: anonymity online
_______________________________________________
tor-bugs mailing list
tor-bugs@xxxxxxxxxxxxxxxxxxxx
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-bugs