Re: [tor-bugs] #6180 [Ooni]: Detecting censorship in HTTP pages
#6180: Detecting censorship in HTTP pages
---------------------+------------------------------------------------------
Reporter: hellais | Owner: hellais
Type: task | Status: needs_review
Priority: normal | Milestone: Sponsor H: June 2012
Component: Ooni | Version:
Keywords: | Parent:
Points: | Actualpoints:
---------------------+------------------------------------------------------
Changes (by isis):
* status: new => needs_review
Comment:
If we are packaging Tor, then could we simply tell the OONI copy of Tor to
allow .exit notation (AllowDotExit) and always direct it at the same
reliable exit node for a given portion of the test? Or specify only
reliable exit nodes in a given region? This obviously increases
fingerprintability, and potentially even the attack surface.
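A minimal sketch of what that torrc for the bundled Tor might look like
(the option names are from tor's manual; the fingerprints are
placeholders, not real relays):

```
# Hypothetical torrc fragment for an OONI-bundled Tor.
# Permit "hostname.exitnickname.exit" addressing:
AllowDotExit 1
# Or instead pin circuits to a fixed set of reliable exits:
ExitNodes $AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
StrictNodes 1
```

With StrictNodes 1, Tor fails closed rather than falling back to other
exits, which is probably what we'd want for a reproducible measurement.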
Another workaround for the geolocation problem might be limiting the test
to requesting only non-geolocalized pages, i.e. not google.com or
facebook.com or whatever. There are plenty of vanilla static webpages out
there; we'd just have to be sure that as OONI grows we don't hammer them,
and also not pick ones so obscure that we'd essentially be telling those
servers which users run OONI tests.
As for your last suggestion, we could use the HTTP fuzzy-match algorithm
from the captive portal test. It has not yet been ported to the new
Twisted-based framework, but its old version is in the captive portal
branch of my OONI repo on github. That algorithm could definitely use
some smarts, however.
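For the sake of discussion, the fuzzy match could be as simple as a
similarity ratio over the two HTTP bodies; this is a sketch, not the
actual captive portal code, and the helper name and threshold are made up:

```python
# Sketch of HTTP-body fuzzy matching along the lines of the captive
# portal test's approach. bodies_match() and the 0.8 threshold are
# assumptions for illustration, not OONI's real API.
from difflib import SequenceMatcher

def bodies_match(expected, received, threshold=0.8):
    """Return True if the received HTTP body is 'similar enough' to
    the expected body to be considered uncensored."""
    ratio = SequenceMatcher(None, expected, received).ratio()
    return ratio >= threshold
```

A blockpage substituted for a known static page should score well below
the threshold, while minor dynamic content (timestamps, counters) should
still pass; tuning that threshold is where the "smarts" would go.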
--
Ticket URL: <https://trac.torproject.org/projects/tor/ticket/6180#comment:1>
Tor Bug Tracker & Wiki <https://trac.torproject.org/>
The Tor Project: anonymity online
_______________________________________________
tor-bugs mailing list
tor-bugs@xxxxxxxxxxxxxxxxxxxx
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-bugs