
Re: Threats to anonymity set at and above the application layer; HTTP headers



Thus spake Anthony Georgeo (anogeorgeo@xxxxxxxxx):

> IMO a needed and important feature of any
> 'filtering/scrubbing' proxy application is some sort of
> 'on-the-fly' decryption>scrubbing>encryption scheme
> for ingress/egress HTTPS traffic.
> 
> [needlessly complicated stuff removed]

K.I.S.S.

This has all sorts of issues with certificate verification and so on:
a decrypt-scrub-re-encrypt proxy has to terminate the TLS connection
itself, which means the browser ends up trusting the proxy's
certificate rather than verifying the server's. Not to mention that I
think any sort of user-configurable scrubber is not going to be used
effectively by more than 1% of the population (if even that). Hell, I
don't understand privoxy's configuration well enough to feel safe
relying on it by itself, and I'm a programmer.

The only way to do this is via extensions to the browser. That way
you do not interfere with CRL/OCSP for true cert verification (which
sadly seems very broken in Firefox currently), and it is easy to
switch components on and off when something doesn't work because one
of your filters isn't quite right. And you get SSL for free, because
your extensions see the web data AFTER the browser has performed its
(optional) rigorous checks to make sure the cert has not been revoked
or otherwise compromised/spoofed.
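
To make that concrete, here is a rough sketch of the kind of hook an
extension gets. This is TypeScript against Mozilla's WebExtension
webRequest API, which I'm using purely for illustration; the listener
and extraInfoSpec names are real, but the scrub list is made up:

// Minimal sketch: an extension-level response hook. By the time this
// listener fires, the browser has already completed the TLS handshake
// and certificate validation, so we see plaintext headers without
// ever touching the crypto. "browser" is provided by the extension
// runtime; declared here only so the sketch compiles standalone.
// (Manifest would need "webRequest" and "webRequestBlocking".)
declare const browser: any;

// Hypothetical list of response headers we might want to scrub.
const UNWANTED = new Set(["x-powered-by", "server"]);

browser.webRequest.onHeadersReceived.addListener(
  (details: { responseHeaders?: { name: string; value?: string }[] }) => {
    const kept = (details.responseHeaders ?? []).filter(
      (h) => !UNWANTED.has(h.name.toLowerCase())
    );
    // Returning modified headers requires the "blocking" spec below.
    return { responseHeaders: kept };
  },
  { urls: ["<all_urls>"] },
  ["blocking", "responseHeaders"]
);

The nice part is that the crypto never enters the picture: if the cert
check fails, the connection is torn down and the listener simply never
sees a response to scrub.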

I really think we desperately need an intelligent proxy selection
mechanism such as Eric Jung's FoxyProxy (so long as it properly
isolates cookies for each proxy and does the proxy filtering on a
per-tab basis, as discussed previously). Combine this with NoScript,
Adblock, and a user agent switcher, and I really don't see any reason
for privoxy anymore (except maybe to remove a stray HTTP header here
and there, but since those aren't logged, that may not be needed).
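
For the per-tab proxy selection piece, the shape of it could be
something like the sketch below (TypeScript again, against the
WebExtension proxy API; this is one way to express the idea, not how
FoxyProxy actually implements it, and the tab-to-proxy table is
invented for illustration):

// Sketch of per-tab proxy selection. Each tab can be pinned to its
// own SOCKS proxy, so traffic from different tabs never shares an
// upstream. Requires the "proxy" permission; "browser" comes from
// the extension runtime.
declare const browser: any;

// Hypothetical tab -> proxy table, filled in by whatever UI the
// extension exposes.
const tabProxy = new Map<number, { host: string; port: number }>();
tabProxy.set(1, { host: "127.0.0.1", port: 9050 }); // e.g. Tor

browser.proxy.onRequest.addListener(
  (details: { tabId: number }) => {
    const p = tabProxy.get(details.tabId);
    if (p) {
      // proxyDNS keeps name resolution on the proxy side, which
      // matters for anonymity (no local DNS leaks).
      return { type: "socks", host: p.host, port: p.port, proxyDNS: true };
    }
    return { type: "direct" }; // unpinned tabs go out directly
  },
  { urls: ["<all_urls>"] }
);

Per-proxy cookie isolation is the harder half of the requirement and
isn't shown here.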

It sucks that we lose browser independence with this mechanism, but
them's the breaks. They should all be compatible with XPI anyways ;)


-- 
Mike Perry
Mad Computer Scientist
fscked.org evil labs