You know, all you people who keep asking the same questions over and
over again back-to-back in new threads for days on end could try
Googling first. It might be just a tad quicker.
See:
https://www.torproject.org/projects/torbrowser/design/#fingerprinting-linkability
(which is result #3 for "tor browser fingerprinting" on startpage.com's
Google results).
tl;dr: We prevent read access to the HTML5 Canvas (which doubles as the
WebGL rendering surface, among other things) to prevent video card,
font, and other rendering differences from being extracted, hashed, and
fingerprinted. If you go to certain obnoxious websites (such as
https://github.com), you can see this defense in action.
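To make the attack concrete, here's a minimal sketch (in Python, with made-up byte strings standing in for real rendered pixel buffers) of why canvas read access matters: the same drawing commands produce slightly different pixels on different GPU/font/driver stacks, and hashing those pixels yields a stable per-machine identifier. Blocking the read means the tracker never gets the bytes to hash.

```python
# Sketch of canvas fingerprinting: hash the rendered pixels to get an ID.
# The byte strings are hypothetical stand-ins for real pixel buffers.
import hashlib

def canvas_fingerprint(pixels: bytes) -> str:
    """Hash a rendered pixel buffer, as a tracking script would."""
    return hashlib.sha256(pixels).hexdigest()[:16]

# Two machines rendering identical text with different drivers/fonts
# produce subtly different antialiased pixels:
machine_a = b"...glyph bytes as rendered by driver A..."
machine_b = b"...glyph bytes as rendered by driver B..."

# Distinct pixel buffers -> distinct, stable identifiers:
print(canvas_fingerprint(machine_a) == canvas_fingerprint(machine_b))  # False
```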
We also run WebGL in "minimal mode", which disables video card- and
driver-specific extensions, so that this information is not available
to JS.
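The "minimal mode" idea can be sketched like this (Python pseudocode; the dictionary values are hypothetical examples, not Tor Browser's actual tables): instead of exposing the real renderer/vendor strings and extension list to JS, every user reports the same generic values.

```python
# Sketch of "minimal mode": report uniform, generic GL info to scripts
# rather than the machine's real driver details. Values are illustrative.

REAL_GL_INFO = {
    "RENDERER": "NVIDIA GeForce GTX 660/PCIe/SSE2",
    "VENDOR": "NVIDIA Corporation",
    "extensions": ["GL_NV_depth_clamp", "GL_EXT_texture_filter_anisotropic"],
}

MINIMAL_GL_INFO = {
    "RENDERER": "Mozilla",   # generic value shared by all users
    "VENDOR": "Mozilla",
    "extensions": [],        # vendor-specific extensions hidden
}

def get_parameter(name: str, minimal_mode: bool = True) -> object:
    """Return what a JS getParameter()-style query would see."""
    info = MINIMAL_GL_INFO if minimal_mode else REAL_GL_INFO
    return info[name]

print(get_parameter("RENDERER"))  # Mozilla
```

With every user answering "Mozilla" and an empty extension list, the query carries no distinguishing bits.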
Still, WebGL is a huge beast with an unknown and previously unexposed
vulnerability surface, which is why we leave it click-to-play via
NoScript.
Thus spake Andrew F (andrewfriedman101@xxxxxxxxx):
I don't believe that Torbutton changes any of the variables that are
linked to the hardware. And that is the key.
What is the point of Tor if fingerprinting works?
On Thu, May 9, 2013 at 4:08 PM, SiNA Rabbani <sina@xxxxxxxxxx> wrote:
Torbutton provides certain protections already. That's why it's important
to use Tor properly. The Tor Browser Bundle is shipped with Torbutton
installed:
https://www.torproject.org/torbutton/
--SiNA
On May 9, 2013 8:56 AM, "Andrew F" <andrewfriedman101@xxxxxxxxx> wrote:
Someone in tor-dev said that fingerprinting of the system, and the video
card in particular, allows someone to be tracked just as if they had a
cookie on their system.
That sounds pretty serious to me. Is anyone working on this issue?
Do we have any projects on obfuscating fingerprint data?
Seems like it should be a top priority.
_______________________________________________
tor-talk mailing list
tor-talk@xxxxxxxxxxxxxxxxxxxx
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-talk