
[freehaven-cvs] try to integrate the distributed trust stuff better



Update of /home/freehaven/cvsroot/doc/wupss04
In directory moria.mit.edu:/home2/arma/work/freehaven/doc/wupss04

Modified Files:
	usability.pdf usability.tex 
Log Message:
try to integrate the distributed trust stuff better


Index: usability.pdf
===================================================================
RCS file: /home/freehaven/cvsroot/doc/wupss04/usability.pdf,v
retrieving revision 1.1
retrieving revision 1.2
diff -u -d -r1.1 -r1.2
Binary files /tmp/cvscQRzHe and /tmp/cvsQDFJlq differ

Index: usability.tex
===================================================================
RCS file: /home/freehaven/cvsroot/doc/wupss04/usability.tex,v
retrieving revision 1.19
retrieving revision 1.20
diff -u -d -r1.19 -r1.20
--- usability.tex	1 Jan 2005 21:48:54 -0000	1.19
+++ usability.tex	2 Jan 2005 01:18:28 -0000	1.20
@@ -25,9 +25,9 @@
 example, fetch a web page or send an email) without revealing their
 communication partners.
 
-In this chapter we're going to focus on the \emph{network effects} of
-usability on privacy and security: usability is a factor as before, but the
-size of the user
+In this chapter we're going to focus on the \emph{network effects}
+of usability on privacy and security: usability is a factor as before,
+but the size of the user
 base also becomes a factor.  Further, in anonymizing systems, even if you
 were smart enough and had enough time to use every system
 perfectly, you would nevertheless be right to choose your system
@@ -109,6 +109,7 @@
 %  - Confusion about what's really happening.
 %  - Too easy to social-engineer users into abandoning.
 
+%\section{Security by distributed trust}
 \section{Usability is even more a security parameter for privacy}
 
 Usability affects security in systems that aim to protect data
@@ -119,8 +120,8 @@
 communicating with whom, which users are using which websites, and so on.
 These systems have a broad range of users, including ordinary citizens
 who want to avoid being profiled for targeted advertisements, corporations
-who don't want to reveal information to
-their competitors, and government intelligence agencies who need
+who don't want to reveal information to their competitors, and law
+enforcement and government intelligence agencies who need
 to do operations on the Internet without being noticed.
 
 Anonymity networks work by hiding users among users.  An eavesdropper might
@@ -141,28 +142,70 @@
   catch-phrase was first made popular in our context by the authors of the
   Crowds~\cite{crowds:tissec} anonymity network.}
 
-There is a catch, however.  For users to keep the same anonymity set, they
-need to act like each other.  If Alice's client acts completely unlike Bob's
-client, or if Alice's messages leave the system acting completely unlike
-Bob's, the attacker can use this information.  In the worst case, Alice's
-messages stand out entering and leaving the network, and the attacker
-can treat Alice and those like her as if they were on a separate network
-of their own.  But even if Alice's messages are only recognizable as
-they leave the network, an attacker can use this information to break
-exiting messages into ``messages from User1,'' ``messages from User2,''
-and so on, and can now
-get away with linking messages to their senders as groups, rather than trying
-to guess from individual messages.  Some of this {\it partitioning} is
-inevitable: if Alice speaks Arabic and Bob speaks Bulgarian, we can't force
-them both to learn English in order to mask each other.
+In a data confidentiality system like PGP, Alice and Bob can decide by
+themselves that they want security. As long as they both use the
+software properly, no third party can intercept the traffic and break
+their encryption. However, in the case of an anonymity system, Alice
+and Bob can't get anonymity by themselves: they need to participate in
+an infrastructure that coordinates users to provide cover for each other.
 
-What does this imply for usability?  More so than with encryption systems,
-users of anonymizing
-networks may need to choose their systems based on how usable others will
-find them, in order to get the protection of a larger anonymity set.
+No organization can build this infrastructure for itself. If a single
+corporation or government agency were to build a private network to
+protect its operations, any connections entering or leaving that network
+would be obviously linkable to the controlling organization. The members
+and operations of that organization would be easier, not harder, to distinguish.
 
+Thus, to provide anonymity to any of its users, the network must
+accept traffic from external users, so the various user groups can
+blend together.
 
-\section{Case study: Usability means users, users mean security}
+In practice, existing commercial anonymity solutions (like Anonymizer.com)
+are based on a set of single-hop proxies. In these systems, each user
+connects to a single proxy, which then relays the user's traffic. This
+provides only weak security, since a compromised proxy can trivially
+observe all of its users' actions, and an eavesdropper only needs to
+watch a single proxy to perform timing correlation attacks against all
+its users' traffic. Worse, all users must trust the proxy company both
+to keep its own systems secure and not to reveal their activities.
+
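To make the single-hop weakness concrete, here is a minimal illustrative
sketch in Python (not from the paper; the function name and the 50 ms
window are arbitrary assumptions). An observer who records packet
timestamps on both sides of the one proxy can match an entering flow to
an exiting flow simply by checking how often packets line up in time.

def correlation_score(times_in, times_out, window=0.05):
    """Fraction of entering packets followed by an exiting packet
    within `window` seconds (a crude timing-correlation score)."""
    matched = sum(1 for t in times_in
                  if any(0 <= u - t <= window for u in times_out))
    return matched / len(times_in)

# Example: packets enter at t and leave about 20 ms later via the proxy.
enter = [0.00, 0.31, 0.97, 1.40]
leave = [0.02, 0.33, 0.99, 1.42]
print(correlation_score(enter, leave))  # 1.0: the flows almost surely match

Flow pairs whose score is near 1.0 almost certainly belong to the same
user, so watching that single proxy links every user to their destination.
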
+The solution is distributed trust: an infrastructure made up of many
+independently controlled proxies that work together to make sure no
+transaction's privacy relies on any single proxy. With distributed-trust
+anonymity networks like the ones discussed in this chapter, users build
+tunnels or \emph{circuits} through a series of servers. They wrap their
+traffic in multiple layers of encryption, and each server removes a
+single layer.  No single server knows the entire path from the
+user to the user's chosen destination.  Therefore an attacker can't break
+the user's anonymity by compromising or eavesdropping on any one server.
+
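As a rough illustration of the layering just described (a toy sketch,
not Tor's or Mixminion's actual protocol, and it omits the per-hop
routing information a real circuit carries), the following Python
assumes the third-party cryptography package: the client encrypts the
payload once per relay, and each relay can strip exactly one layer.

from cryptography.fernet import Fernet  # third-party 'cryptography' package

relays = [Fernet(Fernet.generate_key()) for _ in range(3)]  # three hops

def build_onion(payload, relays):
    # Encrypt for the last hop first, so the first hop's layer is outermost.
    for relay in reversed(relays):
        payload = relay.encrypt(payload)
    return payload

onion = build_onion(b"GET http://example.com/", relays)
for relay in relays:              # each hop removes exactly one layer
    onion = relay.decrypt(onion)
print(onion)  # only past the final hop is the cleartext request visible

Here all three keys sit in one process for brevity; in a deployed
network each relay holds only its own key, which is what keeps any
single relay from linking the sender to the destination.
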
+Despite their increased security, distributed-trust anonymity networks have
+their disadvantages.  Because traffic needs to be relayed through multiple
+servers, performance is often (but not always) worse.  Also, the software
+for a distributed-trust anonymity network is significantly more difficult
+to design and implement.
+
+Beyond these issues of the architecture and ownership of the network,
+however, there is another catch.  For users to keep the same anonymity
+set, they need to act like each other.  If Alice's client acts completely
+unlike Bob's client, or if Alice's messages leave the system acting
+completely unlike Bob's, the attacker can use this information.  In the
+worst case, Alice's messages stand out entering and leaving the network,
+and the attacker can treat Alice and those like her as if they were
+on a separate network of their own.  But even if Alice's messages are
+only recognizable as they leave the network, an attacker can use this
+information to break exiting messages into ``messages from User1,''
+``messages from User2,'' and so on, and can now get away with linking
+messages to their senders as groups, rather than trying to guess from
+individual messages.  Some of this {\it partitioning} is inevitable:
+if Alice speaks Arabic and Bob speaks Bulgarian, we can't force them
+both to learn English in order to mask each other.
+
+What does this imply for usability?  More so than with encryption systems,
+users of anonymizing networks may need to choose their systems based on
+how usable others will find them, in order to get the protection of a
+larger anonymity set.
+
+\section{Case study: usability means users, users mean security}
 
 We'll consider an example.  Practical anonymizing networks fall into two broad
 classes. {\it High-latency} networks like Mixminion or Mixmaster can resist
@@ -212,35 +255,6 @@
 what their expected attacker can do, the researchers still don't know
 what parameter values to recommend.
 
-\section{Case study: security versus simplicity}
-
-% This prose is kinda ugly.
-Readers familiar with existing commercial anonymity solutions may be
-surprised by the discussion of anonymity networks above, since most of the
-market (with the exception of Zero Knowledge's Systems' defunct Freedom
-offering~\cite{freedom21-security}) have been based on a set of single-hop
-proxies.  In
-these systems, a user connects to a single proxy, which then relays the
-user's traffic.  This has negative security implications, in that a single
-compromised proxy can trivially observe all of its users' actions; and in
-that an eavesdropper only needs to watch a single proxy to perform timing
-correlation attacks against all its users' traffic.
-
-With distributed-trust anonymity networks like Tor, JAP, Mixminion,
-Mixmaster, however, users direct their traffic through a series of servers,
-each of which removes a single layer of encryption, and none of which knows
-the entire path from the user to the user's chosen destination.  Because of
-this, an attacker can't break the user's anonymity by compromising or
-eavesdropping on only a single server.
-
-Despite their increased security, distributed-trust anonymity networks have
-their disadvantages.  Because traffic needs to be relayed through multiple
-servers, performance is often (but not always) worse.  Also, the software to
-implement a distributed-trust anonymity network is significantly more
-difficult to design and implement.
-
-% XXXX arma -- can you add some analysis here?
-
 \section{Case study: against options}
 
 Too often, designers faced with a security decision bow out, and instead
