
[freehaven-cvs] minor touchups, plus point out a paragraph that's wr...



Update of /home2/freehaven/cvsroot/doc/wupss04
In directory moria.mit.edu:/tmp/cvs-serv15793

Modified Files:
	usability.tex 
Log Message:
minor touchups, plus point out a paragraph that's wrong


Index: usability.tex
===================================================================
RCS file: /home2/freehaven/cvsroot/doc/wupss04/usability.tex,v
retrieving revision 1.4
retrieving revision 1.5
diff -u -d -r1.4 -r1.5
--- usability.tex	25 Oct 2004 18:59:23 -0000	1.4
+++ usability.tex	27 Oct 2004 00:40:19 -0000	1.5
@@ -19,10 +19,12 @@
 \maketitle
 \thispagestyle{empty}
 
+This chapter discusses ...
+
 While security software is the product of developers, the operation of
 software is a collaboration between developers and users.  It's not enough
 to develop software that is possible to use securely; software that
-isn't usable often suffers in its security as a result.
+is hard to use often suffers in its security as a result.
 
 For example, suppose there are two popular mail encryption programs:
 HeavyCrypto, which is more secure (when used correctly), and LightCrypto,
@@ -54,12 +56,13 @@
 can't or won't use it correctly, its ideal security properties are
 irrelevant.
 
-* How bad usability can thwart security
+\section{How bad usability can thwart security}
 
-Hard-to-use programs and protocols can hurt security in many ways:
+As we read in chapter [Angela's chapter],
+hard-to-use programs and protocols can hurt security in many ways:
 \begin{tightlist}
 \item Programs with {\it insecure modes of operation} are bound to be used
-  unknowningly in those modes.
+  unknowingly in those modes.
 \item {\it Off switches}, once disabled, are often never re-enabled.
 \item {\it Inconvenient} security is often abandoned in the name of
   day-to-day efficiency: people often write down difficult passwords to keep
@@ -80,13 +83,10 @@
     on.  Unless your computer has been compromised already.''}
 \end{tightlist}
 
-[[Brainstorm up a big list.  More possibilities include:
-  - Confusion about what's really happening.
-  - Too easy to social-engineer users into abandoning.
-  - XXXX
-]]
+%  - Confusion about what's really happening.
+%  - Too easy to social-engineer users into abandoning.
 
-* Usability is even more a security parameter when it comes to privacy
+\section{Usability is even more a security parameter when it comes to privacy}
 
 Usability is an important parameter in systems that aim to protect data
 confidentiality.  But when the goal is {\it privacy}, it can become even
@@ -131,23 +131,23 @@
 networks may need to choose their systems based on how usable others will
 find them, in order to get the protection of a larger anonymity set.
 
-* Case study: Usability means users, users mean security.
+\section{Case study: Usability means users, users mean security}
 
 We'll consider an example.  Practical anonymity networks fall into two broad
-classes. {\it High-latency} networks like Mixminion or Mixmaster can resist very
+classes. {\it High-latency} networks like Mixminion or Mixmaster can resist
 strong attackers who can watch the whole network and control a large part of
 the network infrastructure.  To prevent this ``global attacker'' from linking
 senders to recipients by correlating when messages enter and leave the
 system, high-latency networks introduce large delays into message delivery
 times, and are thus only suitable for applications like email and bulk data
 delivery---most users aren't willing to wait half an hour for their web pages
-to load.  {\it Low-latency} networks like Tor or XXX, on the other hand, are
+to load.  {\it Low-latency} networks like Tor, on the other hand, are
 fast enough for web browsing, secure shell, and other interactive
 applications, but have a weaker threat model: an attacker who watches or
 controls both ends of a communication can trivially correlate message timing
 and link the communicating parties.
 
-Clearly, uses who need to resist strong adversaries need to choose
+Clearly, users who need to resist strong adversaries need to choose
 high-latency networks or nothing at all, and users who need to anonymize
 interactive applications need to choose low-latency networks or nothing at
 all.  But what should flexible users choose?  Against an unknown threat
@@ -158,9 +158,9 @@
 we'll prefer the high-latency network, and if the attacker is weak, then the
 extra protection doesn't hurt.
 
-But suppose that, because of the inconvenience of the high-latency network,
-it gets very few actual users---so few, in fact, that its maximum anonymity
-set it too small for our needs.\footnote{This is
+But since many users might find the high-latency network inconvenient,
+suppose that it gets very few actual users---so few, in fact, that its
+maximum anonymity set is too small for our needs.\footnote{This is
   hypothetical, but not wholly unreasonable.  The most popular high-latency
   network, FOO, has approximately BAR users, whereas the most popular
   commercial low-latency anonymity system, BAZ, advertises QUUX users.}
@@ -169,7 +169,7 @@
 low-latency system can give us enough protection against at least {\it some}
 adversaries.
 
-* Case study: against options
+\section{Case study: against options}
 
 Too often, designers faced with a security decision bow out, and instead
 leave the choice as an option: protocol designers leave implementors to
@@ -182,16 +182,28 @@
   them.  If the protocol designer can't decide whether AES is better than
   Twofish, how is the end user supposed to pick?
 \item Options make code harder to audit by increasing the volume of code, by
-  insreading the number of possible configurations {\it exponentially}, and
+  increasing the number of possible configurations {\it exponentially}, and
   by guaranteeing that non-default configurations will receive very little
-  testing in the field.
+  testing in the field. If AES is the default, even with several
+  independent implementations, how long will it take to notice that the
+  Twofish implementation is wrong?
 \end{tightlist}
 
-% XXXX This next graf is to dry and verbose; make it *fun*.
-Most users stay with default configurations as long as they work, and only
-reconfigure their software as necessary to make it usable.  This often leads
-to security choices being made by those least qualified to make them, while
-giving application developers a false sense of rectitude.  For example,
+% XXXX This next graf is too dry and verbose; make it *fun*.
+Most users stay with default configurations as long as they work,
+and only reconfigure their software as necessary to make it usable.
+%This approach often leads
+%to security choices being made by those least qualified to make them, while
+%giving application developers a false sense of rectitude.
+%% This statement is false and it'll get us hammered. Security isn't
+%% the main goal, *doing stuff* is. So choosing to do stuff rather than
+%% have security is not a bad thing. The above statement is what this
+%% book is trying to dispel. What are we actually trying to say in this
+%% paragraph? That 'default insecure' can be bad? That 'default secure'
+%% will cause some of the users to be upset? The real lesson is that
+%% it can't be a choice between 'insecure' and 'obnoxious'; but that's
+%% out of scope for this chapter.
+For example,
 suppose the developers of a web browser can't decide whether to support a
 given extension with unknown security implications, so they leave it as a
 user-adjustable option, thinking that users can enable or disable the
@@ -210,11 +222,12 @@
 shared versus single-user host) rather than trying to specify what the
 program should do about their situation.
 
-In privacy applications, superfluous options are even worse.  When many
-configurations, eavesdroppers and insiders can often tell users apart by
+In privacy applications, superfluous options are even worse.  When there
+are many different possible configurations, eavesdroppers and insiders
+can often tell users apart by
 which settings they choose.  For example, the Type I or ``Cypherpunk''
 anonymity network uses the OpenPGP message format, which supports many
-symmetric and asymmetric options.  Because different users may prefer
+symmetric and asymmetric ciphers.  Because different users may prefer
 different ciphers, and because different versions of the PGP and GnuPG
 implementations of OpenPGP use different ciphersuites, users with uncommon
 preferences and versions stand out from the rest, and get very little privacy
@@ -227,19 +240,20 @@
 Even when users' needs genuinely vary, adding options does not necessarily
 serve their privacy.  In practice, the default option usually prevails for
 casual users, and therefore needs to prevail for security-conscious users
-{\it even when it would not be their best choice in a vacuum.}  For example,
-when an anonymity network (like Type I does) allows user-selected message
-latency, most users tend to use whichever setting is the default, so long as
+{\it even when it would not otherwise be their best choice.}  For example,
+when an anonymity network allows user-selected message latency (like
+Type I does), most users tend to use whichever setting is the default,
+so long as
 it works.  Of the fraction of users who change the default at all, most will
 not, in fact, understand the security implications; and those few who do will
 need to decide whether the increased traffic-analysis resistance that comes
 with higher latency is worth the decreased anonymity that comes from
-splitting the bulk of the user base.
+splitting away from the bulk of the user base.
 
-* Privacy, bootstrapping, and confidence
+\section{Privacy, bootstrapping, and confidence}
 
 Another area where human factors are critical in privacy is in bootstrapping
-new systems.  Since new systems begin life with few users, they initially
+new systems.  Since new systems start out with few users, they initially
 provide only very small anonymity sets.  This creates a dilemma: a new system
 with improved privacy properties will only attract users once they believe it
 is popular and therefore has high anonymity sets; but a system cannot be
