
[freehaven-cvs] add a bunch more sections



Update of /home/freehaven/cvsroot/doc/wupss04
In directory moria.mit.edu:/tmp/cvs-serv17832

Modified Files:
	usability.tex 
Log Message:
add a bunch more sections

Index: usability.tex
===================================================================
RCS file: /home/freehaven/cvsroot/doc/wupss04/usability.tex,v
retrieving revision 1.3
retrieving revision 1.4
diff -u -d -r1.3 -r1.4
--- usability.tex	22 Oct 2004 04:10:27 -0000	1.3
+++ usability.tex	25 Oct 2004 18:59:23 -0000	1.4
@@ -56,15 +56,34 @@
 
 * How bad usability can thwart security
 
-[[Brainstorm up a big list.  Possibilities include:
-  - Useless/insecure modes of operation.
+Hard-to-use programs and protocols can hurt security in many ways:
+\begin{tightlist}
+\item Programs with {\it insecure modes of operation} are bound to be used
+  unknowingly in those modes.
+\item Protections with {\it off switches}, once switched off, are often never re-enabled.
+\item {\it Inconvenient} security is often abandoned in the name of
+  day-to-day efficiency: people often write down difficult passwords to keep
+  from forgetting them, and share passwords in order to work together.
+\item Systems that provide a {\it false sense of security} prevent users from
+  taking real measures to protect themselves: breakable encryption on ZIP
+  archives, for example, can fool users into thinking that they don't need to
+  encrypt email containing ZIP archives.
+\item Systems that provide {\it bad mental models} for their security can
+  trick users into believing they are more safe than they really are: for
+  example, many users interpret the ``lock'' icon in their web browsers to
+  mean ``You can safely enter personal information,'' when its meaning is
+  closer to ``Nobody can read your information on its way to the named
+  website.''\footnote{Or more accurately, ``Nobody can read your information
+    on its way to someone who was able to convince one of the
+    dozens to hundreds of CAs configured in your browser that they are the
+    named website, or who was able to compromise the named website later
+    on.  Unless your computer has been compromised already.''}
+\end{tightlist}
+
+[[Brainstorm up a big list.  More possibilities include:
   - Confusion about what's really happening.
-  - Bad mental models.
-  - Too easy to exit system.
   - Too easy to social-engineer users into abandoning.
-  - Inconvenient, therefore abandoned. (People write down long passwords.)
-  - 
-  - .... XXXX NICK WRITES MORE HERE.
+  - XXXX
 ]]
 
 * Usability is even more a security parameter when it comes to privacy
@@ -159,13 +178,92 @@
 
 With security:
 \begin{tightlist}
-\item Extra options often delegate decisions to those least able to
-make them.  If the protocol designer can't decide whether to  XXX or XXX, how
-is the user supposed to choose?
-\item More choices mean more code, and more code is harder to audit.
+\item Extra options often delegate decisions to those least able to make
+  them.  If the protocol designer can't decide whether AES is better than
+  Twofish, how is the end user supposed to pick?
+\item Options make code harder to audit by increasing the volume of code, by
+  increasing the number of possible configurations {\it exponentially}, and
+  by guaranteeing that non-default configurations will receive very little
+  testing in the field.
 \end{tightlist}
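+
+To see how quickly the configuration space grows (an illustrative
+calculation, not a measurement of any particular program): a program with
+$n$ independent binary options admits
+\begin{displaymath}
+  2^n
+\end{displaymath}
+distinct configurations, so ten on/off switches already yield $2^{10} =
+1024$ combinations, and twenty yield over a million, far more than any
+audit or field-testing effort will cover.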
 
-XXXX NICK WRITES MORE HERE
+% XXXX This next graf is too dry and verbose; make it *fun*.
+Most users stay with default configurations as long as they work, and only
+reconfigure their software as necessary to make it usable.  This often leads
+to security choices being made by those least qualified to make them, while
+giving application developers a false sense of rectitude.  For example,
+suppose the developers of a web browser can't decide whether to support a
+given extension with unknown security implications, so they leave it as a
+user-adjustable option, thinking that users can enable or disable the
+extension based on their security needs.  In reality, however, if the
+extension is enabled by default, nearly all users will leave it on whether
+it's secure or not; and if the extension is disabled by default, users will
+tend to enable it based on their perceived demand for the extension
+rather than their security needs.  Thus, only the most savvy and
+security-conscious users---the ones who know more about web security than the
+developers themselves---will actually wind up deciding based on good
+analysis.
 
-\end{document}
+Of course, when end users {\it do} know more about their individual security
+requirements than application designers, adding options is beneficial,
+especially when the options ask users to describe their own situation (home
+or enterprise; shared versus single-user host) rather than to specify what
+the program should do about it.
+
+In privacy applications, superfluous options are even worse.  With many
+possible configurations, eavesdroppers and insiders can often tell users apart by
+which settings they choose.  For example, the Type I or ``Cypherpunk''
+anonymity network uses the OpenPGP message format, which supports many
+symmetric and asymmetric cipher options.  Because different users may prefer
+different ciphers, and because different versions of the PGP and GnuPG
+implementations of OpenPGP use different ciphersuites, users with uncommon
+preferences and versions stand out from the rest, and get very little privacy
+at all.  Similarly, Type I allows users to pad their messages to a fixed size
+so that an eavesdropper can't correlate the sizes of messages passing through
+the network---but it forces the user to decide what size of padding to use!
+Unless a user can guess which padding size will happen to be most popular,
+the option provides adversaries with another way to tell users apart.
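+
+As a hypothetical arithmetic illustration (the percentages are invented):
+if 95\% of senders pad to one size and 5\% to another, an eavesdropper who
+merely observes message sizes has already cut the minority users' anonymity
+set to one twentieth of the user base, before any further analysis begins.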
+
+Even when users' needs genuinely vary, adding options does not necessarily
+serve their privacy.  In practice, the default option usually prevails for
+casual users, and therefore needs to prevail for security-conscious users
+{\it even when it would not be their best choice in a vacuum.}  For example,
+when an anonymity network allows user-selected message latency (as Type I
+does), most users tend to use whichever setting is the default, so long as
+it works.  Of the fraction of users who change the default at all, most will
+not, in fact, understand the security implications; and those few who do will
+need to decide whether the increased traffic-analysis resistance that comes
+with higher latency is worth the decreased anonymity that comes from
+splitting the bulk of the user base.
 
+* Privacy, bootstrapping, and confidence
+
+Another area where human factors are critical in privacy is in bootstrapping
+new systems.  Since new systems begin life with few users, they initially
+provide only very small anonymity sets.  This creates a dilemma: a new system
+with improved privacy properties will only attract users once they believe it
+is popular and therefore has large anonymity sets; but a system cannot be
+popular without attracting users.  New systems need users for privacy, but
+need privacy for users.
+
+Low-needs users can break the deadlock.  In the earliest stages of a
+system's lifetime, anonymity networks tend to be built up from users who need
+only to resist weak adversaries, ones who can't tell which users are using
+the network and thus can't learn the membership of the small anonymity set.  This
+reverses the early-adopter trend of many security systems: rather than
+attracting the most security-conscious users first, privacy applications
+must begin by attracting low-needs users and hobbyists.
+
+But this analysis relies on users' accurate perceptions of present and future
+anonymity set size.  As in market economics, expectations can be
+self-fulfilling: a privacy system that people believe to be secure and
+popular will gain users, thus becoming (all else being equal) more secure and
+popular.  Thus, security depends not only on usability, but on {\it
+  perceived usability by others}, and hence on the quality of the provider's
+marketing and public relations.  Perversely, over-hyped systems (if they are
+not actually broken) may be a better choice than modestly promoted ones,
+if the hype attracts more users---a badly promoted anonymity network provides
+little anonymity.
+
+
+\end{document}
