
[freehaven-cvs] preliminary notes and outline for usability chapter



Update of /home/freehaven/cvsroot/doc/wupss04
In directory moria.mit.edu:/home2/arma/work/freehaven/doc/wupss04

Added Files:
	outline 
Log Message:
preliminary notes and outline for usability chapter


--- NEW FILE: outline ---

O'Reilly chapter outline:

       "Anonymity loves company -- usability as a security parameter."
                      Roger Dingledine & Nick Mathewson

Piece one:
* Motivation for anonymity networks. Talk about the range of deployed
  systems out there.
* Anonymity is unique in that you can get better security by behaving
  like others.
* Some others in this book are talking about how to achieve usability.
  Here we're going to talk about the importance of usability (and users)
  in getting our security.

Piece two:
What else is quite like anonymity? Encryption? How carefully people
certify other people's GPG keys? Internet host security to prevent DDoS
and spam against others?

Piece three:
Let's give you an overview of what we're trying to tell you here, so
you'll have it in mind when we explain some example systems.

Piece four:
Example systems. Cypherpunks remailers. PGP/GPG. Tor (JAP, "trust us"
proxies). Mixminion. Their basic architectures, and how widely deployed
they are.

Piece five:
Observations, Recommendations, Open research questions.
- Users' safety relies on them behaving like others. How do they predict
  others' behavior? What if they need to behave in their own particular
  way? How do they compute the tradeoffs and risks?
- Don't try to get the user to answer questions you can't answer yourself.
- The importance of choosing good defaults: since most people will use
  the defaults, you've made the decision for everybody.
- Especially messy because even the researchers don't know the answers,
  and don't understand the tradeoffs. E.g., who is the adversary really,
  and what can they do?
- The importance of smart users / educating your users. Public perception
  as a security parameter. Good marketing as a security parameter?
- Not just about numbers and blending, also about reputability. A network
  used only by criminals is not the one you want.
- The importance of a GUI. Users evaluate the quality of a product by the
  quality of its GUI. Cf. Tor's choice not to have a GUI so far. They also
  judge quality based on feature lists, which is unsafe.
- Bootstrapping. How do we get any users?  High-needs users are never
  first joiners.  Low-needs users won't join a system tailored to the
  high-needs users' needs.




Notes:

- Start with some talk on anonymity; software aimed to provide anonymity;
  etc. Distinguish real anonymity from accidental anonymity.

  (Accidental security is what you have when your host has not in fact
  been hacked yet.)

- Key point: In anonymity, even if you were smart enough and had enough time
  to use every conceivable system perfectly, you would *nevertheless* be
  right to choose your system based in part on its usability.

- Brainstorm: What else besides anonymity systems has the property
  that lowered usability lowers _everyone's_ security, and/or that I am
  better off behaving like everyone else than being different?

     - A more usable PGP-lite would make more of the mail sent to me
       confidential, and more of the mail I send confidential, thus
       improving my security.  But I'd still use a more secure PGP when
       I could.  So this doesn't have the property that I should switch
       to always using PGP-lite.

     - Also, if there's a PGP-heavier that's better than PGP when used
       right, but where it's easy to screw up and use bad encryption,
       maybe I shouldn't even generate a PGP-heavier key.  But this still
       doesn't have the anonymity property: even if all the smart people
       (who don't screw up) use PGP-heavier, they're no worse off for it.

     - Internet security analogy: I should make sure everybody can have
       a good, usable firewall, so I can be more DDoS-resistant.

     - What else?  I worry that people will just say, "I'm not writing an
       anonymity system!" and skip the chapter. Hmm.

- Privacy/anonymity is like security in that there is no sharp division
  between "privacy" apps and others.  Any app can leak privacy, so if
  any of your users care about privacy, you should too.

  - Security *is* a prerequisite for privacy: data security often means
    privacy.  Most security breaks impact privacy.

    - (Security is "Even when others try to make your stuff do something that
      you wouldn't want it to, your stuff does what you want.")

    - hypothesis: Most security breaks have obvious and easy-to-articulate
      impact; many privacy breaks have slow, cumulative impact.

- Other people will talk about how to achieve usability.  We'll talk about
  which kinds of usability are good for security and which are bad.

- From privacy and anonymity POV, getting users to appear similar is vital.
  "Features" are often the enemy of this.  "Power-user mode" is often the
  enemy of this.

- In a confidentiality system, a secure use is always secure.  But in an
  anonymity system, your security is limited by the number of people who
  have sufficiently similar security parameters.

- Hard-to-use systems lead to multiple oddball frontends, which in turn
  lead to differing user behavior.

- Sometimes users with more specialized needs are better off having those
  needs unsatisfied, if satisfying them would require splitting from the
  crowd.

- Don't try to get the user to answer questions you can't answer yourself.
  This is "bad options".

  - Does the property pertain only to the user's situation?  If so, try to
    chunk users into as few classes as possible.  Ask whether people in
    smaller classes wouldn't be better off joining with others, despite
    their different needs.

    - What *is* a better/more 'usable' system here? Is it the one that does
      what I tell it, or the one that keeps me anonymous? Maybe it gives me
      a choice, but gives me the _right_ choice, which is what-I-want vs
      anonymity.  How do I communicate the idea of "how secure"?  How do I
      even understand what it is I *want* to communicate?

  - Does the property pertain to everyone?  If you can't make a decision,
    why think your average user will?  Here's what happens, typically:
      95% of users use the defaults unless they break.
      >95% of the remaining users don't have any more clue than you which
        option is better.  Most will choose weird, divergent combinations
        of settings that make themselves distinguishable.
      The remaining <1/400 of users have to decide: does the increased
        security they get with the optimal settings outweigh the anonymity
        they lose by being in a tiny set?  If they're smart, they'll
        conclude that the right choice isn't changing the settings; it's
        getting you to patch the next version to do the right thing.
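The settings cascade above can be made concrete with a toy calculation (our illustration, not from the chapter; the population numbers are assumptions): if an observer can tell configuration classes apart, a user's anonymity set shrinks to just their own class, and a common way to express its size is in bits.

```python
import math

def anonymity_bits(class_size):
    """Anonymity, in bits, for a user whose configuration class an
    observer can distinguish: log2 of the size of that class."""
    return math.log2(class_size)

total = 10_000                       # hypothetical user population
print(anonymity_bits(total))         # everyone identical: ~13.29 bits
print(anonymity_bits(9_500))         # the 95% on defaults: ~13.21 bits
print(anonymity_bits(10))            # a 10-user tweak class: ~3.32 bits
```

A user who "optimizes" their settings into a ten-person class gives up roughly ten bits of anonymity relative to the crowd, which is why the smart move is to get the default fixed rather than to diverge from it.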

  - Your choice of defaults would still matter most, even if you give
    options.  If you choose a secure default, >95% of the people are ok.  If
    you choose a bad default, *everybody* is hurt, even the people who choose
    better than you.

  - When designing protocols, assume that every option you give the
    implementor will be passed on to the user in an obscure preferences
    menu.  Implementors are often dumb this way.

  - Say something about how this is hard stuff where sometimes there aren't
    "good" solutions.  But go on to say that this doesn't let you the
    designer off the hook.

- Systems/ideas to discuss:

  - Cypherpunk remailers

  - PGP/GPG for pseudonymity

  - Low-latency versus high-latency: who knows?

- On low-latency vs high-latency: If your attacker can beat LL, you should
  go with HL always and hope that others do.  But if your attacker can't
  beat LL, you should just use it...maybe even when you could afford HL.

- The limiting factor, of course, is that users _want_ to be distinguishable
  in their actions.  They download different web pages; they post messages
  with different contents.

- The more cancer survivors on Tor, the better for the activists.  This
  is a reputability issue, not an anonymity issue.

  - "innocuousness"

  - Just me: I'm screwed.  Add a thousand communists, and I'm anonymous, but
    everyone thinks I'm a commie.  Add a thousand J Random Citizens (cancer
    survivors, privacy enthusiasts, etc.) and I'm not profileable.

  - Maybe the cancer survivors want some commies on the system too?  They'd
    prefer J Random Non-cancer-survivors.

  - Public perception is a security parameter (tor-design)
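The communists/cancer-survivors point above can be sketched numerically (a toy model of our own, not from the chapter: assume an observer who knows only network membership and the network's rough composition):

```python
def p_interesting(interesting_users, total_users):
    # Probability the observer assigns to any given member belonging to
    # the "interesting" group, knowing only that they use the network.
    return interesting_users / total_users

# Just me: trivially profiled.
print(p_interesting(1, 1))           # 1.0
# Add a thousand communists: I'm anonymous within the set, but the
# observer pegs every member as almost certainly a communist.
print(p_interesting(1000, 1001))     # ~0.999
# Add a thousand J Random Citizens instead: membership now reveals
# almost nothing about any one user.
print(p_interesting(1, 1001))        # ~0.001
```

So growing the network helps anonymity either way, but only a diverse user base helps reputability: the profile an observer can attach to "uses this network" depends on who else is in it.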

- Other people are using their chapters to plug their usability work.  If we
  plug anything, it's the importance of our approach.


- Look at the econymics paper; recap its ideas; add caveats to the results:

   - We claim that people who care a lot will run servers... but that makes
     them higher-profile... which they probably don't want.

   - Roger says results don't extend to low-latency.  Nick disagrees.

     - You should use the most popular system where your adversary doesn't
       win.

     - There is incentive in both cases to run a server.


Other human factors:

- Evaluating business model of provider is important.  Advertising is
  security.  Marketing as a security parameter.

- ZKS didn't convince people that "don't trust us" was a valid business
  idea.  Advertising is security!

  - Aside: why didn't people go for it ("don't trust us")?  Maybe because
    the unsophisticated user *had* to trust ZKS (since users couldn't
    evaluate ZKS claims)

- Users make security decisions based on pretty blinkenlights and long
  feature lists.  But long feature lists are a bad bad idea.  Fear.

- How are *we* supposed to know what the adversaries are??

- Not everything is as good as encryption; adding more bits for overkill
  doesn't seem to work anywhere else.

- Roger wants a Standard Econ Graph.  ("Look! Crossing curves! Econ is a
  science, Dammit!")

Other stuff to say:

- Why it's so hard to estimate anonymity
  - Sybil attack
  - freeloaders and why you can't easily detect them

- Why bootstrapping is hard: how do I get anybody on the system?  High-needs
  users are never first joiners.  Low-needs users won't join a system
  tailored to the high-needs users' needs.

