Re: [freehaven-dev] Graduated Mirroring
On Wed, Jan 26, 2000 at 08:49:11PM -0500, Ron Rivest wrote:
> Roger --
> Here is a variation on your scheme, to think about:
> -- Assume that each server knows what documents he is storing
> pieces of (since I think this is probably the case anyway).
This is fine to assume, but I'm not convinced your justification holds --
your argument from before was that nodes could simply request files
until they noticed their own fragment getting requested. I think that
there will be enough nodes and enough files that this won't reliably
happen in a reasonable amount of time.
On the other hand, I have no real reason to keep nodes from knowing
which files their fragments belong to, so I'm fine assuming that
they have some way of figuring it out.
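(As a back-of-envelope: if the haven stores F documents and a node holds fragments of d of them, a uniformly random request hits one of its own documents with probability d/F, so the node expects about F/d probes before it sees its fragment move. A quick sketch -- the numbers for F and d below are made up purely for illustration:

```python
import random

# Sketch of the "request files until you notice your own fragment" tactic.
# With total_docs documents in the haven and fragments of docs_held of them
# on this node, each uniformly random probe hits one of its own documents
# with probability docs_held/total_docs, so the probe count is geometric
# with mean total_docs/docs_held.

def expected_probes(total_docs, docs_held):
    # Mean of a geometric distribution with success probability d/F.
    return total_docs / docs_held

def simulate_probes(total_docs, docs_held, trials=2_000):
    # Empirical average of probes until a random request hits one of ours.
    mine = set(range(docs_held))
    total = 0
    for _ in range(trials):
        probes = 1
        while random.randrange(total_docs) not in mine:
            probes += 1
        total += probes
    return total / trials

print(expected_probes(10_000, 10))   # 1000.0
print(simulate_probes(10_000, 10))   # close to 1000
```

Even at these modest (invented) sizes, a node needs on the order of a thousand requests per document it wants to identify, which is why I don't think it happens reliably or quickly.)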
> -- Allow each server to store a number of fragments according
> to how important he thinks the document is.
> (Thus, we get rid of all the economics stuff.)
> -- keep the mixnet for communicating with the servers
> -- anyone can sign up as a server. no reputation measures needed
> -- If I have an important document D that I do not want to see
> suppressed, then I mail D to all of the servers (perhaps with
> a short cover letter explaining why I think it is important
> that this document not be suppressed). This goes through the
> mixnet to protect the identity of the servers and the identity
> of the document provider.
> -- Each server has a manager (a person). This person decides
> how much storage to allocate to document D, according to the
> manager's own preferences and political leanings etc.
> (I realize that this *manual* process is not present in your
> design, but I think it allows the participants (the managers)
> to donate to causes they think appropriate, and not to donate
> to others, which rather appeals to me.)
This is a reasonable approach, but I have some issues with it.
a) First of all, this isn't what I want to do. :) This is a good
service to offer, but it's separate from what I'm after. I claim
that "all data is created equal; why somebody cares about a piece
of data is his own business" -- as long as he keeps up his end of
the bargain. People should reasonably be able to post the Idaho
phone book, and nobody will really wonder why they do it. (Perhaps
they've used steganography to list the next ten names on their hit
list. Or to record how to create the next element in the periodic
table, but they won't reveal the key until they die. Or perhaps it
really is just the Idaho phone book, and they're posting it because
they expect the FBI to 'erase' somebody and doctor all the phone books.
I dunno.) A data haven is supposed to be a 'no questions asked' sort
of service.
b) Having a person hand-sort and consider each item really cuts
down on the number of people who would be willing to host a server.
I guess people could say "I'll act like that guy over there" or
similar, which would mean that there's a representative that they all
trust to make good decisions about what material they should all keep.
Which would work, I guess...
[very neat ideas snipped]
> -- When a document no longer has enough support to be reconstructed, it essentially
> disappears. Servers still holding information about that document may then
> decide to throw away whatever they have about the document. This replaces your
> notion of an expiration date, and somehow seems more natural...
The Freenet system uses this notion -- the lifetime of information is based
on its popularity. However, I still think that the publisher of the data
should decide how important it is. I don't want to trust "popular opinion at
the time" to keep my data safe. Consider data that is unpopular for a while
but then suddenly becomes popular -- like photos of JFK Jr saluting his
father, or like a (timestamped) Idaho phone book that has those ten extra
names the FBI is suddenly accused of erasing. I think there are enough
cases like these that the responsibility for deciding importance should
rest with the publisher rather than the server owner...
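(Your attrition rule -- a document vanishes once too few fragments survive to reconstruct it -- is exactly the k-of-n threshold property of a secret-sharing or information-dispersal scheme. A minimal sketch of that property using Shamir-style sharing over a prime field; this is just an illustration, not the dispersal scheme we'd actually use, and the prime, k, and n are arbitrary choices:

```python
import random

P = 2**31 - 1  # a Mersenne prime; all arithmetic is in the field GF(P)

def split(secret, k, n):
    """Split secret (< P) into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):        # Horner evaluation of the polynomial
            y = (y * x + c) % P
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from k shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, k=3, n=5)
# Servers drop their shares one by one; once fewer than k = 3 survive,
# the document has "disappeared" even though fragments still exist.
assert reconstruct(shares[:3]) == 123456789   # any 3 of 5 suffice
assert reconstruct(shares[2:]) == 123456789
```

The disagreement above isn't about this mechanism, which I like -- it's about who gets to decide how fast the share count is allowed to decay.)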
> Comments appreciated. This approach has some advantage in terms of simplicity;
> there are no reputations to measure, no "shepherds" for fragments, etc. The
> only complexity is having to judge what level of support to give to a document,
> but this "feels natural" to me, rather than having my machine supporting terrorists
> without my knowledge or consent, say... I guess this approach is more like
> "graduated mirroring" in the sense that you can be a "mirror" for a document
> to almost any degree. It also has mixnet protection for the identity of the
> document provider.
This is a really good idea. Somebody should publish it. :)
While it is a simpler idea in a lot of respects, it doesn't capture
the essence of what I'm after in a data haven. Indeed, it also goes
against my basic assumptions about computing: we've got a lot of
hardware around here, and very few people. This trend will get more
pronounced as time goes on. I think paving the way for an automated
robust data haven based on privacy of publisher/data is going to have
more of an effect in the long run.
(Perhaps one of these AUP people wants to implement this alongside
the Free Haven Project...:)