
Re: Google 403 error pages

On 12/14/05, Matthias Fischmann <fis@xxxxxxxxxxxxxxxxx> wrote:
> On Sun, Dec 11, 2005 at 09:33:54PM +0200, maillist wrote:
> > To: or-talk@xxxxxxxxxxxxx
> > From: maillist <maillist@xxxxxxxxxxxx>
> > Date: Sun, 11 Dec 2005 21:33:54 +0200
> > Subject: Re: Google 403 error pages
> >
> >
> > http://wiki.noreply.org/noreply/TheOnionRouter/TorFAQ#head-24b240c0329b118e4947357fb584b5579804b2ef
> the easiest way to fix this would be for google to establish a
> whitelist of machines that are unlikely to suffer from viruses (i.e.,
> a list of all Tor exit nodes and perhaps others), and not to send
> warnings to those.  any reason why google should refuse to
> implement this?
> m.
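The whitelist idea quoted above amounts to a simple membership check: before serving the warning page, test whether the requesting IP appears on a published list of Tor exit nodes. A minimal sketch, assuming a plain-text list of exit IPs (the list format, function names, and addresses here are illustrative, not any real Google or Tor interface):

```python
# Sketch of the proposed whitelist: suppress the 403/virus-warning page
# for requests coming from known Tor exit nodes.  All names and the
# list format are hypothetical illustrations.

def load_exit_nodes(lines):
    """Parse a plain-text exit-node list, one IP per line; '#' starts a comment."""
    return {line.strip() for line in lines
            if line.strip() and not line.lstrip().startswith("#")}

def should_send_virus_warning(client_ip, exit_nodes):
    """Serve the warning page only to IPs that are not whitelisted exits."""
    return client_ip not in exit_nodes

exits = load_exit_nodes(["# example exit list", "192.0.2.1", "192.0.2.2"])
print(should_send_virus_warning("192.0.2.1", exits))    # exit node -> False
print(should_send_virus_warning("203.0.113.5", exits))  # ordinary IP -> True
```

The hard part, of course, is not this check but keeping the exit list fresh and deciding to trust it at all.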

Does the web API work?  If so, you can always use that.

Anyway, I think the 403 page has more to do with Google not wanting
people to bulk-scrape its site.  Letting exit nodes make unlimited
requests would defeat its crude per-IP protection against that, and
in a way that would make the violators hard to trace.
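That crude protection is presumably some form of per-IP rate limiting, which is exactly what a busy exit node trips over: many users share one address. A rough sketch of the mechanism (thresholds and structure are my guesses, not Google's actual logic):

```python
# Rough sketch of per-IP throttling of the kind the 403 page suggests:
# count recent requests per source IP in a sliding window and flag
# addresses that exceed a threshold.  Numbers here are illustrative.
import time
from collections import defaultdict, deque

class PerIpThrottle:
    def __init__(self, max_requests=100, window_seconds=60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        """Record a request; return False once the IP exceeds the limit."""
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        while q and now - q[0] > self.window:
            q.popleft()                 # drop requests outside the window
        q.append(now)
        return len(q) <= self.max_requests  # False -> serve the 403 page

throttle = PerIpThrottle(max_requests=3, window_seconds=60.0)
print([throttle.allow("198.51.100.7", now=t) for t in (0, 1, 2, 3)])
# -> [True, True, True, False]: the fourth rapid request trips the limit
```

From the server's side, a Tor exit node running this through hundreds of users' queries looks identical to one person scraping in bulk, which is why a blanket whitelist is a hard sell.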

Yeah, they could establish some scheme where you prove your identity
to Google and then get a cookie or login to get past the IP check,
but c'mon, the cost would be too high for so few people.