
Re: [seul-edu] [Fwd: Child Internet protection act CIPA]



On Mon, Nov 12, 2001 at 10:49:57PM -0800, Karsten M. Self wrote:

> My fix would be a squid proxy, with logs checked reasonably frequently.

... or better still, install something like SquidGuard, which can
filter requests on the fly. Content filtering is a broken concept
and never really works, but if it's just for the sake of pleasing
the powers that be, SquidGuard seems to do the job. It's extremely
quick, too.
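Hooking it into Squid is only a couple of lines in squid.conf plus a
small squidGuard.conf. Roughly (untested, and the paths, blacklist
name and block page below are just examples from my own setup --
adjust for wherever your install puts things):

    # squid.conf -- hand every request to squidGuard for checking
    redirect_program /usr/local/bin/squidGuard -c /etc/squid/squidGuard.conf
    redirect_children 4

    # squidGuard.conf -- block anything on the "adult" blacklist
    dbhome /var/lib/squidGuard/db
    logdir /var/log/squidGuard

    dest adult {
        domainlist adult/domains
        urllist    adult/urls
    }

    acl {
        default {
            pass !adult all
            redirect http://proxy.example.org/blocked.html
        }
    }

The blacklists themselves are just flat files of domains and URLs,
so adding entries by hand is straightforward.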

> A suddenly "popular" site with inappropriate content would result in a
> talking to (and on repeat:  suspension of privileges), and a firewall
> DENY entry.  Squid logs requests by IP.  This works better on a static
> IP assignment than dynamic, as it makes tracking who's downloading what
> easier.  It also requires a bit of work -- the site admin should look
> through the top 10 or 20 site URLs.

You see, that's just a pain -- having to read through the Squid
access logs by hand gets old quickly. There are some great log
analyzers out there, but making administrators sit and check
sessions is still a poor use of their time.

You're better off getting your users to authenticate to the cache
before using it. That way Squid logs each request with the username
instead of just the IP.
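
Setting that up is pretty painless. With Squid 2.4 and the standard
NCSA helper it's something along these lines (the helper path and
the htpasswd-style password file are just examples from my setup):

    # squid.conf -- make everyone log in before they can use the cache
    authenticate_program /usr/lib/squid/ncsa_auth /etc/squid/passwd
    acl authed proxy_auth REQUIRED
    http_access allow authed
    http_access deny all

Once that's in place the login name should show up in access.log in
place of the '-' in the ident column, so reports can be per-user
rather than per-IP.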

Alfred (http://www.alfred.cx/) does this kind of stuff (and more --
its main claim to fame is its quota extensions to Squid). I'm
currently maintaining and developing Alfred, so feel free to send
questions my way (the website is currently fairly boring due to a
lack of time on my part :-)

> I like cutting through the logs with a script that returns just the host
> for a given site -- fewer pages you have to look at.  Most sex sites
> aren't buried deep in the URL anyway -- the first page should give you
> a pretty good idea of what's there, though, naturally, if there's a lot
> of traffic at some deep level, this may bear investigating.

That's a good idea. Can you send us a copy of the script to look at?
Something that runs as a cron job every night (mailing the results to
the librarian or whoever) and lists users who have been doing
suspicious things would be good. That could then be tied into
something that lists all the domains a given user has been accessing
when further investigation is required.
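
In the meantime, here's a rough, untested sketch (in Python) of the
sort of report I have in mind. It assumes Squid's native log format,
with the URL in the seventh field and the username (or ident, or '-')
in the eighth, so the field numbers would need tweaking for anything
else:

    #!/usr/bin/env python
    # Boil a squid access.log down to "who fetched which hosts, how often".
    import sys

    def host_of(url):
        # "http://host/path" -> "host"; CONNECT requests log "host:port".
        if '://' in url:
            url = url.split('://', 1)[1]
        return url.split('/', 1)[0].split(':', 1)[0]

    def summarise(logfile):
        counts = {}                      # (user, host) -> request count
        for line in open(logfile):
            fields = line.split()
            if len(fields) < 8:
                continue                 # skip malformed lines
            user, host = fields[7], host_of(fields[6])
            counts[(user, host)] = counts.get((user, host), 0) + 1
        return counts

    if __name__ == '__main__':
        log = sys.argv[1] if len(sys.argv) > 1 else '/var/log/squid/access.log'
        counts = summarise(log)
        # Print the 20 busiest user/host pairs.
        busiest = sorted(counts.items(), key=lambda item: -item[1])[:20]
        for (user, host), n in busiest:
            print('%6d  %-12s  %s' % (n, user, host))

Dropped into a nightly cron entry and piped through mail(1), that
would give the librarian a morning report, e.g. something like

    0 6 * * * /usr/local/bin/proxy-report | mail -s "proxy report" librarian@example.org

(the script path and address there are obviously made up).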

   - andrew

-- 
Andrew J. Reid                    "Catapultam habeo. Nisi pecuniam omnem  
andrew.reid@plug.cx               mihi dabis, ad caput tuum saxum immane 
+61 401 946 813                   mittam"