
[school-discuss] On site cache of web pages



We have a number of older PCs that are out of service.  We thought of
using them as a squid cache.  However, I am not sure I understand all
of what that entails, whether a cluster would be appropriate to this
end, or just how to implement it.

I am a seasoned GNU/Linux *user*, and could probably figure out the
nuts and bolts, but so far the squid documentation is cryptic to me.
I would appreciate pointers and advice.
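
For what it may be worth to anyone in a similar spot: a single older
box is usually enough to start with, and a minimal squid.conf can be
quite short.  The sketch below is a rough starting point, not a
definitive setup---the paths, the subnet, and the sibling hostname are
all assumptions you would adjust for your own network:

```
# Minimal squid.conf sketch (adjust paths/addresses for your site)
http_port 3128

# Memory and on-disk cache: 2000 MB in /var/spool/squid
cache_mem 64 MB
cache_dir ufs /var/spool/squid 2000 16 256

# Allow caching of larger objects (default is quite small)
maximum_object_size 50 MB

# Only let machines on the campus LAN use the proxy
acl localnet src 192.168.0.0/16
http_access allow localnet
http_access deny all

# If you later add a second cache box ("cluster"), squid peers can
# query each other as siblings over ICP rather than needing any
# special clustering software.  Hypothetical hostname:
# cache_peer cache2.example.lan sibling 3128 3130
```

Then the browsers (or the lab image) are pointed at the proxy host on
port 3128, and repeated requests for the same pages are served from
the local disk instead of the slow link.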

Here is our problem:  We have a large number of machines on our High
School campus, and a reasonably good (perhaps) Internet connection---
given that the firewall our school system operates is a contentious
beast.  We have a distance-ed facility and many Windows PCs with
viruses and spyware running rampant.  We do have a good number of
Apple G4 PowerBooks for student use.  The current bandwidth is very
limited.  During work hours the system can be unusable, or if I am
lucky I can get 5 or 6 Kbit/s of bandwidth---which on the surface is as
good as my dialup at home, but in practice can (mysteriously, to me)
be somewhat slower!  So streaming video, or anything else useful on
the Internet, is impractical.

My Science Department would like to set up a cache server that would
enable us to run, for example, a Treasure Hunt or a class research
exercise.  Many students would hit the same pages, so if they were
cached on site, bandwidth would be less of an issue.
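
Since a whole class hitting the same Treasure Hunt pages is exactly
the case where caching pays off, it may also help to lengthen how long
squid considers pages fresh, so they survive in the cache across class
periods even if the origin servers send short expiry headers.  A
hedged example---these numbers are guesses to tune, not recommended
values:

```
# Keep objects for up to 3 days (4320 minutes) before revalidating;
# treat an object as fresh for 20% of its age if no expiry is given.
refresh_pattern . 0 20% 4320
```

Pre-visiting the hunt's pages once through the proxy before first
period would then warm the cache, and the rest of the day's classes
should mostly be served from the local disk.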

My colleagues, who are increasingly receptive to GNU/Linux, and I
would appreciate any suggestions.

Alan Davis
Saipan, N. Mariana Islands