
Re: [seul-edu] Typical costs paid by school districts for software?



> Here we have room for an interesting analysis.  Take a reference
> scholastic network of workstations and servers (I'm assuming Microsoft
> here, but the same analysis could be done with Macintoshes) and measure
> the performance of selected applications such as MS Word, Excel, etc. 
> Use whatever measurements seem most appropriate (time to open a file of
> X size, time to spell-check entire document, etc.) to characterize
> performance.  Repeat same measurements on a Linux-based network.  See
> what level of hardware is necessary on both server and workstation to
> give equivalent performance to the reference network (using equivalent
> programs such as Star Office or ApplixWare, or AbiWord and Gnumeric). 
> Show cost differences between hardware necessary for equivalent
> performance.

This would basically be like the BAPCO SysMark test that BYTE uses, but
comparing across different desktop suites and platforms.

This *would* be very interesting information to have.  I would expand it
to include comparisons of apps running on various workstation/server
configs using both X11 and Windows Terminal Server (Citrix).

A lab full of 486s or low-end Pentiums with 32MB running StarOffice on dual
P3s over X is pretty cost-effective compared to the accepted solution of
putting P2-400s with 64MB and Windows NT/Office 97 or better on everyone's
desk.

For that matter, I am pretty sure that an 8MHz 8088 running DOS 3.1 and IBM
DisplayWrite III would blow the doors off any Pentium-class PC running
Office 97 or Office 2000 when it came to things like total time to spell
check and print a document -- certainly on the time-to-launch-application
test!

> This may be a bit difficult to actually do, but it would be very
> interesting to have some figures of what cost differences a Linux

It wouldn't actually be that hard to do, just time consuming and tedious.  
You would have to come up with a standard set of documents/files to
process (in several formats), and figure out what operations you wanted to
time and perform on them.

You would want to have typical small documents (like a 2-page letter, a
small spreadsheet, a graphical logo), medium-sized documents (a 15- or
20-page report, a large spreadsheet with charts, full-page color graphics)
and then something really large (a book-sized 200-page document, a huge
statistical-analysis type spreadsheet, multiple large graphics).  Throw in
embedded graphics to print, linked spreadsheets, etc.
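
One way to pin that corpus down would be a single manifest that the test
scripts on every platform read from -- a minimal sketch in Python, where
every file name is a hypothetical placeholder for the real standard
documents:

```python
# Hypothetical benchmark corpus; all file names are placeholders that
# would be swapped for the agreed-upon standard documents (in several
# formats per document, one entry per format).
CORPUS = {
    "small": {
        "letter": "letter-2page.doc",       # two-page letter
        "sheet":  "budget-small.xls",       # small spreadsheet
        "logo":   "logo.gif",               # graphical logo
    },
    "medium": {
        "report":  "report-20page.doc",     # 15-20 page report
        "sheet":   "sales-charts.xls",      # large spreadsheet with charts
        "graphic": "poster-fullpage.tiff",  # full-page color graphic
    },
    "large": {
        "book":     "manual-200page.doc",   # book-sized 200-page document
        "sheet":    "stats-huge.xls",       # huge statistical spreadsheet
        "graphics": ["photo-a.tiff", "photo-b.tiff"],  # multiple large graphics
    },
}

def all_files(corpus):
    """Yield every file in the manifest, e.g. for copying to a test box."""
    for files in corpus.values():
        for entry in files.values():
            if isinstance(entry, list):
                yield from entry
            else:
                yield entry
```

Keeping the corpus in one manifest means every platform's scripts time
exactly the same set of documents.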

Things to time that I can think of (as appropriate for each document):
	time to launch application and begin work
	time to spell check (automatic replace)
	time to re-paginate and print
	time to search and replace all occurrences of a word
	time to re-calc spreadsheet
	time to rotate or flip graphic and print
	time to insert half page graphic in document
	time to scroll from top to bottom of file	
	time to save file and quit
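
For the operations that each suite can run in batch mode, collecting the
timings is straightforward.  A minimal sketch in Python -- the commands
shown are hypothetical placeholders for however each suite is actually
invoked non-interactively:

```python
import subprocess
import time

# Hypothetical (label, command) pairs; the commands are placeholders
# for whatever batch-mode invocation each suite actually supports.
TESTS = [
    ("launch and quit",    ["soffice", "letter-2page.doc"]),
    ("spell check report", ["spellcheck", "report-20page.doc"]),
    ("recalc spreadsheet", ["recalc", "stats-huge.xls"]),
]

def time_command(cmd):
    """Wall-clock seconds to run one command to completion."""
    start = time.perf_counter()
    subprocess.run(cmd, check=False, capture_output=True)
    return time.perf_counter() - start

def run_suite(tests, runs=3):
    """Time each test several times and keep the best run, to damp
    out caching and other one-time effects."""
    return {label: min(time_command(cmd) for _ in range(runs))
            for label, cmd in tests}
```

Interactive operations like scrolling top to bottom would still need a
GUI-driving tool or a stopwatch; only the batch-able operations fit a
harness like this.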

Of course, the comparisons would also have to be carried out on identical
hardware under all operating systems (at least as a baseline) at some
level of performance.  Then once you had a baseline, you could do
specific testing with smaller or bigger hardware and have some sort of
valid means of comparing and relating it all.

Ideally, it would be scriptable on all platforms and one could expand the
test to include increasing numbers of client workstations (5 workstations,
20 workstations, 50 workstations, etc.)
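
That scaling step could reuse the same timed commands, launching N of them
concurrently to stand in for N client workstations hitting the server at
once.  A rough sketch in Python (again with a hypothetical placeholder
command), using threads on one box rather than real client machines:

```python
import concurrent.futures
import subprocess
import time

def timed_session(cmd):
    """Wall-clock seconds for one simulated client session."""
    start = time.perf_counter()
    subprocess.run(cmd, check=False, capture_output=True)
    return time.perf_counter() - start

def scale_test(cmd, client_counts=(5, 20, 50)):
    """Run the same session under increasing numbers of concurrent
    clients and report the mean session time at each load level."""
    report = {}
    for n in client_counts:
        with concurrent.futures.ThreadPoolExecutor(max_workers=n) as pool:
            times = list(pool.map(timed_session, [cmd] * n))
        report[n] = sum(times) / n
    return report
```

Threads on one machine only approximate real load -- the full test would
run the scripts on actual client workstations -- but the shape of the
curve (where mean session time starts to climb) is the interesting number
either way.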

This would probably take a week or two of people-time to do properly,
if one had all of the appropriate hardware and software available.  I know
that for me, and probably others on the list, the problem is not the
hardware (got lots of machines and labs that could be used for this
purpose), but just not having the time available to do it.

-- 
James Troutman, Troutman & Associates - telecommunications consulting
93 Main Street, Waterville, Maine 04901 - 207-861-7067