
Sv: Sv: A little more exact specification of the hash-table

>>The O notation always means "in the order of ...", i.e.
>>O(1) is constant (any constant, no matter how big),
>>O(n) is linear (i.e. 1*n or 1000000000*n, doesn't matter)
>Correction: the above is a bad explanation ;)
>The O notation describes how something (usually processing time) roughly
>behaves depending on a value n (usually input size). Thus,
>O(1) means processing time is independent from the input size, constant
>O(n) means processing time increases linearly with the input size
>O(n^2) means ... increases quadratically with the input size
>etc. That's the *only* thing the O notation expresses.
The big-O notation doesn't support constants. That's the whole point of
big-O. Big-O doesn't make sense with regard to constant-time performance.
O(1) might be correct, I don't know, but then it's a special case that
at least my book on the subject doesn't mention.

I just say "constant-time performance" to avoid the problem.
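For what it's worth, the difference shows up if you count operations instead of arguing over notation. A rough sketch in Python (the function name is mine, and I'm ignoring hash collisions, which is the usual average-case simplification):

```python
def linear_search_ops(items, target):
    """Count comparisons in a linear scan of 'items' for 'target'.
    A miss costs n comparisons, so the work grows linearly with n."""
    ops = 0
    for x in items:
        ops += 1
        if x == target:
            break
    return ops

# Searching for an absent element: comparisons scale with input size.
for n in (10, 1000, 100000):
    print(n, linear_search_ops(list(range(n)), -1))

# A hash table (Python dict/set) does a roughly constant amount of work
# per lookup on average, regardless of n -- "constant-time performance":
for n in (10, 1000, 100000):
    table = set(range(n))
    print(n, -1 in table)  # membership test, independent of n
```

The scan's cost doubles when the input doubles; the hash lookup's doesn't. That's all the O(n) vs. O(1) (or "constant-time") distinction is trying to say.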

The simpler question, of course, is "does it really matter?"  A rather
less simple question is "why didn't I just shut up about it in the first
place if it really didn't matter?" ... :)