# Re: A little more exact specification of the hash-table

```
Bjarke Hammersholt Roune wrote:
>>Correction: the above is a bad explanation ;)
>>The O notation describes how something (usually processing time) roughly
>>behaves depending on a value n (usually input size). Thus,
>>O(1) means processing time is independent of the input size (constant)
>>O(n) means processing time increases linearly with the input size
>>O(n^2) means ... increases quadratically with the input size
>>
>>etc. That's the *only* thing the O notation expresses.
>>
>The big-O notation doesn't support constants. That's the whole point of
>big-O. Big-O doesn't make sense with regard to constant-time performance.

It does (at least according to my books (notice the plural) and my CS
professor).
If T(n) is the required processing time, then
"T(n) is in O(n)" means:
there exist constants c1 and c2 such that, for all n,
T(n) <= c1 * n + c2
This makes perfect sense for O(1) as well: there the bound is
simply T(n) <= c2, i.e. the time is bounded by a constant.

>The simpler question, of course, is "does it really matter?"  A rather not
>so simple question is "why didn't I just shut up about it in the first place
>if it really didn't matter?" ... :)

*grin*

Christian
--

Drive A: not responding...Formatting C: instead

```
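To make the definition concrete (and tie it back to the hash-table subject of the thread), here is a minimal Python sketch, not from the original post: it counts the "steps" a linear list scan and a dict lookup perform, so the scan's cost satisfies T(n) <= c1*n + c2 with a nonzero c1 (O(n)), while the lookup's cost is bounded by a constant alone (O(1) on average for Python dicts). The helper names are illustrative.

```python
# Illustrative sketch: contrast O(n) and O(1) by counting steps, not timing.

def linear_search_steps(items, target):
    """Count the comparisons a linear scan makes: worst case n, so O(n)."""
    steps = 0
    for item in items:
        steps += 1
        if item == target:
            break
    return steps

def dict_lookup_steps(table, target):
    """A dict lookup hashes the key and probes the table; model it as
    one step, reflecting its average-case constant cost: O(1)."""
    _ = table.get(target)
    return 1

for n in (10, 100, 1000):
    data = list(range(n))
    table = {k: True for k in data}
    worst = linear_search_steps(data, n - 1)  # worst case: target is last
    const = dict_lookup_steps(table, n - 1)
    # Scan fits T(n) <= 1*n + 0; lookup fits T(n) <= 0*n + 1.
    print(n, worst, const)
```

Note the dict figure is an average-case bound: with pathological hash collisions a single lookup can degrade, which is exactly why the constants in the O-notation definition are existential, not tied to any one run.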