Queueing analysis is used to study the wasted space that results from lazy deletion in a bucket hash table. With lazy deletion, deletions are carried out only when a new item is added to the bucket. Under the assumptions of the GI/G/∞ queue, the authors prove that if the input is Poisson (the M/G/∞ queue), the mean wasted space equals the number of buckets; if the input is not Poisson, this equality no longer holds. They also show that the maximum table size over the first n entries grows as o(log n) (in probability).
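To make the lazy-deletion scheme concrete, here is a minimal sketch in Python, assuming one natural reading of the reviewed paper's model: deleted items are only marked, and a bucket is physically purged of marked items when a new item is inserted into that bucket. The class and method names are illustrative, not from the paper.

```python
class LazyBucketHash:
    """Hypothetical sketch of a bucket hash table with lazy deletion."""

    def __init__(self, num_buckets):
        # Each bucket holds [key, deleted_flag] pairs.
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key):
        b = self._bucket(key)
        # Deletions are performed only now, when a new item enters the bucket:
        # physically remove entries previously marked deleted.
        b[:] = [e for e in b if not e[1]]
        b.append([key, False])

    def delete(self, key):
        # Lazy deletion: mark the entry; its space is reclaimed on a
        # later insert into the same bucket.
        for e in self._bucket(key):
            if e[0] == key and not e[1]:
                e[1] = True
                return True
        return False

    def wasted_space(self):
        # Number of marked-but-not-yet-purged entries across all buckets.
        return sum(1 for b in self.buckets for e in b if e[1])
```

The `wasted_space` count is the quantity whose mean the paper analyzes via the M/G/∞ queue.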
Another issue they investigate is excess space, defined as the maximum extra space needed because lazy deletion is used. They show that for the stationary M/G/∞ queue, the needed excess space goes to zero (in probability) as the number of entries goes to infinity. They also prove, without any probabilistic assumptions on arrival times or lifetimes, that for n entries the probability that the excess space needed by lazy deletion is at least b (where b > H > 1 and H is the number of buckets) is at most 2n((H+b)/2b)^(b/2) ((H+b)/2H)^(H/2). Other similar results are also proven. The bulk of the paper consists of the queueing and probability arguments used to prove these theorems.
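As a quick sanity check on the shape of the stated tail bound, the expression can be evaluated numerically; the function below is purely illustrative and simply transcribes the bound as given in the review (the parameter names n, H, b match the text).

```python
def excess_space_bound(n, H, b):
    """Tail bound on excess space as stated in the review:
    P[excess >= b] <= 2n * ((H+b)/2b)^(b/2) * ((H+b)/2H)^(H/2),
    valid for b > H > 1."""
    assert b > H > 1
    return 2 * n * ((H + b) / (2 * b)) ** (b / 2) * ((H + b) / (2 * H)) ** (H / 2)
```

For fixed n and H, the bound decays as b grows, which is what makes it useful: large excess-space requirements become exponentially unlikely.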