Ingo Weinhold wrote:
> As a solution I would get rid of the stack collecting tokens of deleted
> objects and let NewToken() always return a new token. For the quite
> unlikely case that the token counter hits the maximum, it should restart
> from 0 again and set a flag that from then on NewToken() must first check
> whether the token is not already in use. Since the token space will be
> used sparsely, this should be acceptably performant. BTW, it may be a
> good idea to use a hash map instead of a simple map.

Axel Dörfler wrote:
> Although I would consider 32-bit as enough for this time (if any
> application used all 4 billion IDs, it deserves to die ;-)), but we
> could as well just use 64-bit values and consider us in a safe place.
> Of course, using Ingo's method would also work nicely, and since we
> also have to be binary compatible with R5, this should be the best
> solution :-)

Just a thought; feel free to rip it to shreds. ;)

The reason I included a re-use stack was to avoid the performance hit of
having to search the map for an unused token. How about this compromise:
continue to place "returned" tokens on the list, but don't re-issue them
until the token counter maxes out. Also, use a queue instead of a stack
to add a bit more separation between the return and re-issue of a given
token.

I will look at using a hash map instead of a regular map.
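To make the compromise concrete, here is a rough sketch of what I have in mind. The class and method names (TokenPool, NewToken, ReturnToken) and the use of std::deque/std::unordered_set are just my own illustration, not the actual implementation:

```cpp
#include <cstdint>
#include <deque>
#include <stdexcept>
#include <unordered_set>

// Illustrative sketch only: a monotonically increasing counter, a FIFO
// queue of returned tokens, and a hash set of tokens currently in use.
class TokenPool {
public:
	int32_t NewToken()
	{
		if (!fWrapped) {
			// Fast path: hand out fresh tokens until the counter maxes out.
			int32_t token = fNextToken;
			if (fNextToken == INT32_MAX) {
				fWrapped = true;	// from now on, reuse returned tokens
				fNextToken = 0;
			} else
				fNextToken++;
			fInUse.insert(token);
			return token;
		}
		// After wrap-around, draw from the front of the queue first; FIFO
		// order maximizes the separation between return and re-issue.
		while (!fReturned.empty()) {
			int32_t token = fReturned.front();
			fReturned.pop_front();
			if (fInUse.find(token) == fInUse.end()) {
				fInUse.insert(token);
				return token;
			}
		}
		// Queue exhausted: probe forward for an unused token. Since the
		// token space is sparsely used, this should rarely loop long.
		for (int64_t i = 0; i <= INT32_MAX; i++) {
			int32_t token = fNextToken;
			fNextToken = (fNextToken == INT32_MAX) ? 0 : fNextToken + 1;
			if (fInUse.find(token) == fInUse.end()) {
				fInUse.insert(token);
				return token;
			}
		}
		throw std::runtime_error("token space exhausted");
	}

	void ReturnToken(int32_t token)
	{
		if (fInUse.erase(token) > 0)
			fReturned.push_back(token);	// queued, not reused until wrap
	}

private:
	int32_t fNextToken = 0;
	bool fWrapped = false;
	std::deque<int32_t> fReturned;			// returned tokens, FIFO order
	std::unordered_set<int32_t> fInUse;		// hash-based, as Ingo suggested
};
```

Until the counter wraps, a returned token sits in the queue untouched, so callers never see a recycled token early; the hash set only has to be probed after the (very unlikely) wrap-around.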