I designed a database for a client that has become so popular that they are now pulling data into it from two other databases at remote sites, to keep all the latest and greatest info in this ‘Flagship’ database. Because of the huge differences between the three schemas, I wrote a bunch of queries that pull data from the outside sources and completely write over the local version, ensuring that if any field has changed in any record, it gets properly updated in our local file.
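For context, each sync boils down to a pair of queries shaped roughly like this (table and field names are made up here, just to show the pattern):

    DELETE FROM tblContacts
    WHERE SourceSite = 'SiteA';

    INSERT INTO tblContacts (SourceSite, FirstName, LastName, Phone)
    SELECT 'SiteA', FName, Surname, PhoneNum
    FROM lnkSiteA_Contacts;

The DELETE/INSERT pair guarantees every changed field comes across, but it also means every surviving record gets a brand-new AutoNumber on each run.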
Now the customer has begun running the update every hour! That translates to approx 160,000 records per day being rewritten. Not so bad until you consider that each rewritten record takes up a new AutoNumber in the local tables. So for the first time, I am wondering how to prevent them from hitting the 2.1 billion limit (2,147,483,647, the max for a Long Integer AutoNumber).
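Right now I can watch the counter climb with a quick check like this (assuming the AutoNumber field is called ID):

    SELECT Max(ID) AS HighWaterMark FROM tblContacts;

Run it before and after a sync, and the gap shows how many AutoNumbers each hourly update burns.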
If my math is correct, 2.1 billion divided by 160,000 per day gives me 13,125 days, or approx 35.96 years, to address this. But I am afraid to wait… (think Y2K)
Got any ideas?
Thanks,
Rich