We have a database that adds over 100,000 records per day. The data goes back to 2002, and I only need 6 months of historical data. At present we can only delete 1,000 rows at a time. Is there a faster way of deleting? We seem to continually run out of disk space. Urgent!

If you just need a small set of data from a much larger set, you are better off inserting the data you need into a new table, dropping the old table, and renaming the new table to the old table's name.
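A minimal sketch of that copy/drop/rename approach, assuming a source table called INCOMING with a datetime column named RecordDate (both names are placeholders, not the poster's actual schema):

    -- Copy only the rows you want to keep into a new table.
    SELECT *
    INTO   INCOMING_NEW
    FROM   INCOMING
    WHERE  RecordDate >= DATEADD(month, -6, GETDATE())

    -- Once the copy is verified, drop the old table and rename the new one.
    DROP TABLE INCOMING
    EXEC sp_rename 'INCOMING_NEW', 'INCOMING'

Under the simple or bulk-logged recovery model, SELECT ... INTO is minimally logged, which also helps with the disk-space problem.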
After deciding to pull only the last 6 months of data out of the large table, I created a statement that pulls each field out and inserts it into a temp table. I get this error message:

Server: Msg 544, Level 16, State 1, Line 1
Cannot insert explicit value for identity column in table 'INCOMING_TEMP' when IDENTITY_INSERT is set to OFF.
I'm not sure what this means. Is it that I cannot insert into an ID field?

Check 'SET IDENTITY_INSERT' in the BOL.

Check the BOL???

Books... On... Line... B.O.L.

While the forum was down I did some research on the error message and found a solution: I went into the management properties of the new temp table and set it to accept explicit inserts into the identity column. The insert is running as we speak; it's been going about 20 minutes, which is about right. The date range holds about 1,500,000 records per month, which is why I'm clearing the table out and keeping only 6 months of historical data. Day 2 of the new project.
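For reference, the SET IDENTITY_INSERT route the earlier reply pointed at looks roughly like this; INCOMING, INCOMING_TEMP, and the column names are assumptions, not the actual schema:

    -- Allow explicit values in INCOMING_TEMP's identity column.
    SET IDENTITY_INSERT INCOMING_TEMP ON

    -- With IDENTITY_INSERT ON, the INSERT must name its columns explicitly,
    -- including the identity column.
    INSERT INTO INCOMING_TEMP (ID, RecordDate, Amount)
    SELECT ID, RecordDate, Amount
    FROM   INCOMING
    WHERE  RecordDate >= DATEADD(month, -6, GETDATE())

    -- Only one table per session can have IDENTITY_INSERT ON, so switch it back off.
    SET IDENTITY_INSERT INCOMING_TEMP OFF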
Watch your transaction log and tempdb... Are you doing this in batches with transactions? A permanent table, I think, would have been better than a temp table.
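If the big insert (or the eventual purge) runs as one giant statement, the transaction log has to hold the whole thing at once. A rough sketch of doing the delete in batches instead, using SQL Server 2000-style SET ROWCOUNT; table and column names are again placeholders, and on SQL Server 2005 or later DELETE TOP (n) would be the usual approach:

    -- Delete old rows 100,000 at a time so each batch commits on its own and
    -- the transaction log can be backed up or truncated between batches.
    SET ROWCOUNT 100000

    DECLARE @rows int
    SET @rows = 1

    WHILE @rows > 0
    BEGIN
        BEGIN TRANSACTION

        DELETE FROM INCOMING
        WHERE  RecordDate < DATEADD(month, -6, GETDATE())

        SET @rows = @@ROWCOUNT

        COMMIT TRANSACTION
    END

    SET ROWCOUNT 0   -- reset so later statements are not limited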