Hello,
I have to change a column's datatype (int to bigint) in a table with over
10 million rows, and I don't have a lot of log space to work with.
What's the best method for achieving this task with the least amount of
logging?
Any help appreciated.
Thanks in advance!
Here are two ideas. Both should be preceded by a full backup IMHO.
Change the database's recovery model to SIMPLE, add a new nullable BIGINT
column, then run an UPDATE in a loop in small batches, issuing a CHECKPOINT
each iteration so the log space can be reused.
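A sketch of the setup, assuming a database named MyDb and the placeholder
table/column names used in the loop below (substitute your own):

ALTER DATABASE MyDb SET RECOVERY SIMPLE;  -- log can now truncate at each checkpoint
ALTER TABLE [table] ADD BigIntColumn bigint NULL;  -- nullable add is a fast metadata-only change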
SET ROWCOUNT 10000;  -- cap each UPDATE at 10,000 rows so transactions stay small
SELECT 'starting...';
DECLARE @rows int;
SET @rows = 1;
WHILE @rows > 0
BEGIN
    UPDATE [table] SET BigIntColumn = IntColumn WHERE BigIntColumn IS NULL;
    SET @rows = @@ROWCOUNT;  -- capture the batch row count before anything resets it
    CHECKPOINT;              -- under SIMPLE recovery, lets the log space be reused
END
SET ROWCOUNT 0;  -- remove the row cap for subsequent statements
SELECT '...finished';
Then you can drop the old column (you will need to drop any constraints,
indexes, schema-bound views, functions, etc. that reference it first) and
rename the new one.
ALTER TABLE [table] DROP COLUMN IntColumn;
EXEC sp_rename '[table].BigIntColumn', 'IntColumn', 'COLUMN';
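For example, if the int column is indexed or has a default, something like
this would have to run first (the index and constraint names here are
hypothetical; look up the real ones in sysindexes/sysobjects or Enterprise
Manager):

DROP INDEX [table].IX_table_IntColumn;  -- hypothetical index name (SQL 2000 syntax; 2005+ uses DROP INDEX ... ON ...)
ALTER TABLE [table] DROP CONSTRAINT DF_table_IntColumn;  -- hypothetical default-constraint name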
To be safe, if you have any views that reference the table, I would
DROP/CREATE them or run sp_refreshview. You didn't say which version of SQL
Server you're using, so there may be other factors or consequences...
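For example (the view name here is hypothetical):

EXEC sp_refreshview 'dbo.SomeViewOnTable';  -- rebinds the view's metadata to the renamed column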
If you can take the table offline for an extended amount of time, you could
build an almost identical table (with the int column changed to bigint) on
another system that has room to duplicate the data, copy the data over to
the new table, drop the existing table, recreate it (with int changed to
bigint), and copy the data back. There are wizards and/or DTS/SSIS for this
task, or you can script it by hand along the lines sketched below.
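A rough sketch of the round trip in plain T-SQL, assuming a linked server
back to the source and entirely hypothetical table/column names:

-- on the other system: pull the data across, widening the column on the way
SELECT CAST(IntColumn AS bigint) AS IntColumn, OtherCol1, OtherCol2
INTO dbo.table_copy
FROM SourceServer.MyDb.dbo.[table];  -- SourceServer is a hypothetical linked server

-- after dropping and recreating the table on the source with bigint:
INSERT INTO SourceServer.MyDb.dbo.[table] (IntColumn, OtherCol1, OtherCol2)
SELECT IntColumn, OtherCol1, OtherCol2
FROM dbo.table_copy;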
"Mark" <Mark@.discussions.microsoft.com> wrote in message
news:16AE2454-32A0-46B6-A0DB-01580391562D@.microsoft.com...
> Hello,
> I have to change a datatype (int to bigint) for a column in a table with
> over 10million rows
> and don't have a lot of log space to deal with.
> What's the best method for achieving this task with the least amount of
> logging?
> Any help appreciated.
> Thanks in advance!
>