Efficient way to move 170 GB table from production to another server
I have a 170 GB table that is no longer referenced.
I want to move that table to another server. I tried these options in staging:
- SELECT INTO of an 8 GB table into another database – 1.47 min
- SSIS package – 6 min
Can anyone suggest the best way to do it? How can I break the table into parts so I can make small packages and upload them?
sql-server sql-server-2008-r2
asked Aug 20 at 8:19 by pravin ghavare; edited Aug 20 at 11:08 by Aaron Bertrand
I would use the SSIS package, as 6 minutes seems reasonable. Specify the fast load option along with rows per batch (empty) and a max insert commit size (100000).
– Dan Guzman
Aug 20 at 10:06
@pravin ghavare Does the table size you mentioned include indexes as well? Would scripting out the values and doing an import be acceptable for you?
– Biju jose
Aug 20 at 10:07
@Biju jose The table doesn't have any indexes. I have tried import/export on an 8 GB table on staging and it takes 10 minutes.
– pravin ghavare
Aug 20 at 10:14
Do you not already have a backup somewhere where this table is in a reasonably current state? It's quite possible you have solutions here that don't involve touching the live production data.
– Aaron Bertrand
Aug 20 at 11:09
What is the real constraint here? Disk space? Network pipe (you mention uploading data)? No backup? If the table is no longer in use, then what is the issue with running a 6-minute SSIS job to migrate the data? I feel like we're missing some information.
– Jacob H
Aug 20 at 12:35
3 Answers
I would use:
1) Backup (copy-only) and restore the database on the same server to make the database available locally, then copy the records into the other database. This is not an option if the database itself is too large.
2) Use bcp out/in. Use queryout if you want to specify the records to be exported yourself (partition on the PK value).
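As an illustration of option 2, here is a rough command-line sketch that exports the table in primary-key ranges with bcp queryout and loads each file on the target server. The server, database, table, and column names and the range boundaries are placeholders, not taken from the question:
REM Export one PK range from production in native format
bcp "SELECT * FROM ProdDb.dbo.BigTable WHERE Id >= 1 AND Id < 50000000" queryout part01.dat -S ProdServer -T -n
REM Load that file into the copy on the target server, committing every 100,000 rows
bcp StagingDb.dbo.BigTable in part01.dat -S StagingServer -T -n -b 100000
REM Repeat with the next key range until the whole table has been copied
bcp "SELECT * FROM ProdDb.dbo.BigTable WHERE Id >= 50000000 AND Id < 100000000" queryout part02.dat -S ProdServer -T -n
Each exported file stays small enough to copy or upload separately, which also addresses the "break the table into parts" part of the question.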
answered Aug 20 at 8:29 by Koen D
my db size is 450 GB
– pravin ghavare
Aug 20 at 8:55
As recommended in the other answer, I would suggest using a backup; maybe you already have a FULL backup from last night ready to be used (if not, I suggest immediately configuring a job to run FULL backups at least weekly). You could run a backup with the COPY_ONLY option, which lets you take a backup without breaking your backup chain.
BACKUP DATABASE your_database
TO DISK = '\your_drive\BAK\your_database_copy_only_backup.bak'
WITH COPY_ONLY;
Once you have that .bak file, you can restore it on your staging environment and do everything you need.
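For reference, a restore on staging might look roughly like the following. The restored database name, file paths, and logical file names are placeholders; check the actual logical names with RESTORE FILELISTONLY first:
RESTORE FILELISTONLY
FROM DISK = '\your_drive\BAK\your_database_copy_only_backup.bak';

RESTORE DATABASE your_database_staging
FROM DISK = '\your_drive\BAK\your_database_copy_only_backup.bak'
WITH MOVE 'your_database_data' TO 'D:\Data\your_database_staging.mdf',
     MOVE 'your_database_log'  TO 'D:\Log\your_database_staging.ldf',
     STATS = 10;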
You commented that your database is 450 GB; why would that be a problem? Don't you have enough space in the staging environment? If that is the case, you will need a somewhat different approach. Maybe you could create a job that moves data from your prod server to the staging server and let it run during hours when the server has low traffic. If the server is not under heavy load, you could have two schedules: one that runs during the day, moving a smaller amount of data, and one at night moving more.
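One way such a job could be scripted, assuming a linked server from staging to production and an increasing integer key on the table (the linked server, database, table, and column names below are placeholders), is to pull the data across in fixed-size key ranges so the job can be stopped and resumed at any point:
DECLARE @ChunkSize int = 100000;
DECLARE @From bigint = 0;
DECLARE @MaxId bigint;

-- Highest key value on the production side
SELECT @MaxId = MAX(Id)
FROM PRODSERVER.ProdDb.dbo.BigTable;

WHILE @From < @MaxId
BEGIN
    -- Copy one key range across the linked server
    INSERT INTO dbo.BigTable_Archive WITH (TABLOCK)
    SELECT *
    FROM PRODSERVER.ProdDb.dbo.BigTable
    WHERE Id > @From AND Id <= @From + @ChunkSize;

    SET @From = @From + @ChunkSize;

    WAITFOR DELAY '00:00:01';   -- short pause between chunks to keep the load down
END
This also gives a direct answer to "how can I break the table into parts": each iteration moves one manageable chunk.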
And I guess that once you have that table available on staging, you will want to delete it from your PROD environment. Check this answer for some advice on how to delete it and recover the space, if needed. The code there refers to UPDATEs, but it's the same for DELETEs; here is a sample that runs the delete in chunks.
WHILE (1 = 1)
BEGIN
    WAITFOR DELAY '00:00:01';
    DELETE TOP (10000)
    FROM [dbo].[myTable]
    WHERE 1 = 1;   -- replace with your condition (or drop the WHERE to delete everything)
    IF @@ROWCOUNT < 1 BREAK;
END
answered Aug 20 at 9:07 by Yaroslav; edited Aug 20 at 10:57
If you need the table from production for test purposes:
- Back up your database WITH COPY_ONLY, COMPRESSION and copy the .bak file to the other server (it must have enough space; it is better to copy the .bak over a share drive).
- Restore the database on the other server (restore it under a new name if needed).
- Drop all of the tables you don't want.
BACKUP DATABASE db_name TO DISK = db_filename WITH COPY_ONLY, COMPRESSION
Back up the database and send that .bak to the test environments.
answered Aug 21 at 22:38 by CR241
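A rough sketch of the last step above, dropping everything except the table you want to keep once the copy has been restored. The database, schema, and table names are placeholders, and the generated statements should be reviewed before running them (tables referenced by foreign keys may need to be dropped in dependency order):
USE your_database_copy;

DECLARE @sql nvarchar(max) = N'';

-- Build DROP TABLE statements for every table except the one you are keeping
SELECT @sql = @sql + N'DROP TABLE ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';' + CHAR(13)
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE NOT (s.name = N'dbo' AND t.name = N'BigTable');

PRINT @sql;                      -- inspect the statements first
-- EXEC sys.sp_executesql @sql;  -- then execute them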