Efficient way to move 170 GB table from production to another server



I have a 170 GB table that is no longer referenced.

I want to move that table to another server. I tried these options in staging:

  • SELECT INTO an 8 GB table in another database - 1.47 min

  • SSIS package - 6 min

Can anyone suggest the best way to do this? How can I break the table into parts so that I can create small packages and upload them?







  • I would use the SSIS package as 6 minutes seems reasonable. Specify the fast load option along with a rows per batch (empty) and max insert commit size (100000).
    – Dan Guzman
    Aug 20 at 10:06










  • @pravin ghavare, does the table size you mentioned include indexes as well? Would scripting out the values and doing an import be acceptable for you?
    – Biju jose
    Aug 20 at 10:07










  • @Biju jose The table doesn't have any indexes. I tried import/export on an 8 GB table on staging and it took 10 minutes.
    – pravin ghavare
    Aug 20 at 10:14






  • Do you not already have a backup somewhere where this table is in a reasonably current state? It's quite possible you have solutions here that don't involve touching the live production data.
    – Aaron Bertrand♦
    Aug 20 at 11:09






  • What is the real constraint here? Disk space? Network pipe (you mention uploading data)? No backup? If the table is no longer in use then what is the issue with running a 6 minute SSIS job to migrate the data? I feel like we're missing some information.
    – Jacob H
    Aug 20 at 12:35
3 Answers
I would use:

1) Backup (COPY_ONLY) and restore the database on the same server as the target database, so the data is available locally, then copy the records into the other database. This is not an option if the database itself is too large.

2) Use bcp out/in. Use queryout if you want to select the records to be exported yourself (e.g., partition on the PK value).
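As a minimal sketch of option 2 (the names BigTable, Id, ProdServer, OtherServer, ProdDb and TargetDb are placeholders, and an increasing integer key is assumed), each chunk could be exported with queryout and loaded on the destination with bcp in:

rem Export one PK range from production in native format (repeat per range, adjusting the key bounds).
bcp "SELECT * FROM ProdDb.dbo.BigTable WHERE Id >= 1 AND Id < 10000000" queryout BigTable_part01.dat -S ProdServer -T -n

rem Load that chunk into the table on the destination server, committing every 100000 rows.
bcp TargetDb.dbo.BigTable in BigTable_part01.dat -S OtherServer -T -n -b 100000

Native format (-n) avoids character conversion overhead and keeps the files compact; the chunk boundaries just need to cover the whole key range without overlapping.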






  • my db size is 450 GB
    – pravin ghavare
    Aug 20 at 8:55






























As recommended in the other answer, I would suggest using a backup. Maybe you already have a FULL backup from last night ready to be used (if not, I suggest immediately configuring a job to run FULL backups at least on a weekly basis). You could run a backup with the COPY_ONLY option, which lets you take a backup without breaking your backup chain.



BACKUP DATABASE your_database
TO DISK = '\your_drive\BAK\your_database_copy_only_backup.bak'
WITH COPY_ONLY;


Once you have that BAK file, you can restore it on your staging environment and do everything you need.



You commented that your database is 450 GB; why could that be a problem? Don't you have enough space on the staging environment? If that is the case, then you will need a somewhat different approach. Maybe you could create a job that moves data from your prod server to the staging server and let it run during hours when your server has low traffic. If the server is not under heavy load, you could have two schedules: one that runs during the day, moving a smaller amount of data, and one at night moving more.
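As a minimal sketch of such a batched job (assuming a linked server named STAGING that points at the staging instance, a target database StagingDb, and an increasing integer key Id; all of these are placeholder names), the rows could be pushed across in fixed-size chunks:

DECLARE @from bigint, @max bigint, @batch bigint = 100000;

-- key range that has to be copied
SELECT @from = MIN(Id) - 1, @max = MAX(Id) FROM dbo.BigTable;

WHILE @from < @max
BEGIN
    -- copy one batch of rows to the staging server over the linked server
    INSERT INTO [STAGING].[StagingDb].[dbo].[BigTable]
    SELECT *
    FROM dbo.BigTable
    WHERE Id > @from AND Id <= @from + @batch;

    SET @from = @from + @batch;

    -- small pause between batches to limit the load on production
    WAITFOR DELAY '00:00:05';
END

Scheduling two Agent jobs with different @batch values (a small one during the day, a larger one at night) matches the two-schedule idea above.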



And I guess that once you have the table available on staging you will want to delete it from your PROD environment. Check this answer for some advice on how to delete it and recover the space, if needed. The code there refers to UPDATEs, but for DELETEs it is the same; here is a sample that runs the delete in chunks.



WHILE (1 = 1)
BEGIN
    WAITFOR DELAY '00:00:01';

    DELETE TOP (10000)
    FROM [dbo].[myTable];
    -- add a WHERE condition here if only part of the table has to go

    IF @@ROWCOUNT < 1 BREAK;
END


















If you need the table from production for test purposes:

  • Back up your database WITH COPY_ONLY, COMPRESSION and copy the .bak file to the other server (there must be enough space, and it is better to copy the .bak over a network share)

  • Restore the DB on the other server (restore it under a new name if needed; see the sketch below)

  • Drop all of the tables you don't want.

BACKUP DATABASE db_name TO DISK = 'db_filename' WITH COPY_ONLY, COMPRESSION;

Back up the database and send that .bak to the test environments.
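As a minimal sketch of the restore step (the database name, file paths, and logical file names below are placeholders; check the actual logical names with RESTORE FILELISTONLY first), restoring under a new name could look like this:

-- list the logical file names contained in the backup
RESTORE FILELISTONLY FROM DISK = 'D:\Backups\db_name.bak';

-- restore under a new name, relocating the data and log files
RESTORE DATABASE db_name_test
FROM DISK = 'D:\Backups\db_name.bak'
WITH MOVE 'db_name' TO 'D:\Data\db_name_test.mdf',
     MOVE 'db_name_log' TO 'D:\Logs\db_name_test.ldf',
     STATS = 10;

After the restore, dropping the unneeded tables leaves only the table you actually want on the test server.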





