How do I move data from RDS of one AWS account to another account

We set up our web services and database on AWS a while back, and the application is now in production. For reasons I won't go into here, we need to terminate the old AWS account and move everything under a newly created account. The application and the rest of the infrastructure are straightforward to move; the data is trickier. The current database still receives a lot of data on a daily basis, so it is best to migrate the data after we turn off the old application and switch on the new platform.



Both the source and target RDS instances are Postgres, and we have about 40 GB of data to transfer. There are three approaches I can think of, and they all have drawbacks.



  1. Take a snapshot of the first RDS instance and restore it into the second one. The problem is that I don't need to transfer all of the data from source to destination; records after 10/01 are probably enough. Also, a snapshot works best when restored into a freshly created, empty RDS instance, whereas in our case the new RDS instance will already be receiving data after the cutoff, and only after that will the old data be transferred from the old account to the new one, otherwise we would lose data. (A rough CLI sketch for sharing a snapshot across accounts is shown after this list.)

  2. Dump the data from the tables in the old RDS instance and restore it into the new one. This has the same problem as #1. Also, if I dump the data to my local machine and restore from there, the network speed becomes the bottleneck. (A sketch of piping a dump directly between the two instances, without a local hop, is also shown below.)

  3. Export table data to CSV files and import them into the new RDS instance. The advantage is that this method lets me pick and choose, and do some data cleaning as well. But it takes forever to export a big fact table to a local CSV file. Another problem is that, for some of the tables, I have surrogate row IDs which are serial (auto-incrementing), so the row IDs in the exported CSV may conflict with existing data in the new RDS tables. (The sketch below also shows a date-filtered CSV copy that avoids a local file.)

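For reference, the snapshot route in #1 could presumably be scripted with the AWS CLI along these lines. The instance and snapshot identifiers, region and account IDs below are placeholders, and an encrypted snapshot would additionally need a customer-managed KMS key shared with the target account:

    # In the OLD account: snapshot the source instance and share it with the new account
    aws rds create-db-snapshot \
        --db-instance-identifier old-prod-db \
        --db-snapshot-identifier migration-snapshot
    aws rds modify-db-snapshot-attribute \
        --db-snapshot-identifier migration-snapshot \
        --attribute-name restore \
        --values-to-add 222222222222    # target account ID (placeholder)

    # In the NEW account: restore the shared snapshot, referenced by its ARN
    aws rds restore-db-instance-from-db-snapshot \
        --db-instance-identifier new-prod-db-from-snapshot \
        --db-snapshot-identifier arn:aws:rds:us-east-1:111111111111:snapshot:migration-snapshot

The catch, as noted above, is that this restores into a brand-new instance, so it does not help with merging data into a target database that is already receiving writes.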
I wonder if there is a better way to do this, maybe some ETL tool from AWS that does a point-to-point direct transfer without using a local computer as the middle hop.
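Short of a managed service, #2 and #3 can at least skip the local machine by running the copy from a small EC2 instance (in either account) that can reach both RDS endpoints. A rough sketch, where the hostnames, database and user names, table names, the created_at column and the 2018-10-01 cutoff are all placeholders for illustration (credentials via .pgpass or PGPASSWORD):

    # Whole tables: stream a data-only dump straight into the new instance
    pg_dump -h old-db.xxxxxxxx.us-east-1.rds.amazonaws.com -U appuser -d appdb \
            --data-only --table=small_dim_table \
      | psql -h new-db.yyyyyyyy.us-east-1.rds.amazonaws.com -U appuser -d appdb

    # Big fact table: copy only rows after the cutoff, as CSV, server to server
    psql -h old-db.xxxxxxxx.us-east-1.rds.amazonaws.com -U appuser -d appdb \
         -c "\copy (SELECT * FROM fact_table WHERE created_at >= '2018-10-01') TO STDOUT WITH CSV" \
      | psql -h new-db.yyyyyyyy.us-east-1.rds.amazonaws.com -U appuser -d appdb \
         -c "\copy fact_table FROM STDIN WITH CSV"

This still moves the data row by row, but nothing is staged on a local disk and the EC2-to-RDS network path is much faster than a workstation connection.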










postgresql amazon-web-services etl amazon-rds data-migration






asked Nov 13 '18 at 22:22 by ddd







  • jarmod (Nov 13 '18 at 22:41): 40GB doesn't seem like a lot of data, but take a look at Database Migration Service. It can do homogeneous migrations: aws.amazon.com/dms Not sure how easy it would be to filter rows by date though.

  • ddd (Nov 14 '18 at 20:04): @jarmod I tried Database Migration Service. It works well for copying data from a source table into an empty target table, but if the target table already has records, the transfer task fails because of conflicting row IDs. As I mentioned, the row IDs are auto-generated incrementally from a sequence. Is there a way to work around this?
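One possible workaround for the sequence clash described in the last comment (an assumption, not something confirmed in this thread): before the new database goes live, bump its serial sequences well above the old database's current maximum IDs, so rows written by the new application never collide with the IDs that will later be migrated. The table and column names below are placeholders:

    # On the OLD database: find the high-water mark of the serial column
    psql -h old-db.xxxxxxxx.us-east-1.rds.amazonaws.com -U appuser -d appdb \
         -c "SELECT max(id) FROM orders"    # say this returns 1250000

    # On the NEW database, before the application starts writing to it:
    # leave headroom above the old max so migrated rows keep their original IDs
    psql -h new-db.yyyyyyyy.us-east-1.rds.amazonaws.com -U appuser -d appdb \
         -c "SELECT setval(pg_get_serial_sequence('orders', 'id'), 2000000)"

If the new database is already live with low IDs, the alternative is to offset or regenerate the IDs of the migrated rows during the copy, and fix up any foreign keys that reference them.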











