{"id":260,"date":"2024-08-29T15:41:20","date_gmt":"2024-08-29T14:41:20","guid":{"rendered":"https:\/\/livingdevops.com\/?p=260"},"modified":"2025-07-05T10:36:27","modified_gmt":"2025-07-05T09:36:27","slug":"migrating-rds-postgres-database-with-python","status":"publish","type":"post","link":"https:\/\/livingdevops.com\/devops\/migrating-rds-postgres-database-with-python\/","title":{"rendered":"A complete guide to migrating the RDS Postgres database with\u00a0Python"},"content":{"rendered":"\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<pre class=\"wp-block-preformatted\">Python Automation That Saved Our Client $1000\/Month On Cloud&nbsp;Bills<\/pre>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/0*kjKYMcIvRhdMz4Yd\" alt=\"\"\/><figcaption class=\"wp-element-caption\">Photo by <a href=\"https:\/\/unsplash.com\/@sharonmccutcheon?utm_source=medium&amp;utm_medium=referral\" rel=\"noreferrer noopener\" target=\"_blank\">Alexander Grey<\/a> on&nbsp;<a href=\"https:\/\/unsplash.com?utm_source=medium&amp;utm_medium=referral\" rel=\"noreferrer noopener\" target=\"_blank\">Unsplash<\/a><\/figcaption><\/figure>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>I was working on a project to reduce cloud costs across multiple AWS accounts. We discovered many RDS Postgres databases were over-provisioned, wasting terabytes of storage.<\/p>\n<\/blockquote>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>AWS doesn\u2019t let you simply reduce allocated RDS storage. You can\u2019t even restore from a snapshot with less space. To fix this, you need to create a new RDS instance with less storage and restore the database there. 
It\u2019s not straightforward, but it was necessary for over 100 databases.<\/p>\n<\/blockquote>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>To solve this, I wrote a Python script to automate the process. I packaged it as a Docker container, deployed it to ECS, and used a Lambda function to trigger it. This approach allowed us to efficiently right-size our databases and significantly cut down on unnecessary cloud spending.<\/p>\n<\/blockquote>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\">Automating the AWS RDS Postgres database migration with&nbsp;boto3<\/h3>\n\n\n\n<p>Let me show you how to implement a similar solution in your environment. 
I will use the AWS Python SDK, boto3, to implement the automation.<\/p>\n\n\n\n<p>As part of the database migration, I will follow the steps below.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Use boto3 to pull the RDS instance details and its storage utilization.<\/li>\n\n\n\n<li>Use the Postgres utility pg_dump to take a backup of the database.<\/li>\n\n\n\n<li>Duplicate the RDS instance with a new storage size, and name it with a <code>-new<\/code> suffix.<\/li>\n\n\n\n<li>Wait for the new database to become available.<\/li>\n\n\n\n<li>Restore the new database from the dump taken in the previous step with the pg_restore utility.<\/li>\n\n\n\n<li>Swap the original database and the duplicate by renaming them.<\/li>\n\n\n\n<li>Stop the old DB instance and manually delete it after a few days.<\/li>\n<\/ul>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\">Generating a database usage&nbsp;report<\/h3>\n\n\n\n<p>I have created a script and placed it in my <a href=\"https:\/\/github.com\/akhileshmishrabiz\/python-for-devops\/blob\/main\/python-automations-projects\/aws-rds-postgres-migration\/rds_storage_report.py\" rel=\"noreferrer noopener\" target=\"_blank\">GitHub repo<\/a>. 
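At its core, the report compares each instance\u2019s AllocatedStorage value (in GB, from describe_db_instances) with CloudWatch\u2019s FreeStorageSpace metric (in bytes). A minimal sketch of just that calculation; the helper name is mine, not the script\u2019s:

```python
def storage_utilization(allocated_gb: float, free_storage_bytes: float) -> dict:
    # AllocatedStorage comes back in GB from describe_db_instances,
    # while CloudWatch's FreeStorageSpace metric is reported in bytes.
    free_gb = free_storage_bytes / (1024 ** 3)
    used_gb = allocated_gb - free_gb
    return {
        "allocated_gb": allocated_gb,
        "used_gb": round(used_gb, 2),
        "used_percent": round(used_gb / allocated_gb * 100, 2),
    }

# Example: 50 GB allocated with ~40 GB free -> only 20% used
report = storage_utilization(50, 40 * 1024 ** 3)
```

An instance with a low used_percent like this is a candidate for the migration described below.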
Let me create three Postgres database instances in my AWS account by running AWS CLI commands.<\/p>\n\n\n\n<p>I will use an AWS EC2 Linux VM to run the scripts for this blog post.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># To install the AWS CLI on a Linux machine, follow these steps<br>curl \"https:\/\/awscli.amazonaws.com\/awscli-exe-linux-x86_64.zip\" -o \"awscliv2.zip\"<br>unzip awscliv2.zip<br>sudo .\/aws\/install<\/code><\/pre>\n\n\n\n<pre class=\"wp-block-code\"><code># Create 3 RDS Postgres instances.\naws rds create-db-instance \\\n--db-instance-identifier mydbinstance-1 \\\n--db-instance-class db.t3.micro \\\n--engine postgres \\\n--allocated-storage 50 \\\n--master-username mymasteruser \\\n--master-user-password mypassword \\\n--backup-retention-period 7\n\naws rds create-db-instance \\\n--db-instance-identifier mydbinstance-2 \\\n--db-instance-class db.t3.micro \\\n--engine postgres \\\n--allocated-storage 70 \\\n--master-username mymasteruser \\\n--master-user-password mypassword \\\n--backup-retention-period 7\n\naws rds create-db-instance \\\n--db-instance-identifier mydbinstance-3 \\\n--db-instance-class db.t3.micro \\\n--engine postgres \\\n--allocated-storage 60 \\\n--master-username mymasteruser \\\n--master-user-password mypassword \\\n--backup-retention-period 7<\/code><\/pre>\n\n\n\n<p>Wait for them to come up and then run the script.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>git clone https:\/\/github.com\/akhileshmishrabiz\/python-for-devops\ncd python-automations-projects\/aws-rds-postgres-migration\/\npip install -r requirements.txt\n\n# Run the script\npython3 rds_storage_report.py<\/code><\/pre>\n\n\n\n<p>This will generate an Excel report that will look something like this.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*t5B9O_nr8aViCzQr2CozTw.png\" alt=\"\"\/><\/figure>\n\n\n\n<p>This report shows the used and allocated disk space, and it will help you 
decide if you can save some money by cutting down the storage.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Cleanup<\/h4>\n\n\n\n<pre class=\"wp-block-code\"><code>aws rds delete-db-instance \\\n--db-instance-identifier mydbinstance-1 \\\n--skip-final-snapshot\n\naws rds delete-db-instance \\\n--db-instance-identifier mydbinstance-2 \\\n--skip-final-snapshot\n\naws rds delete-db-instance \\\n--db-instance-identifier mydbinstance-3 \\\n--skip-final-snapshot<\/code><\/pre>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\">Migrating Postgres database to reduce cloud&nbsp;cost<\/h3>\n\n\n\n<p>I created the RDS Postgres database instance in the last blog post, <a href=\"https:\/\/medium.com\/aws-tip\/devops-zero-to-hero-7-deploying-containers-with-aws-ecs-510d1f78fb21\" target=\"_blank\" rel=\"noreferrer noopener\">Deploying Containers with AWS ECS.<\/a> I will use the same RDS instance for this blog post.<\/p>\n\n\n\n<p>I stored the DB link for each database in a Parameter Store parameter.<strong> I use the <\/strong><code><strong>\/database\/&lt;DB instance ID&gt;<\/strong><\/code> format to name the DB link parameter. DB links can be used to access the database. 
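For illustration, a DB link in the postgresql:\/\/user:password@host:port\/dbname form can be assembled and split back into the pieces pg_dump and pg_restore need using only the standard library. This is a sketch; the helper names and the endpoint value are mine, not the script\u2019s:

```python
from urllib.parse import quote, urlparse

def build_db_link(user, password, host, port, dbname):
    # URL-quote the password in case it contains special characters.
    return f"postgresql://{user}:{quote(password, safe='')}@{host}:{port}/{dbname}"

def parse_db_link(link):
    # Split the link back into the pieces pg_dump/pg_restore need.
    p = urlparse(link)
    return {"user": p.username, "password": p.password,
            "host": p.hostname, "port": p.port,
            "dbname": p.path.lstrip("/")}

# 'mydb.example.eu-west-1.rds.amazonaws.com' is a placeholder endpoint
link = build_db_link("mymasteruser", "mypassword",
                     "mydb.example.eu-west-1.rds.amazonaws.com", 5432, "wordpress")
```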
I will create the DB link using the database details below.<\/p>\n\n\n\n<p><code><strong>postgresql:\/\/{db_user}:{db_password}@{db_host_endpoint}:{db_port}\/{database_name}<\/strong><\/code><\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p>Let\u2019s start building the automation, one function at a time. I will explain each step.<\/p>\n\n\n\n<p>Note: <a href=\"https:\/\/github.com\/akhileshmishrabiz\/python-for-devops\/tree\/main\/python-automations-projects\/aws-rds-postgres-migration\/migrtate-with-pg-dump-restore\" rel=\"noreferrer noopener\" target=\"_blank\">I have placed the code used in this blog post in my GitHub repo <\/a><code><a href=\"https:\/\/github.com\/akhileshmishrabiz\/python-for-devops\/tree\/main\/python-automations-projects\/aws-rds-postgres-migration\/migrtate-with-pg-dump-restore\" rel=\"noreferrer noopener\" target=\"_blank\">python-for-devops<\/a><\/code>.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">1. Get RDS details of the original DB&nbsp;instance<\/h4>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*3whf50kougqWbgE57T06iw.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">2. 
Get the storage utilization for the DB from CloudWatch monitoring<\/h4>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*j7lu7ulSTCfqojfpqwLkNw.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">3. Compare the allocated and free disk&nbsp;space<\/h4>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*GamCWKfNfekF58TG3idXJg.png\" alt=\"\"\/><\/figure>\n\n\n\n<p>Since we stored the DB user, password, and other details in the Parameter Store, we will pull that data from there.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">4. DB details from the parameter store<\/h4>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*WmkmJS3nYPYDXOa2z-Kvtg.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">5. Duplicate the RDS instance with the new storage&nbsp;value<\/h4>\n\n\n\n<p>This function will duplicate the RDS instance with the new value for allocated disk space. The rest of the values will be the same as the original RDS instance.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p><em>Note: If you are using gp2 disks and want to migrate to gp3, you can update that value in the <\/em><code><em>StorageType<\/em><\/code><em> key.<\/em><\/p>\n<\/blockquote>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*AJ3AjtX5lX9d2kd10bshnQ.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">6. 
Check if RDS is&nbsp;ready<\/h4>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*1maH6XsoGhsYErjor_XnJg.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">7. Take the backup of the Postgres database with the pg_dump&nbsp;utility.<\/h4>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*sydYi8-S_sRg_QIhhinqHQ.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">8. Restore the new RDS instance from the backup we took in the last&nbsp;step.<\/h4>\n\n\n\n<p><em>In this example, I restored from a backup on local storage.<\/em> <em>If your requirements call for it, you can add logic to upload the backup to an S3 bucket after taking it, and download it from S3 before restoring.<\/em><\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*mOcr0YRTzpR8HnvZXlBQAw.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">9. Rename the RDS&nbsp;instance<\/h4>\n\n\n\n<p>This function will rename the database. 
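Together with the swap in the next step, the whole operation boils down to two renames. A sketch of just that naming logic, assuming the -new suffix convention from earlier; each (old, new) pair would map to one modify_db_instance call with NewDBInstanceIdentifier, followed by a wait:

```python
def swap_plan(instance_id):
    # Renames needed to promote <id>-new to <id>, keeping the
    # original around as <id>-old until it is safe to delete.
    return [
        (instance_id, f"{instance_id}-old"),   # move the original aside
        (f"{instance_id}-new", instance_id),   # promote the duplicate
    ]

# Each (old, new) pair would become one boto3 call such as:
# rds.modify_db_instance(DBInstanceIdentifier=old,
#                        NewDBInstanceIdentifier=new,
#                        ApplyImmediately=True)
plan = swap_plan("wordpress")
```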
I used sleep from the time module to wait instead of the <a href=\"https:\/\/github.com\/boto\/boto3\/issues\/609\" rel=\"noreferrer noopener\" target=\"_blank\">boto3 waiter, which is broken<\/a>.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*hsspdb5upiChKPicucKWzQ.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">10. Swap the RDS instance.<\/h4>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*1Jbl7ZWsiwTvHQ7JkZw7Hw.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">11. Stop the RDS&nbsp;instance<\/h4>\n\n\n\n<p>This function will stop the original RDS instance; from then on, we will use the new one.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*YZcV9P1E4PaurePhUFKyZw.png\" alt=\"\"\/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Calling all the above functions to migrate the Postgres RDS instance<\/h3>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*1eDWZPbiwutYt8XryVELNg.png\" alt=\"\"\/><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\">Running the script from the EC2&nbsp;machine<\/h3>\n\n\n\n<p>Since I used an RDS instance with a private endpoint (i.e., 
no public IP accessible from outside the network), I launched an EC2 Ubuntu machine in the same VPC as my RDS instance.<\/p>\n\n\n\n<p>As my script uses the <code>pg_dump<\/code> and <code>pg_restore<\/code> commands for backup and restore, I also need to install the Postgres client and other dependencies.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\">Installing dependencies on the Ubuntu&nbsp;machine.<\/h4>\n\n\n\n<pre class=\"wp-block-code\"><code>sudo apt-get update<br>sudo apt-get install wget ca-certificates<br>wget --quiet -O - https:\/\/www.postgresql.org\/media\/keys\/ACCC4CF8.asc | sudo apt-key add -<br>sudo sh -c 'echo \"deb http:\/\/apt.postgresql.org\/pub\/repos\/apt $(lsb_release -cs)-pgdg main\" &gt; \/etc\/apt\/sources.list.d\/pgdg.list'<br>sudo apt-get update<br>sudo apt-get install postgresql-client-16<br>pg_dump --version<br>psql --version<br><br># install pip<br>sudo apt install python3-pip<br><br># install aws cli<br>sudo apt install awscli -y<\/code><\/pre>\n\n\n\n<ul class=\"wp-block-list\">\n<li>The security group attached to the Ubuntu machine allows inbound\/outbound connections to RDS on the DB port (5432).<\/li>\n\n\n\n<li>The security group attached to RDS allows inbound connections on the DB port (5432).<\/li>\n\n\n\n<li>The IAM role attached to the EC2 instance allows the relevant permissions. 
You can attach the admin role if you are testing it.<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Instructions to run the&nbsp;script<\/h3>\n\n\n\n<h4 class=\"wp-block-heading\">Copy the script to your&nbsp;machine.<\/h4>\n\n\n\n<pre class=\"wp-block-code\"><code>git clone https:\/\/github.com\/akhileshmishrabiz\/python-for-devops<br>cd python-automations-projects\/aws-rds-postgres-migration\/migrtate-with-pg-dump-restore<br>ls<\/code><\/pre>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*3DiZE2nCl_kF0idwJmCvGw.png\" alt=\"\"\/><\/figure>\n\n\n\n<h4 class=\"wp-block-heading\">Install Python dependencies<\/h4>\n\n\n\n<pre class=\"wp-block-code\"><code>pip install -r requirements.txt<\/code><\/pre>\n\n\n\n<h4 class=\"wp-block-heading\">Run the&nbsp;script<\/h4>\n\n\n\n<pre class=\"wp-block-code\"><code># Evaluate the status of 'my-rds-instance-name'<br>python main.py evaluate &lt;my-rds-instance-name&gt;<br><br># Migrate 'my-rds-instance-name' to a new storage size (in GB)<br>python main.py migrate &lt;my-rds-instance-name&gt; &lt;allocated-storage&gt;<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Final results<\/h3>\n\n\n\n<p>First, calculate the storage usage of the RDS instance.<\/p>\n\n\n\n<p>You can get the complete script from my GitHub repo.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>python3 main.py evaluate wordpress<\/code><\/pre>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" 
src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*r2fL7ayDT1xoCow25-X_Fg.png\" alt=\"\"\/><\/figure>\n\n\n\n<p>As you can see most of the allocated storage is not used for this instance. I will reduce the RDS instance to half its size, 10 GB.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>python3 main.py migrate wordpress 10<\/code><\/pre>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*tI75DcgnqcrhqIbku5pxFw.png\" alt=\"\"\/><\/figure>\n\n\n\n<p>Here is the RDS console output<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*tVEL5f97FJhXk7-dL1zsww.png\" alt=\"\"\/><\/figure>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/cdn-images-1.medium.com\/max\/1000\/1*3Dd97lPljtN1IqFz2TbDwA.png\" alt=\"\"\/><\/figure>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p><a href=\"https:\/\/github.com\/akhileshmishrabiz\/python-for-devops\/tree\/main\/python-automations-projects\/aws-rds-postgres-migration\/migrtate-with-pg-dump-restore\" rel=\"noreferrer noopener\" target=\"_blank\">I have placed the complete code in my public GitHub repo <\/a><code><a href=\"https:\/\/github.com\/akhileshmishrabiz\/python-for-devops\/tree\/main\/python-automations-projects\/aws-rds-postgres-migration\/migrtate-with-pg-dump-restore\" rel=\"noreferrer noopener\" target=\"_blank\">python-for-devops.<\/a><\/code><\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Pitfalls<\/h3>\n\n\n\n<p>While testing my database migration automation I found that <code>Pg_dump<\/code> and <code>Pg_restore<\/code> might take significantly longer for large database migrations. My lead suggested I use, a great tool for live migration for Postgres databases. 
It was faster because it did not do a backup and restore; instead, it synced the two databases directly.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Final words<\/h3>\n\n\n\n<p>I started with <code>pg_dump\/pg_restore<\/code> but ended up not using it for my use case, as this approach was slow (for larger databases) and required downtime. I used <a href=\"https:\/\/livingdevops.com\/devops\/migrate-postgres-database-with-zero-downtime-using-pgsync\/\">pgsync<\/a>, a great tool for live database migrations.<\/p>\n\n\n\n<p>I packed the script as a Docker image, used ECS to deploy the solution, and used Lambda to trigger the migration. I will write about it in a future blog post.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<h3 class=\"wp-block-heading\">About me<\/h3>\n\n\n\n<p>I am Akhilesh Mishra, a self-taught DevOps engineer with 11+ years of experience working on private and public cloud (GCP &amp; AWS) technologies.<\/p>\n\n\n\n<p>I also mentor DevOps aspirants on their journey by providing guided learning and mentorship.<\/p>\n\n\n\n<p><a href=\"https:\/\/topmate.io\/akhilesh_mishra\/\" target=\"_blank\" rel=\"noopener\">Mentorship with Akhilesh on Topmate<\/a><br><a href=\"https:\/\/preplaced.in\/profile\/akhilesh-mishra\" target=\"_blank\" rel=\"noopener\">Long-term mentorship with Akhilesh on Preplaced<\/a><\/p>\n\n\n\n<p>Connect with me on LinkedIn: <a href=\"https:\/\/www.linkedin.com\/in\/akhilesh-mishra-0ab886124\/\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/www.linkedin.com\/in\/akhilesh-mishra-0ab886124\/<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Python Automation That Saved Our Client $1000\/Month On Cloud&nbsp;Bills I was working on a project to reduce cloud costs across multiple AWS accounts. We discovered many RDS Postgres databases were over-provisioned, wasting terabytes of storage. AWS doesn\u2019t let you simply reduce allocated RDS storage. You can\u2019t even restore from a snapshot with less space. 
To [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":261,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4,7,12],"tags":[],"class_list":["post-260","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-devops","category-python","category-tutorials"],"blocksy_meta":[],"_links":{"self":[{"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/posts\/260","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/comments?post=260"}],"version-history":[{"count":5,"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/posts\/260\/revisions"}],"predecessor-version":[{"id":1016,"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/posts\/260\/revisions\/1016"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/media\/261"}],"wp:attachment":[{"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/media?parent=260"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/categories?post=260"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/livingdevops.com\/wp-json\/wp\/v2\/tags?post=260"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}