Moving from AWS to Render
How we moved a Rails application from AWS to Render, from code updates to DNS changes
We recently worked with a client who wanted to move their Ruby on Rails application from AWS to Render. In this post I will cover everything we did to get set up on Render, move everything across, and wind things down on AWS.
As with most things technical, there is always a hefty amount of “it depends” when making decisions. What I describe in this post is just how we did it with the setup we had and the context we were working in. You might have a slightly different setup or context, but hopefully this will serve as a baseline.
Why move from AWS to Render
Anyone who has worked in startups will know the pain of being enticed onto AWS with free credits, only to realise that free isn’t really free.
AWS can be excellent value for money, if you have the team in place to properly set up and manage everything. Most companies don’t have access to such a team.
When your needs are straightforward, you don’t need something as complex as AWS; any number of hosting companies will do what you need.
For Ruby on Rails projects there are a few really good options. It just so happened that the client already had something on Render, so the decision was fairly straightforward.
Our AWS setup
I wanted to share our AWS setup; if yours is wildly different then this article might not be of much use to you.
- Ruby on Rails API connecting to a PostgreSQL database
- Elastic Container Service (ECS) running containers on EC2 instances
- Elastic Load Balancers to route traffic across the EC2 instances
- CodeBuild and CodeDeploy to handle deployments
- Docker builds stored in Elastic Container Registry
- RDS to store the PostgreSQL database
There was no Redis setup and we didn’t need any job server.
Email and asset management were also being handled by AWS; we decided those were best kept on AWS, and we won’t touch on them in this article since nothing has changed.
Create a new project in Render
First things first, in Render we needed to create a new project. A project is a grouping of similar services, so don’t worry if you don’t know the best name or if things will move later.
Within our new project we created two environments, staging and production. This replicated what we had on AWS.
Set up secrets
Render allows you to create environment groups, which let you share secrets between services.
I love this feature. It is handy for Rails projects when you want your web server and your jobs server to have mostly the same environment variables.
As it happens this project didn’t need a second type of server, but we set it up anyway because it is easier to do at the start than retrofit later if needs change.
Render lets you bulk upload environment values in the format
KEY=VALUE
KEY=VALUE
We got the secrets from AWS via Elastic Container Service, by selecting our service and opening its task definition, which includes a JSON config that was easy to copy and paste from.
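If you have the AWS CLI and jq to hand, that copy and paste can also be scripted. This is just a sketch: the task definition name is a placeholder, and values referenced from Secrets Manager or SSM live under the task definition’s secrets block rather than environment, so they won’t show up here.

```bash
# Print the task definition's plain environment values as KEY=VALUE lines,
# ready to paste into a Render environment group.
# "my-app-task" is a placeholder for your own task definition family name.
aws ecs describe-task-definition \
  --task-definition my-app-task \
  --query 'taskDefinition.containerDefinitions[0].environment' \
  | jq -r '.[] | "\(.name)=\(.value)"'
```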
Set up auto deploys
By hooking Render up to GitHub we get automatic deploys, which we wanted for both environments.
When setting up GitHub integrations, always use an account that will be part of the project for the long term, and document who has access so that when access is removed you know what needs updating.
In this case, because we were a contractor and not a long-term member of staff, we worked with the team to get a login we could temporarily use to hook everything up.
Moving the database
We created a new PostgreSQL service on Render for both staging and production; the plan you pick will depend on your project’s needs. It is good to know that whilst it is easy to upgrade your database, it isn’t so easy to downgrade, so if you are on the fence, pick the smaller option.
How you handle the database migration will very much depend on the criticality of your setup. The ideal is to have downtime where you can properly back up and restore the database to its new home.
We were fortunate that this project allowed for downtime, so we could properly stop, back up, and restore the database.
The best mechanism for doing the data move is the one you’re most familiar with. I normally use pg_dump and pg_restore, but for this specific project I was having an issue with those commands locally (Homebrew was having a moment), so I ended up using the tooling built into pgAdmin 4.
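For reference, a minimal pg_dump and pg_restore round trip looks something like the following. The connection string variables are placeholders for your RDS and Render URLs, and the exact flags you want may differ.

```bash
# Dump the old database in custom format, then restore it into the new
# Render database. Run this during your downtime window.
pg_dump --format=custom --no-owner --no-acl \
  --file=app_production.dump "$OLD_RDS_DATABASE_URL"

pg_restore --no-owner --no-acl \
  --dbname="$NEW_RENDER_DATABASE_URL" app_production.dump
```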
We always recommend spot checking the newly restored database, looking at tables, indexes, and anything important to your project, before you hook it up to your Rails server.
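A few psql one-liners cover most of that spot checking. The connection string and the users table here are placeholders, so swap in whatever matters for your project.

```bash
# List tables and indexes, then sanity check a row count on an important table.
psql "$NEW_RENDER_DATABASE_URL" -c '\dt'
psql "$NEW_RENDER_DATABASE_URL" -c '\di'
psql "$NEW_RENDER_DATABASE_URL" -c 'SELECT COUNT(*) FROM users;'
```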
Once happy with the databases, we took their internal URLs and replaced the DATABASE_URL environment variable in each environment with the new value.
Set up pre-deploy command to run migrations
In the settings of both environments on Render, we set up a pre-deploy command to run rails db:migrate. I know some folk prefer to run their migrations manually; I’ve never found that to be a benefit, especially not on this particular project.
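The command itself is nothing fancy; this is what goes in the pre-deploy field so migrations run before the new version goes live.

```bash
# Pre-deploy command configured in the Render service settings.
bundle exec rails db:migrate
```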
Health check
We added a health check endpoint for both staging and production.
If Render detects that the service has been failing its health check for more than a certain amount of time, it will restart it. On AWS, this was something we had set up manually.
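It is worth hitting the endpoint yourself before relying on it. The hostname below is a placeholder, and the /up path assumes the default health check route that newer Rails versions ship with, so use whatever path your app actually exposes.

```bash
# Expect an HTTP 200 once the app has booted and can serve requests.
curl -i https://your-app.onrender.com/up
```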
DNS
At this stage we had Rails running and connecting to our new database. We could test this using Render’s supplied URLs, but to do proper tests we needed to update DNS so that the staging (and later, production) URLs pointed to Render instead of AWS.
To help speed things along, we purged some public DNS caches.
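You can also watch propagation by querying public resolvers directly; the hostname here is a placeholder.

```bash
# Check what a couple of public resolvers currently return for the record.
dig +short staging.example.com CNAME @1.1.1.1
dig +short staging.example.com CNAME @8.8.8.8
```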
Tidying up the codebase
There was nothing Render-specific we needed to add to the codebase; however, there was some AWS-specific code we could remove.
- Docker-related setup and documentation
- CodeDeploy and CodeBuild related code and documentation
- README references to AWS
Tidying up AWS
Once everything is rolled out and tested on Render, it is time to tidy up AWS.
Here is a list of things we were able to delete; again, this will depend entirely on your setup. Before removing anything, it is worth confirming nothing still depends on these resources, as sketched after the list.
- CodeDeploy setup
- CodeBuild setup
- Docker images in Elastic Container Registry
- Databases on RDS
- Services on Elastic Container Service
- Load balancers on EC2
- Servers on EC2
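If you want a little extra reassurance before deleting, a few read-only AWS CLI calls will show what is still registered. These are only examples; the cluster and repository names are placeholders, and we mostly did this through the console.

```bash
# Confirm what is still running or stored before tearing it down.
# "my-cluster" and "my-app" are placeholders for your own resource names.
aws ecs list-services --cluster my-cluster
aws ecr list-images --repository-name my-app
aws rds describe-db-instances --query 'DBInstances[].DBInstanceIdentifier'
```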