This time we have a real example: a B2B app with massive DAU (daily active users), a 1 TB production database, and performance issues. Users were complaining about the app's speed and response times. Delays were blocking their daily workflow, and some of them left for competitors.
The app is built with Ruby on Rails (RoR) and runs on Heroku.
After fine-tuning Heroku dynos for concurrency and for web and background job performance, we hit another bottleneck: database performance.
The Heroku 500-connection limit
The most important limitation of Heroku's PostgreSQL database is the connection limit: no more than 500 connections. In real life we get even fewer than that because of the superuser_reserved_connections setting.
As the PostgreSQL documentation puts it:

> Whenever the number of active concurrent connections is at least max_connections minus superuser_reserved_connections, new connections will be accepted only for superusers, and no new replication connections will be accepted.
To sum up: no matter how many dynos we created, once the app reached Heroku's 500-connection database limit, some user somewhere had an issue. Keep in mind that all of your background jobs count toward that limit too.
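To see how quickly dynos eat into that limit, here is a back-of-the-envelope sketch. Every Puma thread and every Sidekiq thread can hold one connection from ActiveRecord's pool; the dyno counts and concurrency settings below are purely illustrative, not our production values.

```ruby
# Every Puma thread and every Sidekiq thread can hold one DB connection.
# All numbers below are illustrative, not the app's real configuration.
web_dynos       = 20   # heroku ps:scale web=20
puma_workers    = 2    # Puma processes per web dyno
puma_threads    = 5    # threads per Puma worker
worker_dynos    = 4    # heroku ps:scale worker=4
sidekiq_threads = 25   # Sidekiq concurrency per worker dyno

web_connections    = web_dynos * puma_workers * puma_threads  # 200
worker_connections = worker_dynos * sidekiq_threads           # 100

total = web_connections + worker_connections
puts "#{total} of ~485 usable connections"
```

With numbers like these, doubling the web dynos alone would already blow past the limit.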
Make Hard Decisions Bravely
Because we can't change the connection limit on any of Heroku's PostgreSQL database plans, we decided to migrate the database to AWS RDS and tune it to handle a massive number of connections.
Heroku and AWS
On Heroku, everything is done for you, though with some configuration you can't change. Keep in mind that Heroku itself is built on top of AWS services.
AWS is Infrastructure as a Service (IaaS), as opposed to Heroku's PaaS, and requires more know-how to set up and run. Amazon Web Services provides plenty of services for our applications, with the freedom of your own configuration.
AWS EC2 is a flexible IaaS service. Still, before you can deploy your app on EC2, you need to create a server infrastructure that fits your project. Put simply, your team has to manually set up and maintain the virtual servers that run the app, add database instances, and choose and configure an OS.
If you don't have the time or knowledge for a full migration to AWS, you can always migrate just your database server there. Unfortunately, there is no one-click UI for this; however, it's not that complicated.
How to migrate a database from Heroku to AWS
Tests on a staging environment with a production dump showed us that the app works great on an AWS RDS instance.
Most of our B2B app's users work in US time zones, so we picked a Friday night to migrate the production database.
Steps we took during the migration:
- Shut down all Heroku web dynos;
- Wait for all background jobs to finish, then stop the background dynos;
- Make a full database dump and save it to S3;
- Load the dump from S3 into the RDS instance;
- Change the DB host on the instances to the RDS endpoint;
- Run the app;
- Do post-deploy tests.
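The steps above can be sketched as a dry-run script. It only prints the commands rather than executing them; the app name, RDS hostname, and credentials are placeholders, not our real values.

```ruby
# Dry-run sketch of the cutover. App name, RDS host and credentials are
# placeholders. Replace `puts` with `system` (and check exit codes) to run.
APP = "my-app"
RDS = "mydb.xxxxxxxx.us-east-1.rds.amazonaws.com"

steps = [
  "heroku ps:scale web=0 -a #{APP}",              # 1. shut down web dynos
  "heroku ps:scale worker=0 -a #{APP}",           # 2. ...once jobs drain
  "heroku pg:backups:capture -a #{APP}",          # 3. full dump, stored on S3
  %(curl -o latest.dump "$(heroku pg:backups:url -a #{APP})"),              # 4. fetch the dump
  "pg_restore --no-owner --no-acl -h #{RDS} -U app -d app_production latest.dump", # 5. load into RDS
  "heroku config:set DATABASE_URL=postgres://app:secret@#{RDS}:5432/app_production -a #{APP}", # 6. switch DB host
  "heroku ps:scale web=5 worker=2 -a #{APP}",     # 7. run the app again
]

steps.each { |cmd| puts cmd }
```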
Load tests proved that the app on RDS PostgreSQL stays fast even with 5,000 database connections 💪 We didn't even need Provisioned IOPS (SSD) storage.
Our PostgreSQL migration price and options comparison:
|                 | Heroku Postgres Premium 5 | AWS RDS for PostgreSQL |
|-----------------|---------------------------|------------------------|
| RAM             | 61 GB                     | 64 GB (16 vCPUs)       |
| Storage         | 1 TB (plan)               | 1 TB (custom)          |
| Max connections | 500*                      | custom (6–8388607)     |
| Price / month   | $2,500                    | $1,157                 |
* max_connections minus superuser_reserved_connections, so the real value will be smaller by ~15. In our case the real limit was 485.
Bottom line: we not only removed the most significant bottleneck, but also saved $16,116 annually :)
Bonus – how we dealt with the infamous Heroku 30-second limit
How do you deal with Heroku's 30-second limit on a single web request when your users upload files?
A quick fix is to implement direct S3 uploads for your web and mobile apps, so uploads don't rely on your web dynos 😎.
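In a Rails app you would normally reach for Active Storage direct uploads or the aws-sdk-s3 gem's `Aws::S3::Presigner` rather than roll your own signing. Still, as a dependency-free sketch of what those tools produce, here is a SigV4 presigned PUT URL built with the standard library alone; the bucket, region, and credentials below are dummies.

```ruby
require "digest"
require "openssl"
require "cgi"

# Sketch of a SigV4 presigned PUT URL. In production, prefer the aws-sdk-s3
# gem (Aws::S3::Presigner) -- bucket, region and keys here are dummies.
def presigned_put_url(bucket:, key:, akid:, secret:, region:, expires: 900, now: Time.now.utc)
  host     = "#{bucket}.s3.#{region}.amazonaws.com"
  amz_date = now.strftime("%Y%m%dT%H%M%SZ")
  scope    = "#{now.strftime('%Y%m%d')}/#{region}/s3/aws4_request"

  # Query string: keys already in sorted order, values URI-encoded.
  query = {
    "X-Amz-Algorithm"     => "AWS4-HMAC-SHA256",
    "X-Amz-Credential"    => "#{akid}/#{scope}",
    "X-Amz-Date"          => amz_date,
    "X-Amz-Expires"       => expires.to_s,
    "X-Amz-SignedHeaders" => "host",
  }.map { |k, v| "#{k}=#{CGI.escape(v)}" }.join("&")

  canonical = ["PUT", "/#{key}", query, "host:#{host}", "", "host", "UNSIGNED-PAYLOAD"].join("\n")
  to_sign   = ["AWS4-HMAC-SHA256", amz_date, scope, Digest::SHA256.hexdigest(canonical)].join("\n")

  # Derive the signing key: HMAC chain over date, region, service.
  k = "AWS4#{secret}"
  [now.strftime("%Y%m%d"), region, "s3", "aws4_request"].each do |part|
    k = OpenSSL::HMAC.digest("SHA256", k, part)
  end

  "https://#{host}/#{key}?#{query}&X-Amz-Signature=#{OpenSSL::HMAC.hexdigest('SHA256', k, to_sign)}"
end

url = presigned_put_url(bucket: "demo-bucket", key: "uploads/video.mp4",
                        akid: "AKIAEXAMPLE", secret: "example-secret",
                        region: "us-east-1")
```

The dyno's only job is to sign the URL in a few milliseconds; the browser or mobile client then PUTs the file straight to S3, so the 30-second request limit never comes into play.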