I have a large dataset of all the addresses in my country (3.8 GB). I am building an API that queries the database for one specific address and responds with basic JSON data (~300 bytes). The API runs in Python on Azure Functions.
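Simplified, the lookup function looks something like the sketch below (pyodbc and the table/column names here are just for illustration, the real code differs in the details):

```python
import json
import os

import azure.functions as func
import pyodbc


def main(req: func.HttpRequest) -> func.HttpResponse:
    address_id = req.params.get("id")
    if not address_id:
        return func.HttpResponse("missing id", status_code=400)

    # A new connection is opened on every invocation in this sketch.
    conn = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT street, number, city, postcode FROM addresses WHERE id = ?",
            address_id,
        )
        row = cursor.fetchone()
    finally:
        conn.close()

    if row is None:
        return func.HttpResponse(status_code=404)

    body = json.dumps(
        {"street": row.street, "number": row.number,
         "city": row.city, "postcode": row.postcode}
    )
    return func.HttpResponse(body, mimetype="application/json")
```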
So far everything works great. When I do a single request I get a response time of around 100-150 ms. Great! But...
If I load test the API with, say, 200 requests in one minute, the average response time climbs to around 4-6 seconds.
This is what I have tried so far:
- Connect the API to a SQL database
- Connect the API to a Cosmos DB database
- Smaller tables (fewer columns)
Is there some limit on the number of connections per database? The SQL database and Cosmos DB themselves don't seem to be the issue (CPU and memory percentages look fine).
I also created a simple '/status' endpoint on the API without any DB connection, and it easily handles 200 requests in one minute.
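That endpoint does nothing but return a static response, roughly:

```python
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # No database access at all; just confirms the function app responds.
    return func.HttpResponse('{"status": "ok"}', mimetype="application/json")
```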
Hopefully someone can push me in the right direction.