Best way to increase performance with MongoDB for large number of records? #1240
We have the UI (same as your demo UI) and API deployed on our own Azure instance, and we use MongoDB caching backed by an Azure-hosted MongoDB database. The issue is that when the UI makes the carbon emission API calls, they take ages to resolve (around 30 seconds) for just ONE day of data. This seems to be because the code is doing this MongoDB call:
In our MongoDB instance we have 350,000 records stored in the cache collection, and that is just for the past 3 months. So it seems that this is why it's extremely slow: it has to loop over 350,000 records. How can we make this faster? The only way I can think of is to stand up our own database and API endpoints specifically for our UI, and pre-store the Azure data in the database by date for each subscription beforehand. Any other ideas on this?
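The "pre-store by date" idea can be sketched in plain TypeScript. This is a simplified stand-in for the real cache schema (the `EstimateRecord` shape and field names are assumptions, not the project's actual data model): instead of scanning all 350,000 records on every request, bucket them by day once, so a single day's data is an O(1) lookup.

```typescript
// Sketch: bucket cached records by day so per-day reads don't scan
// the whole collection. Record shape is a hypothetical simplification.
interface EstimateRecord {
  timestamp: string // ISO timestamp, e.g. "2022-03-01T00:00:00Z"
  kilowattHours: number
}

function groupByDay(
  records: EstimateRecord[],
): Map<string, EstimateRecord[]> {
  const byDay = new Map<string, EstimateRecord[]>()
  for (const record of records) {
    const day = record.timestamp.slice(0, 10) // normalize to YYYY-MM-DD
    const bucket = byDay.get(day)
    if (bucket) bucket.push(record)
    else byDay.set(day, [record])
  }
  return byDay
}

// One linear pass at seed time; afterwards a UI request for a single
// day reads only that day's bucket instead of looping over everything.
const cache = groupByDay([
  { timestamp: '2022-03-01T00:00:00Z', kilowattHours: 1.2 },
  { timestamp: '2022-03-01T12:00:00Z', kilowattHours: 0.8 },
  { timestamp: '2022-03-02T00:00:00Z', kilowattHours: 2.5 },
])
```

The same effect inside MongoDB itself would come from an index on the timestamp field (so date-range queries use an index scan rather than a collection scan), but the collection and field names in your deployment would need to be checked against the actual cache schema first.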
Replies: 1 comment
Hello @MartinDawson, thanks for raising this!
From the request, it looks like you are fetching about a month of data at once. I am curious how you are able to tell that it is around 30 seconds per day.
Also, you may want to check out the pagination option in our configurations glossary. Decreasing the page size may help with performance.
Another suggestion is to use the `seed-cache-file` script. This lets you fetch the data as a background job, so by the time the UI loads, the data is already cached. You may also want to use this just to cache historical data, and implement a daily request system so that you never need to fetch more than a day's worth of data at once.
Hope this helps!
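The daily-request idea above could look something like this minimal sketch: split a large date range into one-day windows so no single upstream request ever covers more than a day (the function name and window shape here are hypothetical, not part of the project's API).

```typescript
// Sketch: split a [start, end) range into one-day windows, in UTC,
// so each window can be fetched (or seeded into the cache) separately.
function splitIntoDays(
  start: Date,
  end: Date,
): Array<{ from: Date; to: Date }> {
  const days: Array<{ from: Date; to: Date }> = []
  let cursor = new Date(start)
  while (cursor < end) {
    const next = new Date(cursor)
    next.setUTCDate(next.getUTCDate() + 1)
    // Clamp the final window so it never extends past the range end.
    days.push({ from: new Date(cursor), to: next < end ? next : new Date(end) })
    cursor = next
  }
  return days
}

const windows = splitIntoDays(
  new Date('2022-03-01T00:00:00Z'),
  new Date('2022-03-04T00:00:00Z'),
)
// windows covers Mar 1, Mar 2, and Mar 3 as three separate one-day requests
```

A background job could then walk these windows one at a time, so only the current day ever needs to be fetched on demand.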