I’m facing the error “Out of memory: job could not be finished because too much memory was used.” while downloading an Excel file that retrieves some general information from the entities. I’m currently on the dev version of my app, with about 30 entities in the workspace. For the prod version, all the workspaces contain fewer entities and the download works fine. Do you have any idea what causes the memory issue here? What is the memory limit that raises this error?
Thanks.
A memory issue can have a number of causes. Since the Excel file is built from entity information, I would assume that more entities => more data => more memory usage. A good first step is to check whether the code wastes memory unnecessarily, which you can do by profiling memory, as described in this post:
Every app has a set of computational resources, which include disk space and memory (RAM). These are generally set to the default values; they can be adjusted by sending us a request.
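As a quick way to get a feel for where memory goes, you can use Python’s built-in `tracemalloc` module before reaching for a full profiler. This is a minimal sketch; the entity-gathering step is a hypothetical stand-in for whatever your app actually does:

```python
import tracemalloc

tracemalloc.start()

# Hypothetical stand-in for the step that gathers data from your entities.
# In the real app this would be the code that collects entity information
# for the Excel file.
data = [{"name": f"entity-{i}", "values": list(range(1000))} for i in range(100)]

# Show the five source lines that allocated the most memory so far.
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:5]:
    print(stat)

# Current and peak traced memory, in bytes.
current, peak = tracemalloc.get_traced_memory()
print(f"current={current / 1e6:.1f} MB, peak={peak / 1e6:.1f} MB")
```

The `peak` figure is the number to compare against the app’s memory limit; if it is already close to the limit for a small dataset, the code is likely holding on to more than it needs.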
I deleted the majority of the entities and kept only a few, but I still got the same “Out of memory” error while trying to download an Excel file that retrieves some general information from the entities. I published the code to 3 different apps: prod, staging, and dev. For prod, the download still works, even though the number of entities in each project (workspace) is quite large. For both staging and dev, the download fails. I noticed that staging and dev have many more published versions than prod. Does the number of publication versions matter for the memory allowance? If so, how do I delete the previous versions? In any case, how can I request a memory increase?
The number of published versions doesn’t affect memory availability - each app runtime has its own isolated memory allocation (512 MB by default), regardless of publication history.
Since prod works but dev/staging don’t with the same code, this suggests the difference lies in the data or entity structure between environments, not the number of entities alone. A few entities with complex/large data can consume more memory than many simple ones.
To troubleshoot:
Use the memory profiler as mentioned earlier to identify which part of your Excel generation code consumes the most memory
Check if dev/staging entities contain larger files, more complex data structures, or different parametrization than prod
Look for potential memory leaks in your code - are you loading all entity data into memory at once? Consider processing entities in batches instead.
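On the last point, the batching idea can be sketched with plain Python generators, so that only one batch of entity rows is in memory at a time instead of the full set. The `fetch_entity_rows` function below is a hypothetical stand-in for your entity-data calls:

```python
from itertools import islice


def fetch_entity_rows(entity_ids):
    """Hypothetical stand-in: lazily yield one row of Excel data per entity,
    instead of materialising every entity's data in a list first."""
    for eid in entity_ids:
        # In the real app this would fetch the entity's data from the platform.
        yield [eid, f"entity-{eid}", eid * 10]


def batched(iterable, size):
    """Yield lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch


rows_written = 0
for batch in batched(fetch_entity_rows(range(250)), size=50):
    # Append each batch of rows to the worksheet here, then drop the batch
    # so it can be garbage-collected before the next one is fetched.
    rows_written += len(batch)

print(rows_written)  # 250
```

If you are building the file with openpyxl, its write-only workbook mode (`Workbook(write_only=True)`) pairs well with this pattern, since rows are streamed to disk as they are appended rather than kept in memory.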
Requesting an increase can be done by sending me or @mslootweg a message, and simply asking for the memory on your specific environment and application to be bumped to a new limit. What that new limit should be is something you will have to determine, which is why I’m advising to get a sense for the needed memory first (by profiling).
To help narrow this down: what data are you retrieving from the entities, and how are you building the Excel file?