Memory error

I have developed a mobile application called Viktor, which has a feature that displays hexagons in a specified area with a radius of 25 km. Each area contains a certain number of hexagons, but for some areas the number of hexagons is so high that it causes a memory overflow error after releasing the app. I would like to note that this error only occurs after releasing the app, presumably because memory is limited on the VIKTOR server. I have been looking for a solution to this problem but have not found one yet; I am thinking of a solution in one of the following directions.

  1. Put the code that causes the error in a try/except statement and catch the memory overflow error in the except branch. There I would clear the saved hexagons, reduce the search area and try again (a minimal sketch of this idea is shown below the list). However, I am not sure this will work: since a memory overflow has already occurred, the program might not be able to reach the except branch at all.
  2. Keep track of the memory used and, when it gets dangerously high (98% or so), use the same structure to reduce the search area and start again. I am not sure whether it is even possible to keep track of how much memory is being used, though.
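
A minimal sketch of the first idea (the function names here are hypothetical placeholders for the actual download and processing code):

def find_hexagons_with_fallback(center, radius_km=25.0, min_radius_km=5.0):
    # Try progressively smaller search areas whenever a MemoryError occurs.
    while radius_km >= min_radius_km:
        try:
            # find_hexagons is a placeholder for the real download/processing code
            return find_hexagons(center, radius_km)
        except MemoryError:
            # the partially built result is released when the exception unwinds;
            # retry with half the radius
            radius_km /= 2
    raise RuntimeError("Area too large to process within the available memory")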

Could anyone help me with this problem?

Hi Pim,

Welcome back to the community, and thanks for posting.
What you are trying to achieve is not completely clear, so if you could elaborate on your use case or post some snippets, that would help us help you better.

Keeping track of memory on the fly is hard. Python has its own built-in memory management ("garbage collection"): it keeps track of whether there is still an active reference to an object and releases the memory once there is not. Is it absolutely necessary to keep track of all the hexagons? You could also release memory explicitly by removing references with the del statement.
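
As a small illustration (the variable and function names are hypothetical), dropping the reference as soon as you no longer need the full list lets the memory be reclaimed:

import gc

hexagons = build_hexagons(area)   # hypothetical: potentially very large list
result = summarize(hexagons)      # keep only what you actually need afterwards

del hexagons   # remove the last reference so the objects can be freed
gc.collect()   # optional: also collect objects that are part of reference cycles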

What format are the hexagons in? Maybe you could reduce their memory footprint?

As a last resort we can increase the memory of the app, but in that case we need a good indication of the maximum memory usage that is to be expected.

We are downloading polygons from a WFS service of Aerius. Depending on the area the user draws, this can result in a very high number of polygons. Next, we have to process them and build a spatial index with Rtree so we can quickly search the area spatially.

I suppose this also comes at a cost? Also, I don’t know how we can figure out the memory usage.
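
For context, the processing step looks roughly like the following (simplified, with hypothetical variable names; the WFS download itself is left out):

from rtree import index
from shapely.geometry import shape

# features: GeoJSON features returned by the Aerius WFS request (assumed)
polygons = [shape(f["geometry"]) for f in features]

# build a spatial index on the bounding boxes of the polygons
idx = index.Index()
for i, poly in enumerate(polygons):
    idx.insert(i, poly.bounds)   # bounds = (minx, miny, maxx, maxy)

# later: quickly find candidate polygons that may intersect the search area
candidates = [polygons[i] for i in idx.intersection(search_area.bounds)]

Both the polygon list and the Rtree index are kept in memory at the same time, which is where the footprint adds up.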

You can profile the memory in your development environment using the memory_profiler package.

Install the package in your venv using:
pip install memory_profiler

Start VIKTOR with the profiler attached:
mprof run viktor-cli.exe start

Use the app as normal in the browser. Afterwards, you can check the memory usage with:
mprof plot

If the plot gives an error, try to install PyQt5 in your venv.

pip install PyQt5


@mweehuizen Is it possible to use this profiler if you’re using Docker instead of a venv?

Hi @Floris,

I think this would not work in Docker, because mprof run viktor-cli.exe start would only profile the process that starts the Docker container. The Python script that actually runs the app runs inside the container, so you would need to run mprof from inside the container.

In the past we have done something different for the Docker case.

When you run docker stats in a second WSL terminal, it shows the real-time memory usage of the container.

My colleague @regbers made the following WSL (Linux) command to log the docker stats output into a .txt file.

now=$(date +'%d_%b_%Y_%H_%M_%S'); while true; do printf "\n$(date +'%d_%b_%Y_%H_%M_%S'):\n" | tee --append stats_$now.txt; docker stats --no-stream | tee --append stats_$now.txt; sleep 1; done

If you run this inside a separate WSL terminal, it will log the memory usage of the app.

Hope this helps.