optimize memory usage when creating log.html? #4739


Closed
oboehmer opened this issue Apr 17, 2023 · 6 comments


@oboehmer
Contributor

I noticed robot/rebot can use multiple gigabytes of memory when rendering log.html from a complex/long output.xml. We hit this in a container environment where we cap memory consumption to 4 GB, and some Robot suite executions were killed by the OS. The output.xml in question contained more than 1.8 million keyword executions!

I have no idea about the feasibility/effort of this ask, but it would be great if we could explore options to reduce the memory footprint of this task. We have already recommended keyword flattening to our user base.

It is not a high-priority item, but it would be good to review optimization opportunities when it comes to memory footprint.

@pekkaklarck
Member

First of all, have you tested using --splitlog? It ought to make loading the log file a lot faster, but opening individual tests and keywords should still work well.
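
For reference, a minimal sketch of the suggested invocation (the `--splitlog` option is real; the file names are illustrative):

```sh
# Generate a split log: log.html itself stays small and loads quickly,
# while test/keyword details are loaded lazily from separate parts.
rebot --splitlog --log log.html output.xml
```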

The log.html is highly optimized in terms of file size, but I'm certain there would be room for memory and performance optimization. The problem is that we are planning to rewrite the whole thing in the somewhat near future (#4304), and trying to enhance the current log file doesn't make much sense. If someone has knowledge about profiling this kind of JavaScript and HTML code, it would be great if they could take a quick look to see whether there are some easy wins. Anyone interested? If not, I believe it's best to close this issue and concentrate on the new tech instead.

@oboehmer
Contributor Author

Thanks, @pekkaklarck
Just to be sure: this issue is not about loading the log file (that is fast). It is robot/rebot creating it that takes that much memory.

@pekkaklarck
Member

I see. I thought "rendering" referred to viewing the log file in a browser.

Creating the log file has been profiled quite a bit and I'm afraid there are no easy wins. That said, would you be interested in profiling where the memory goes in your exact case? I've used https://pypi.org/project/filprofiler a few times and it has worked great.
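
A hedged sketch of how one might run rebot under Fil (assuming `pip install filprofiler`; check the Fil docs for the exact way arguments are passed through to the profiled module):

```sh
# Run rebot as a module under the Fil memory profiler. Fil reports
# peak memory usage and a flame graph of where it was allocated.
fil-profile run -m robot.rebot --log log.html output.xml
```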

@oboehmer oboehmer changed the title optimize memory usage when rendering log.html? optimize memory usage when creating log.html? Apr 21, 2023
@oboehmer
Contributor Author

Thanks, let me share the results unicast via Slack. `rebot --log ... output.xml` showed a peak memory usage of 3.8 GB.

@pekkaklarck
Member

Have you investigated this further, @oboehmer? Just the peak memory usage doesn't tell much; we need to know where the memory is spent. The Fil profiler I mentioned above could help with that.

Typically huge memory usage is related to having a lot of keywords, loop iterations, or other such constructs in output.xml. Handling them can be optimized in Robot, but typically a better solution is having fewer such constructs. This can mean moving logic from resource files to libraries, making sure looping constructs (e.g. WHILE) don't run unnecessary iterations (or moving them to libraries as well), and so on. On the Robot data level, something that can work really well is "flattening" using either `--flattenkeywords` or the newer `robot:flatten` tag. The latter is more powerful because it flattens content already during execution.
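
To make that concrete, here is a hedged sketch of both approaches; the option and the tag exist, but the keyword names and the `@{ROWS}` variable are illustrative. Flattening after the fact, when generating the log from an existing output.xml:

```sh
# Replace the internal structure of matching keywords (and of FOR
# loops) with a flat list of their log messages.
rebot --flattenkeywords "name:Process *" --flattenkeywords for \
      --log log.html output.xml
```

And the tag-based variant, which drops the details already during execution so they never reach output.xml in the first place:

```robotframework
*** Keywords ***
Process All Rows
    [Documentation]    With robot:flatten, the FOR iterations below are
    ...    discarded during execution; only their log messages are kept.
    [Tags]    robot:flatten
    FOR    ${row}    IN    @{ROWS}
        Process Row    ${row}
    END
```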

@pekkaklarck
Member

We have a newer issue about profiling memory usage (#5371). I'll close this as its duplicate.
