New version takes more cpu #125

Open · alexkno79 opened this issue Sep 6, 2024 · 10 comments · May be fixed by #128
Comments

@alexkno79

I am running ism7mqtt on a Raspberry Pi in a Docker environment.
Previously I used v0.0.16, which used around 0.3% of my CPU when idle and around 2% when new messages are received and processed.

When using the master branch I notice remarkably higher CPU consumption.

Below is the same view with v0.0.16:
IMG_0004

Below you can see some CPU stats using the master branch (same parameter.json and same overall config; all environment variables are the same, just the image changed):
IMG_0003

The overall usage difference is also remarkable. Here is a graph showing the docker stats for the container: until 10:00 with the master branch, and the much more relaxed system after 10:00 running v0.0.16.

IMG_0006

So some of the changes in the latest build seem to be very resource-consuming for some reason.
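
For reference, a minimal sketch of how such per-container CPU figures could be sampled over time with docker stats; the container name `ism7mqtt` and the 5-minute interval are placeholders, not taken from the setup above:

```sh
#!/bin/sh
# Append a timestamped CPU-usage sample for the container every 5 minutes,
# so the two image versions can be compared over a longer period.
while true; do
  ts=$(date -Iseconds)
  cpu=$(docker stats --no-stream --format '{{.CPUPerc}}' ism7mqtt)
  echo "$ts,$cpu" >> ism7mqtt-cpu.csv
  sleep 300
done
```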

@tirregbo

tirregbo commented Sep 7, 2024

On Debian Bullseye in a Proxmox CT there are no differences in CPU usage after switching from v0.0.16 to the latest release at around 9pm yesterday.

Screenshot 2024-09-07 100108

@alexkno79
Author

alexkno79 commented Sep 7, 2024

Yes, maybe it depends on the host system. I am running on an RPi 4 (also Bullseye) and here I clearly see a long-term effect of nearly doubled CPU usage (while RAM usage is lower).

Here is the full day yesterday and today, with a clear drop in CPU usage as of 10am yesterday when I activated the v0.0.16 image.
Screenshot_20240907-102717

@zivillian
Owner

The reason may be the switch from OpenSSL to Bouncy Castle. Before I spend time trying to optimize the performance, I'd like to ask: is this an actual issue for you, or just an observation?

@alexkno79
Author

Thanks for your involvement!
In fact it is an issue. The RPi is a low-resource device, and naturally ism7mqtt is running 24/7.

I noticed remarkably higher CPU temperatures and the device being much warmer than usual at all times since using the master branch, plus some 'on the edge' slowdowns when short-term programs with high resource demands are running. This made me look for the cause, which disappeared when going back to v0.0.16.

So yes, for me it is an issue, which I currently work around by using the old version. But of course I'd like to be able to use future versions as well, so I'd be happy to see this sorted out.

@zivillian
Owner

Can you try to generate two traces with dotnet-trace on your hardware? One for v0.0.16 and another for v0.0.17.

This won't work from outside the Docker image, and the ism7mqtt image will not allow you to spawn a shell, so you either need to temporarily run ism7mqtt without Docker or create your own image.
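
A minimal sketch of how the two traces could be collected when running ism7mqtt directly on the host (assuming the .NET SDK is installed there; the duration and output file names are placeholders):

```sh
# Install the dotnet-trace global tool once.
dotnet tool install --global dotnet-trace

# Start the version under test, then find its process id.
dotnet-trace ps

# Collect a trace for a couple of minutes; repeat for v0.0.16 and v0.0.17.
dotnet-trace collect --process-id <PID> \
  --duration 00:00:02:00 \
  --output ism7mqtt-v0.0.17.nettrace
```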

@alexkno79
Author

alexkno79 commented Sep 10, 2024

OK, I ran both locally then (I didn't manage to build an image with dotnet-trace successfully).

Both were running for only a minute or two. If you need longer runs, let me know.

@zivillian zivillian linked a pull request Sep 10, 2024 that will close this issue
@zivillian
Owner

zivillian commented Sep 10, 2024

@alexkno79 Thanks for the traces!

Unfortunately I need to inform you that these traces contain the credentials for your ISM and your MQTT server. I've already removed the link from your comment and opened a request with GitHub support to delete the files, but the link was also sent via mail and the files still haven't been deleted. So I would urge you to change the password. Sorry that I wasn't aware of this - please do not upload any traces to GitHub in the future.

As for the actual problem: I can see that a significant amount of time is being spent in the new TLS layer, and it looks like we should be able to fix it.
I've already raised an issue at bc-csharp and also created a workaround that should fix this specific issue. It would be great if you could test the binaries from #128 (they can be found here).

@alexkno79
Author

Thanks a lot!
I tried, but CPU usage is still nearly twice as high as on v0.0.16.

I measure in 5-minute intervals, and the new binaries take around 2.2% of my CPU while v0.0.16 consumes ca. 1.3%.

@zivillian
Owner

I've pushed another change which reverts the SSL connection to OpenSSL - can you try the new binaries?

@alexkno79
Author

Thanks again!
Unfortunately even the OpenSSL version still has a very similar CPU usage profile to the previous one.

Still ca. 2% CPU usage compared to approx. 1.2% on v0.0.16.
It seems the cause is somewhere else.

Below are the process figures, where you can see the switch back to v0.0.16 at around 21:37, which immediately drops the values to half.

Screenshot_20240911-221544
