
dunglas (Owner) commented Oct 6, 2025

Fixes a bug introduced in #1090: the subscriber list cache must have a size limit to prevent a memory leak.

May fix #1024.

cc @naxo8628, this may dramatically reduce memory usage.
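
For context: Mercure is written in Go, and a size-bounded cache of this kind is commonly implemented with LRU eviction. The sketch below is purely illustrative, it is not the PR's actual implementation, and every identifier in it is invented:

```go
// Illustrative only: a minimal size-bounded LRU cache.
package cache

import "container/list"

// lruCache keeps at most maxSize entries; inserting beyond the
// limit evicts the least recently used entry, which is what
// prevents the unbounded growth ("leak") described above.
type lruCache struct {
	maxSize int
	order   *list.List               // front = most recently used
	items   map[string]*list.Element // key -> element in `order`
}

type entry struct {
	key   string
	value []string // e.g. a cached subscriber list
}

func newLRUCache(maxSize int) *lruCache {
	return &lruCache{
		maxSize: maxSize,
		order:   list.New(),
		items:   make(map[string]*list.Element),
	}
}

// Get returns the cached value and marks it as recently used.
func (c *lruCache) Get(key string) ([]string, bool) {
	el, ok := c.items[key]
	if !ok {
		return nil, false
	}
	c.order.MoveToFront(el)
	return el.Value.(*entry).value, true
}

// Set inserts or refreshes a value, evicting the oldest entry
// once the size limit is exceeded.
func (c *lruCache) Set(key string, value []string) {
	if el, ok := c.items[key]; ok {
		c.order.MoveToFront(el)
		el.Value.(*entry).value = value
		return
	}
	c.items[key] = c.order.PushFront(&entry{key: key, value: value})
	if c.order.Len() > c.maxSize {
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.items, oldest.Value.(*entry).key)
	}
}
```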

naxo8628 commented Oct 7, 2025

@dunglas deployed & monitoring

naxo8628 commented Oct 7, 2025

The first 6 hours look similar to the last deploy (PR #1092):

[Screenshot: 2025-10-07 17:30:09]

dunglas (Owner, Author) commented Oct 7, 2025

Could you provide some profiles, and maybe try reducing the cache size using the new config parameter?


naxo8628 commented Oct 8, 2025

@dunglas FYI: after 12 h of running, it fails with "Container mercure failed liveness probe, will be restarted".

settings:

```
topic_selector_cache 2000 128
subscriber_list_cache_size 10000
```

[Screenshot: 2025-10-08 11:37:09]

Redeployed now with lower cache settings:

```
topic_selector_cache 1000 128
subscriber_list_cache_size 1000
```
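
For anyone reproducing this setup: these directives would presumably go in the hub's Caddyfile, inside the `mercure` block. The snippet below is a sketch assuming that placement, not a verified configuration; only the two cache directives are taken from the comment above, the rest of the block is assumed:

```
# Hypothetical placement; host name and block layout are assumed.
example.com {
    mercure {
        topic_selector_cache 1000 128
        subscriber_list_cache_size 1000
    }
}
```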

naxo8628 commented

Last 24 h:

config:

```
topic_selector_cache 1000 128
subscriber_list_cache_size 1000
```

[Screenshot: 2025-10-10 09:47:01]

dunglas (Owner, Author) commented Oct 10, 2025

@naxo8628 thanks. If I'm interpreting the graph correctly, the problem looks fixed by this PR, right?

naxo8628 commented

@dunglas yes. Memory and CPU usage are low and stable now! 😄

Last 48 h:
[Screenshot: 2025-10-11 19:11:34]

dunglas merged commit 8d7c550 into main on Oct 13, 2025
44 checks passed
dunglas deleted the fix/cache-size branch on Oct 13, 2025 at 12:42


Merging this pull request may close the issue "High memory usage with the latest version".