Memory leak in parser? #4675
Unanswered
ArcherSmythe asked this question in Q&A
I am loading a JSON file that is 150 MB long. I'm using a small test program to learn how to load the JSON, but there might be a memory leak?

Before reading the file, the application uses 6 MB of RAM, displayed at the first system call.

- If I pass `y` as the argument to the program, it loads the entire file into one string and passes it to the JSON parser. If I print the memory usage after that, I see the program uses 1020 MB of RAM.
- If I pass `n` as the argument, the program again loads the entire file into one string and passes it to the JSON parser. However, the resulting vector of structs returned by the parser doesn't get stored in `example`, so it just gets deleted. When I do this, the program uses 980 MB of RAM.
- Finally, if I don't pass any arguments, the loop in the `else` block reads one element (line) at a time. There are 300 elements (lines) in the file. When I print the RAM usage after that loop, I see the program uses 85 MB.

So, if I'm interpreting this correctly, the parser seems to retain something (up to 1000 MB of "something") after the call to `json::parse` returns. If I instead pass significantly shorter strings, the state retained by the parser is an order of magnitude smaller. Is there some way to tell `json` to clear any state after parsing?

Thanks,
Archer

Replies: 1 comment 1 reply

-

I don't believe there is a memory leak: we execute the whole test suite with Valgrind and ASAN, see https://json.nlohmann.me/community/quality_assurance/#dynamic-analysis.