Many of us have been in “that” situation: the app is slowing to a crawl, and memory consumption is spiking through the roof. Is it a memory leak? Have the web gremlins finally broken through the firewall and found our system? A lot of time and energy gets wasted trying to find out.
At Stackify, every millisecond spent processing a transaction counts, and the amount of memory each transaction takes is a large part of that equation. We use JSON for marshalling data across boundaries and for storage. JSON serialization and deserialization can quickly contribute to poor performance and memory pressure if not handled correctly in high-volume environments.
Here is an example:
In the code snippet below, we attempt to deserialize JSON contained in a string—fairly common practice in most applications.
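The original snippet is not reproduced here, so what follows is a minimal sketch of the pattern being described, using Json.NET (Newtonsoft.Json) and a hypothetical `Transaction` type:

```csharp
using Newtonsoft.Json;

public class Transaction
{
    public string Id { get; set; }
    public decimal Amount { get; set; }
}

public static class NaiveJson
{
    // Deserializes from a string: the entire JSON payload must already
    // be sitting in memory as one contiguous string before parsing begins.
    public static Transaction FromString(string json)
    {
        return JsonConvert.DeserializeObject<Transaction>(json);
    }
}
```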
What happens if we pass thousands of requests through this method, and some of those JSON strings are serialized large objects? If the JSON is greater than 85 KB, .NET will not allocate it on the usual generational heaps for garbage collection. Instead, the memory will be pushed to the Large Object Heap (LOH). As a result, the garbage collector spends more time attempting to clean up out-of-scope objects, which causes blocking and ultimately slows down your application.
There is an easy solution: don't allocate the entire JSON payload into memory as one string. Stream it instead.
Only a small buffer is allocated into memory and used to read chunks of the JSON as the serializer builds the target object. This technique avoids LOH allocations and drastically reduces GC time, resulting in improved application performance and scalability. See the difference below:
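Again, the original snippet is not shown here; a minimal sketch of the streaming approach with Json.NET, reusing the hypothetical `Transaction` type from above, might look like this:

```csharp
using System.IO;
using Newtonsoft.Json;

public static class StreamingJson
{
    // Deserializes directly from a stream. JsonTextReader pulls the JSON
    // through StreamReader's small internal buffer token by token, so no
    // single large string or byte array is ever materialized.
    public static Transaction FromStream(Stream jsonStream)
    {
        using (var streamReader = new StreamReader(jsonStream))
        using (var jsonReader = new JsonTextReader(streamReader))
        {
            var serializer = new JsonSerializer();
            return serializer.Deserialize<Transaction>(jsonReader);
        }
    }
}
```

In practice you would feed this method a `FileStream` or a network response stream directly, so the payload never needs to exist in memory all at once.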
It is important to avoid using MemoryStreams here, since a MemoryStream is simply a stream wrapper around a byte array. Byte arrays are allocated as a contiguous block of memory, so a large enough one can still end up on the LOH.
Principal Engineer / Stackify