Memory pool

Memory pools, also called fixed-size block allocation, use pools of preallocated memory to provide dynamic memory allocation comparable to malloc or C++'s operator new. Because those general-purpose allocators suffer from fragmentation caused by variable block sizes, they are often avoided in real-time systems, where allocation times must stay predictable. A more efficient solution is to preallocate a number of memory blocks of the same size, called a memory pool. The application can allocate, access, and free blocks, represented by handles, at run time.

Many real-time operating systems use memory pools, such as the Transaction Processing Facility.

Some systems, like the web server Nginx, use the term memory pool to refer to a group of variable-size allocations which can be later deallocated all at once. This is also known as a region; see region-based memory management.
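As a hedged illustration of this region style (this is not Nginx's actual API; the names region_create, region_alloc, and region_destroy and the fixed alignment are assumptions of this sketch), a group of variable-size allocations can be carved out of one buffer and then released together in a single call:

#include <stdlib.h>
#include <stddef.h>

#define REGION_ALIGN 16  /* conservative alignment for returned pointers (assumption) */

/* A minimal region (arena): variable-size allocations, released all at once. */
typedef struct {
    unsigned char *base;  /* backing buffer obtained from malloc */
    size_t size;          /* total capacity in bytes             */
    size_t used;          /* bytes handed out so far             */
} region_t;

region_t *region_create(size_t size)
{
    region_t *r = malloc(sizeof *r);
    if (r == NULL)
        return NULL;
    r->base = malloc(size);
    if (r->base == NULL) {
        free(r);
        return NULL;
    }
    r->size = size;
    r->used = 0;
    return r;
}

/* Bump-pointer allocation; returns NULL when the region is exhausted. */
void *region_alloc(region_t *r, size_t n)
{
    size_t rounded = (n + REGION_ALIGN - 1) & ~(size_t)(REGION_ALIGN - 1);
    if (rounded < n || rounded > r->size - r->used)
        return NULL;
    void *p = r->base + r->used;
    r->used += rounded;
    return p;
}

/* Every allocation made from the region is freed by this single call. */
void region_destroy(region_t *r)
{
    free(r->base);
    free(r);
}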

Simple memory pool implementation

A simple memory pool module can allocate, for example, three pools at compile time with block sizes optimized for the application deploying the module. The application can allocate, access, and free memory through an interface such as the following:
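A minimal sketch of such an interface in C is given below. Only a single pool is shown for brevity, and the function names (pool_init, pool_alloc, pool_access, pool_free), block size, and block count are illustrative assumptions rather than a standard API; blocks are handed out as integer handles, as described above.

#include <stddef.h>

#define POOL_BLOCK_SIZE 64   /* block size tuned for the application (assumption) */
#define POOL_NUM_BLOCKS 32   /* number of blocks preallocated at compile time     */

typedef int pool_handle_t;   /* blocks are referred to by handles, not raw pointers */

static unsigned char pool_storage[POOL_NUM_BLOCKS][POOL_BLOCK_SIZE];
static pool_handle_t pool_free_list[POOL_NUM_BLOCKS];
static int pool_free_top;

/* Build the free list of block indices; called once at start-up. */
void pool_init(void)
{
    for (int i = 0; i < POOL_NUM_BLOCKS; ++i)
        pool_free_list[i] = i;
    pool_free_top = POOL_NUM_BLOCKS;
}

/* Take a block from the pool in constant time; returns -1 when exhausted. */
pool_handle_t pool_alloc(void)
{
    return (pool_free_top > 0) ? pool_free_list[--pool_free_top] : -1;
}

/* Translate a handle into a pointer to the block's data. */
void *pool_access(pool_handle_t h)
{
    return (h >= 0 && h < POOL_NUM_BLOCKS) ? pool_storage[h] : NULL;
}

/* Return a block to the pool in constant time. */
void pool_free(pool_handle_t h)
{
    if (h >= 0 && h < POOL_NUM_BLOCKS && pool_free_top < POOL_NUM_BLOCKS)
        pool_free_list[pool_free_top++] = h;
}

In use, pool_init is called once at start-up; afterwards every pool_alloc, pool_access, and pool_free operation runs in constant time with no fragmentation, which is the property a real-time system needs from its allocator.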

Memory pool vs malloc

Benefits

Drawbacks

See also

External links
