I suspect this would cause the same performance issues as a long-running application that constantly malloc'd and free'd memory? Many application runtimes allocate memory but never return it to the OS - they just reuse it internally - for exactly this reason. For example, in a Ruby application you'll see memory usage climb after boot and eventually level off once the process has all it will need for its lifetime, but it never goes down.
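
To make that concrete, here's a rough sketch of the pattern (illustrative C, not any particular runtime's actual allocator; the names and block size are made up): "freed" blocks go onto an internal free list and get handed back out on the next allocation, so the process only calls malloc when the pool is empty and never calls free at all.

    #include <stdio.h>
    #include <stdlib.h>

    #define BLOCK_SIZE 64  /* illustrative fixed block size */

    /* Freed blocks are threaded onto this list instead of
     * being returned to the OS. */
    static struct node { struct node *next; } *free_list = NULL;

    static void *pool_alloc(void) {
        if (free_list) {              /* reuse a previously freed block */
            void *p = free_list;
            free_list = free_list->next;
            return p;
        }
        return malloc(BLOCK_SIZE);    /* grow only when the pool is empty */
    }

    static void pool_free(void *p) {
        struct node *n = p;           /* recycle: push onto the free list */
        n->next = free_list;
        free_list = n;
    }

    int main(void) {
        void *a = pool_alloc();
        pool_free(a);
        void *b = pool_alloc();       /* same block handed straight back out */
        printf("reused: %s\n", a == b ? "yes" : "no");
        return 0;
    }

From the OS's point of view the footprint of a process doing this never shrinks, no matter how much the application "frees" internally - which is the plateau you see in a Ruby process's memory graph.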