Bug in OpenSSL stream socket filter which causes memory leaks

There’s a bug in the OpenSSL stream socket filter which results in I/O buffers being leaked on both inbound and outbound data flows. This causes the memory used by the client or server to grow throughout the life of the process. The bug has been present since release 6.2, when the structure of the filtering code was redesigned.
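To illustrate the kind of mistake that produces this symptom, here’s a minimal sketch of a reference-counted buffer leaking on a filter’s data path. None of these names come from the framework itself; this is just the general pattern, assuming the buffers are reference counted and a filter failed to balance a reference it took.

```cpp
#include <atomic>

// Illustrative only: a buffer freed when its reference count hits zero.
struct Buffer
{
   std::atomic<int> refs{1};

   void AddRef() { ++refs; }

   bool Release() { return --refs == 0; }   // caller frees when this returns true
};

// Hypothetical filter step: takes a reference on the encrypted buffer
// while the decryption work is in flight.
void ProcessInboundData(Buffer &encrypted, Buffer &decrypted)
{
   encrypted.AddRef();      // held for the duration of the decrypt

   // ... decryption work would happen here, filling 'decrypted' ...

   // The bug pattern: if a code path skips this matching Release(),
   // the count never returns to zero and the buffer is never freed,
   // so memory grows with every I/O operation.
   encrypted.Release();
}
```

With the matching `Release()` in place the count returns to its starting value after each operation; without it, every byte that flows through the filter pins a buffer forever.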

Note that since the filtering code was all reworked at that time, I expect that the same bug is likely to be present in the SChannel, SSPI and compression filters.

I’ve fixed the problem in the OpenSSL code and will be running this through my test harnesses over the next few days. If you want the fix now then please get in touch; if you can wait then I expect 6.3.1 will be released next week.

This shows that my pre-release testing could still do with some work. Automated unit tests and server stress tests can show that the code works, but they don’t, in themselves, necessarily catch everything, and of course they’re only as good as the test coverage in any case. I discovered this issue whilst building some new performance tests which use perfmon to monitor the process under test and produce reports at the end. I guess I now have to extend the scope of this project so that I can check the counters I’m monitoring against expected max and min values - so I can see if the memory is ballooning during my automated tests…
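The bounds check described above could be sketched as follows. This is just one way it might look, assuming the perfmon counter samples (say, Private Bytes) have already been collected into a series; the names are mine, not the framework’s.

```cpp
#include <vector>

// Hypothetical expected range for a monitored perfmon counter.
struct CounterBounds
{
   double expectedMin;
   double expectedMax;
};

// Returns true if every sample stayed within the expected range;
// a run where memory balloons will push samples past expectedMax
// and fail the check.
bool SamplesWithinBounds(
   const std::vector<double> &samples,
   const CounterBounds &bounds)
{
   for (double sample : samples)
   {
      if (sample < bounds.expectedMin || sample > bounds.expectedMax)
      {
         return false;
      }
   }

   return true;
}
```

A steady process shows samples that hover around a baseline and pass; a process with the leak described above shows a monotonically growing series that eventually breaches the upper bound, which is exactly the signal an automated test can assert on.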