Hi - my Visual C++ (Visual Studio 2019) MFC application (unmanaged code, not .NET) produces reports of unlimited size, and users can have multiple reports open at the same time. Reports can include an unlimited number of pictures. Users frequently create very large reports with large numbers of images, sometimes including full-page images (usually JPEGs). My application uses DirectWrite and Direct2D for reports, and for best performance I cache images in memory where I can (holding onto ID2D1Bitmap objects). Looking at Task Manager, it seems (perhaps not surprisingly) that the cached images account for most of the memory use.

If I get a memory error while handling images (e.g. when loading an image from a file), I free up memory throughout my application (releasing my references to the cached ID2D1Bitmap objects) and try again. This strategy doesn't seem to work that well: even after freeing the memory I still tend to get memory errors, whereas if I don't cache at all in the first place, things usually work better. I did debate calling EmptyWorkingSet after freeing up memory, but have heard that it's not advisable to do that...?
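To be concrete, my free-and-retry strategy is roughly this shape (a simplified, self-contained sketch with made-up names; in the real code the failures arrive as HRESULTs like E_OUTOFMEMORY from WIC/Direct2D rather than as C++ exceptions, and the purge releases ID2D1Bitmap references):

```cpp
#include <new>
#include <utility>

// Hypothetical helper (names are mine): attempt an image-loading operation;
// if it fails for lack of memory, purge the image cache once and retry.
// A second failure is allowed to propagate to the caller.
template <typename Op, typename Purge>
auto LoadWithRetry(Op op, Purge purgeCache) -> decltype(op()) {
    try {
        return op();      // first attempt, cache fully populated
    } catch (const std::bad_alloc&) {
        purgeCache();     // release cached bitmaps application-wide
        return op();      // single retry after freeing memory
    }
}
```

Even with this in place, the retry often fails again, which is what makes me think the problem is address-space fragmentation rather than the raw amount of memory freed.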
My application is 32-bit. In testing, memory errors mainly start to happen when memory usage hits about 1.4 GB.
Can anyone suggest a better strategy for managing memory - one that doesn't involve major changes to my code? Yes, in time I will convert to 64-bit, but I don't want to do that right now.
In principle I would like to stop caching long before I hit memory errors. Can anyone suggest a suitable mechanism/approach for doing that?