Batch processing slows down permanently
Hi Folks,
I have the problem that if I measure more than 10-20 large images, the batch processor becomes slower and slower.
RAM fills up, and once usage reaches 10 GB (of 16) the process is so slow that it is hard to stop at all.
The HW settings (advanced options) do not seem to change anything about that. Any idea what the reason is?
Do you have a ready-to-use batch-process function or macro code - would that resolve the speed problem?
Thanks
Daniel
Best Answers
Hi Daniel,
I ran a 1-image batch multiple times. You can also try that to see whether you get the same times. If not, then you should check the macro. The only difference is that when you run a 1-image batch, the PresetMeasures macro always runs before MeasureAll; when you run a multi-image batch, PresetMeasures runs once and then MeasureAll runs multiple times. PresetMeasures sets up multiple variables, and if MeasureAll changes any of them, the next image will not be processed the same way as the previous one. I haven't debugged it, but it is worth looking at if you see a difference between running 1-image and multi-image batches.
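For example, here is the kind of state leak I mean (roiIndex is a made-up variable, just to illustrate the pattern):

' module-level variable, initialized once by PresetMeasures
Dim roiIndex As Integer

Sub PresetMeasures()
    roiIndex = 0              ' in a multi-image batch this runs only once
End Sub

Sub MeasureAll()
    roiIndex = roiIndex + 1   ' if MeasureAll modifies it, image N+1 starts from a different state
End Sub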
Yuri
Hi Daniel,
Yes, it's possible. By default, commands are kept in the task manager while it's open, which, depending on the commands, may end up using memory. Other than closing the task manager, one way to address that is to change the task-scheduling settings to auto-delete finished commands and keep only a few around; this is done in the Options dialog that you access from the File menu.
Pierre
Answers
There could be multiple reasons for that, depending on the context of your batch macro:
1. Images may not be released (check that you release all your image references; a sketch of what releasing looks like follows below).
2. The Data Collector table becomes too large (if you collect a lot of objects from every image).
...
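For point 1, releasing a reference looks roughly like this (img stands for any image variable your macro holds; a sketch only):

' close the image and drop the reference so it can be reclaimed
img.Modified = False   ' suppress the "save changes?" prompt
img.Close()
img = Nothing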
Can you post your batch macro?
Yuri
Attached is the whole code - the presets run before the batch, measureSmart as the batch.
Don't worry about the specialties - it also gets slower with just the basic measurements (3 ROIs out of 6).
No ROI saving, no data collection, all images closed after processing, etc...
I don't really know why RAM is being wasted.
What exactly do you mean by "release images"?
The doc file is good, but it would be easier to debug if you could provide the complete project.
Can you please attach the IPX file? (Package the project.) Can you also show which batch parameters you use? What is the typical size of your images?
Thanks,
Yuri
Attached as ipx.
The image size is around 200 MB (RGB due to a wrong conversion by the Zen software, hence the 8-bit conversion and calibration).
Batch parameters - Show TM and Display Docs ON, the rest OFF.
I'm not sure if Display OFF would work.
The macro is now a bit different than before, since I'm currently working on it - I'm trying to get data from the classified count and haven't checked how yet...
However, for the basic counts all those options are off (dec... all False).
Thx
Dan
I checked your project. I couldn't run it properly because I am missing some additional files, such as the calibration, options and ROI files used in the macro, but if I ignore those functions, the batch macro runs properly without any image leaks.
All images are closed at the end of each image's processing (by MeasureAll).
The problem could be that if any of the macros in MeasureAll fails (e.g. an ROI file is missing), the batch macro exits without calling the final CloseAll command, and the images accumulate in the application, causing an eventual slowdown once no more RAM is available. Please debug the macro (using breakpoints) to make sure that all macro steps, including CloseAll, are executed.
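One defensive pattern is to wrap the measurement steps in Try...Finally so CloseAll runs even when a step fails (a sketch only; the actual structure of your MeasureAll may differ):

Sub MeasureAllGuarded()
    Try
        ROILoad_and_Measure   ' measurement steps that may fail (e.g. missing ROI file)
    Finally
        CloseAll              ' always executed, so no images are left behind
    End Try
End Sub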
We also have a special "Debugging" project, which is installed by default to the Scripts folder. In the Leaks folder of this project you will find the "CountImages" macro, which reports all visible and invisible images opened in Premier. You can run this macro after executing the MeasureAll macro (or the complete batch) to check whether any images stay in memory. You can also run the "GarbageCollect" macro, which tries to release all .NET memory allocations, before running "CountImages".
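Internally, a cleanup like that comes down to the standard .NET calls; this is my sketch of the idea, not the macro's actual code:

Sub ForceGarbageCollect()
    System.GC.Collect()                    ' reclaim unreferenced objects
    System.GC.WaitForPendingFinalizers()   ' let finalizers release native resources
    System.GC.Collect()                    ' collect objects freed by those finalizers
End Sub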
A couple of notes on macro writing: I noticed that you call macros using the PlayMacro function.
You can also call macros from the same module directly by name:
'call macro
ROILoad_and_Measure
or from CodeCommand:
With Automate.ScriptingCommands.CodeCommand(MeasureAll)
    If .Run() Then
        ROILoad_and_Measure
    End If
End With
Let me know your debugging results.
Regards,
Yuri
I ran the ImageCount macro, with the result that there are no images in the collection,
and I inserted GarbageCollect without much effect (same speed). I also tried inserting waits, e.g. after closing, so that nothing would remain behind, but with no effect either. Maybe it's just my system? I'm running Windows 8.1 Pro on an Alienware 17, i7 core with 16 GB RAM. What do you think?
Your computer spec looks good, and if ImageCount shows no images, then all images are released properly. Do you see memory build-up during batch processing? If yes, then the memory might be allocated (and not released) by something else, e.g. data arrays.
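For example, a module-level collection that is filled for every image but never cleared will grow without bound (hypothetical names, just to show the pattern):

' buffers that survive between images unless explicitly cleared
Dim results As New System.Collections.Generic.List(Of Double)
Dim rawBuffer() As Byte

Sub ReleaseBuffers()
    results.Clear()    ' empty the collection between images
    Erase rawBuffer    ' release the array so the GC can reclaim it
End Sub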
Debugging will be needed to check all the macro steps. I can look at your project, but I will need all the files you use in the batch macro (calibrations, option files, ROIs,...) and a couple of test images. If you can provide all this info, please Click here to upload the files.
Regards,
Yuri
I packed the current modules into an IPX, added the files that match the current settings in Function PresetsMeasure, and uploaded two images (originally there are 4 separate channel images). The ROI file is already there; the one of the second set contains an erroneous ROI (that does not matter - the images are then closed in the next round). I set it to two ROIs measured per image (one round less, but enough) and added a variable in PresetsMeasure in which you can enter the paths for the location of the IQO and now also the CSV data and the IQC files.
I chose "D:\Temp\" for all of them, so if you put everything at this location it will be fine.
I now measure with classification, but for your debugging I turned it off (decClassify = False), since it is not relevant to the speed loss (like all the other specialties such as BG correction, mask saving, etc...).
In the batch processor, PresetsMeasure is Run Before, and then it loops on MeasureAll.
To see the slowdown you need some more images, so just copy and paste the same images into the same folder - this should work (otherwise it would be a huge upload). At the latest after 10 images you should see a significant slowdown, if it's not due to my setup; after 30 it gets really slow.
Note that I search for *DAPI*.tif in the folder specification of the batch processor.
Last but not least, I added a PowerPoint with screenshots of other settings that I have - maybe there is something to improve.
Many thanks for your patience and your efforts!
All files are in the zip. It will take hours to upload, so probably tomorrow.
I checked your zip file. The second image (57_15_L6_3_DAPI_3.tif) is corrupted, so I did the tests with only one image (57_15_L2_3_DAPI_3.tif). Also, 57_15_L2_3_DAPI_3.tif looked gray; I don't know if that's the original look or if the file is also partially corrupted, but it could be loaded into Premier without errors.
I ran the batch multiple times (on the same image) and didn't see any slowdown; it takes 37 seconds per one-image batch on my PC, and the memory in the Task Manager stayed the same before and after every batch.
Note that if the Measurement Data Table is visible, it takes a long time to fill it in, so be sure that the Data Table is hidden when you run the batch. (Try to close all unnecessary panels before running the batch.)
I also noticed that you are adjusting the threshold on the image using your own calculations. Premier has tools that can shift the auto-threshold automatically; use the "Bias" option and Auto-Bright as shown in the attached screenshot.
We are going to release the new Premier 9.2 version soon; if you want to try your macros in the new version now, please send an email to techsupport@mediacy.com. (I did the tests in 9.2.)
Regards,
Yuri
The image size should be 201843 kB for the L2 image; if not, something happened during zipping.
37 seconds sounds reasonable; this is also how it starts on my machine (even a bit faster). How many copies of the image did you try? Did you run the batch on the one image repeatedly, or did you run the batch on 30 copies of the image? The latter is the problem...
I could upload 10 images one by one; then with copies you would have 20, which is sufficient to get the slowdown. The upload speed at my apartment is OK, so this would work.
Thanks for the tip with the threshold - the set parameters were not optimal. For science, I tend to use my own logic rather than a function whose exact code I don't know. It's already hard enough to explain an adaptive threshold to clients, though that is accepted by now (I also had a double-adaptive version with an equation used for the factor - it trains the computer to see what should most likely be measured and worked with any kind of image). My code also allows setting the threshold within a corridor, which is important to avoid significant differences (pathological labeling can be very different from healthy tissue - over- and underestimation).
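The corridor itself is just a clamp, roughly like this (a simplified sketch of my logic):

' keep an auto-computed threshold inside a fixed corridor [tMin, tMax]
Function ClampThreshold(autoT As Double, tMin As Double, tMax As Double) As Double
    Return Math.Max(tMin, Math.Min(tMax, autoT))
End Function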
My old macro had many more functions, and I am just trying to bring them back into the new code format one after the other.
I will send an email to tech support and see whether 9.2 solves it.
Have a nice WE!
Daniel
The multi-image batch was the problem (under that condition it slowed down). But I think I have now solved it: I added switching off the object display after writing the variables. I now have constant time over 40 images (so far) in the multi-image batch.
So the problem must be in the memory of the object display - if it is switched off after each image before closing, then it seems to work.
Have a pleasant WE!
Daniel
This is the function I added.
Public Function HideCount() As SimpleScript
    HideCount = New SimpleScript
    Dim doc1

    ' get the active document
    With Application.DocumentCommands.Active(HideCount)
        .Run(doc1)
    End With

    ' uncheck the measurement overlay so objects are no longer displayed
    With Measure.MeasurementsCommands.ShowOverlay(HideCount)
        .CheckState = MediaCy.IQL.Application.McCommand.mcCheckState.Unchecked
        .Run(doc1)
    End With
End Function
Now I have data from 120 images, and the object-display fix solved part of it, but not all.
The time has now doubled, which is no major thing (in contrast to the near-total stagnation before).
So it's not the object display alone that keeps something in memory, but also the batch processor itself. During the first ~80 images the memory was properly freed (including by the Garbage macro) and nothing accumulated in RAM. Now, after a few hours, it occupies ~1.5 GB more RAM when an image is loaded.
After the evaluation I can give more detail. After the project is finished I can switch to the next version and try.
Best
Dan
The RAM eater is the "Show task manager" option (for IPP 9.1 and Windows 8.1 Pro).
Cheers
Daniel
It's hard to tell without seeing the macro. Can you please post your project and a test image so we can check? Do you use Premier 9.3?
Yuri
I am running v9.2. I can send you code and example images. Do you have an email?
http://www.mediacy.com/support/productupdates
Yuri (et al) --
Just a thought . . .
Is there a macro command that will allow an APP to log / report / debug.print the amount of memory that PREMIER is using?
If so, then perhaps this could be used to identify the MEMORY HOG.
I hope this information is helpful.
-- Matt
Yuri --
Is there a command that will return the amount of MEMORY consumed by PREMIER?
If so, NAN could sprinkle that through the actual APP and find out where the MEMORY is being consumed.
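Something along these lines is what I have in mind - assuming the standard .NET Process API is reachable from a macro (that part is my guess):

Sub LogMemory(tag As String)
    ' report the working set of the running Premier process in MB
    Dim proc As System.Diagnostics.Process = System.Diagnostics.Process.GetCurrentProcess()
    System.Diagnostics.Debug.Print(tag & ": " & (proc.WorkingSet64 \ 1048576).ToString() & " MB")
End Sub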
Thanks.
-- Matt
Yuri,
I installed 9.3 and tested the program. The Debugging macro continues to indicate that not all images are closed after processing completes. So it's not the software version; it must be my amateur coding. ;>
Please let me know how to send you the code. Thank you very much for your assistance.
Click here to upload files.
Yuri
I have uploaded the macro and an example image. I added the CountImages macro at several points in the code to determine whether images are being closed properly. Apparently they are not. I must not be using the correct method to close them. I would be grateful for any advice you can offer.
Thanks for your help
I checked your macro and it doesn't have image leaks. At the end, your macro does show leaking images, but they are not real leaks; they are images without references that temporarily exist in memory. These will be released on their own during garbage collection, or you can force a memory cleanup by running the GarbageCollect macro in the Debugging project:
So your macro is good.
What I also noticed are the prompts to close modified images. You can avoid these user prompts by resetting the Modified flag before closing, like this:
Sub closeAllImages(img, rawMask, erodeMask, tophatImg)
    ' reset Modified before closing to suppress the save prompt
    img.Modified = False
    img.Close
    img = Nothing
    rawMask.Modified = False
    rawMask.Close
    rawMask = Nothing
    erodeMask.Modified = False
    erodeMask.Close
    erodeMask = Nothing
    tophatImg.Modified = False
    tophatImg.Close
    tophatImg = Nothing
End Sub
Regards,
Yuri
Is there a way to avoid creating these unreferenced images that reside temporarily in memory? Or is the best solution simply to run garbage collector periodically? The macro is intended to act as a service, continuously monitoring a folder and processing images as they appear. After several dozen images, the system generates insufficient memory messages.
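For context, the service loop is roughly like this simplified sketch (ProcessImage stands in for the real measurement macro; the periodic GC is the workaround I am asking about):

Sub WatchFolder(path As String)
    Dim seen As New System.Collections.Generic.HashSet(Of String)
    Dim count As Integer = 0
    Do
        For Each f As String In System.IO.Directory.GetFiles(path, "*.tif")
            If seen.Add(f) Then                      ' Add returns False for files already handled
                ProcessImage(f)                      ' placeholder for the real processing macro
                count += 1
                If count Mod 10 = 0 Then
                    System.GC.Collect()              ' periodic cleanup every 10 images
                    System.GC.WaitForPendingFinalizers()
                End If
            End If
        Next
        System.Threading.Thread.Sleep(5000)          ' poll the folder every 5 seconds
    Loop
End Sub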
Thanks for your help.