The image is 1280x720 RGB; the largest layers are around 4800x3400. There are maybe 10 layers at most at any one time.
The original intent was to manipulate, copy, and create layers as needed within a single image, make a new layer from visible, write that out, then remove the old layers and start over for the next frame. I can get individual commands and short sequences to appear to have the desired effect when typed into the console (after a lot of reviewing source on GitHub -- this is a pretty cranky API; don't get me started on the "parent" parameters of the *_insert_* methods!).

Thinking I was missing something about properly managing GIMP or Python, I tried isolating a much simpler set of commands: copy a reference layer from one image into a new displayed image, then destroy that new image and start over. Just entering them into the console repeatedly, I can watch the GIMP process's memory usage jump on each iteration, at first accumulating just under 1 MB per step, then growing steadily larger. From the Layers dialog my layer count doesn't seem to grow beyond what I intend, though I suppose there could be layers not properly inserted or deleted that wouldn't show there.
I'm stumped on how to use Python in GIMP to create this animation (I came to GIMP because Synfig couldn't handle all of my over one hundred source images). I haven't found any example scripts that deal with this scale yet. Do the experts on this list have any corrections to what I'm doing wrong, or any tips or tricks for managing this issue? For example, if it's not my script, are certain approaches leakier than others (opacity changes vs. visibility changes, copy-pasting visible into a re-used layer vs. creating a new layer from visible, layer copies within an image vs. across images, layer creation/destruction vs. image creation/destruction)?
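To make the intended per-frame workflow concrete, here is a rough sketch of what I'm trying to do, written as a function so the PDB calls are explicit. This is a sketch under assumptions, not working code: `gimp` and `gimp.pdb` exist only inside GIMP's Python-Fu interpreter (so I pass them in as parameters here), the names `render_frame` and `out_pattern` are my own invention, and `layers[1]` assumes the RGB reference layer sits directly below the transparent top layer.

```python
# Sketch of one frame of the intended workflow, assuming a GIMP 2.8-era
# gimp-python environment. `gimp` and `pdb` are passed in because they
# only exist inside GIMP's Python-Fu interpreter.

def render_frame(gimp, pdb, src_image, frame_no, out_pattern="frame_%04d.png"):
    """Copy the reference layer into a fresh image, flatten, save, clean up."""
    img = pdb.gimp_image_new(1280, 720, 0)          # 0 = RGB image type
    # Assumption: layers[1] is the RGB layer below the transparent top layer.
    lyr = pdb.gimp_layer_new_from_drawable(src_image.layers[1], img)
    pdb.gimp_image_insert_layer(img, lyr, None, 0)  # parent=None -> top level
    pdb.gimp_image_flatten(img)
    name = out_pattern % frame_no
    pdb.gimp_file_save(img, img.active_drawable, name, name)
    gimp.delete(img)  # explicitly drop the image and Python-Fu's proxy for it
    return name
```

The `gimp.delete(img)` at the end is my attempt to release the image explicitly rather than relying on a display delete to do it; whether that actually curbs the memory growth is exactly what I'm asking about.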
Here's the simple command sequence that demonstrates the "leak". I start with an image containing an RGB layer of about 4800x3400 (below a transparent top layer). Open the Task Manager and watch memory use in the GIMP process. Open the Python-Fu console and initialize:

    gp = gimp.pdb
    src = gimp.image_list()[0]

(gimp.image_list() returns a list of images, so I take the first one.) Then iterate the following:
    img = gp.gimp_image_new(1280, 720, 0)    # 0 = RGB
    dsp = gp.gimp_display_new(img)
    # layers[1] is the RGB reference layer below the transparent top layer
    lyr = gp.gimp_layer_new_from_drawable(src.layers[1], img)
    img.insert_layer(lyr, position=0)
    gp.gimp_display_delete(dsp)

I see memory use jump after each iteration. Any help is appreciated.

twv@

_______________________________________________
gimp-developer-list mailing list
gimp-developer-list@xxxxxxxxx
http://mail.gnome.org/mailman/listinfo/gimp-developer-list