
tips on working with gigantic files?


tips on working with gigantic files? Casey Connor 05 May 22:01
  tips on working with gigantic files? Steve Kinney 05 May 22:41
   tips on working with gigantic files? Ken Moffat 05 May 22:52
   tips on working with gigantic files? Casey Connor 06 May 05:41
    tips on working with gigantic files? Casey Connor 01 Jun 22:38
  tips on working with gigantic files? Rick Strong 06 May 03:36
  tips on working with gigantic files? Liam R E Quin 06 May 04:48
Casey Connor
2017-05-05 22:01:15 UTC (over 7 years ago)

tips on working with gigantic files?

Hi -- I'm trying to do some simple work on a 0.5 GP image in GIMP 2.9.5 (on Linux). (Image is ~23k by 22k pixels, 16-bit RGBA).

I can open it fine, and it currently has two partial-canvas layers that are cross-faded into each other (I'm manually stitching two slices of a big Hugin panorama).

The RAM is just about maxed -- I have 16GB, and it's enough to do some simple things to the image but I can't do much else.

I need to flatten the image into one layer so I can have a hope of doing further processing on it, but just having the image open puts it too close to the RAM limit to make that possible: I can't flatten the image, I can't export as PNG, etc. Whatever I try, the RAM soon maxes out and the system grinds to a halt, necessitating a hard power cycle.

I tried using ImageMagick's 'convert' to turn the .xcf into a PNG; no luck, as it doesn't work with 2.9.x XCF files. Maybe there is another route along these lines? (For some reason GIMP won't let me save as an older-version XCF, saying that the file uses incompatible features; I'm not sure which: it's just two layers, one with a layer mask. The layers aren't full canvas size, maybe that's it?)
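
For reference, the sort of invocation I mean is something like the below (filenames are just placeholders):

  # flatten all layers and export as PNG -- this works on XCFs from
  # stable GIMP releases, but chokes on 2.9.x-format files
  convert panorama.xcf -flatten panorama.png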

Any tricks I might be able to pull? Splitting out individual channels? Managing or virtualizing memory somehow? (I notice that my swap is never used -- not sure if it's misconfigured or if it's just not possible to use it in this case for whatever reason...) I don't care if the export procedure is slow, I just need to do it once to get the image down to a manageable size in RAM... maybe I could whip up a VM and give it tons of disk-backed virtual memory?

I tried all this on Windows, too (using Partha's GIMP build) -- it actually exported something, but it was mostly black with bits of corrupt tiles here and there. Then I tried again with a larger tile cache and it crashed the OS and somehow corrupted my BIOS... yikes.

Unfortunately I can't afford more RAM.

Thanks for any ideas!

-c

Steve Kinney
2017-05-05 22:41:03 UTC (over 7 years ago)

tips on working with gigantic files?

On 05/05/2017 06:01 PM, Casey Connor wrote:

Hi -- I'm trying to do some simple work on a 0.5GP image in GIMP 2.9.5 (on Linux). (Image is ~23k by 22k pixels, 16bit RGBA).

I don't think this will solve the problem you described, but I have found that clearing the undo history after every operation done on open files frees a lot of memory in cases where there is /almost/ too much data for the system to cope with.

Another possibility: Crop the images you are trying to merge down to just the bits that overlap, process those, then add the cropped parts back in. Again, this might not be practical in your situation, but it's all I can think to suggest from similar problems I have run into.
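
(If the crop itself is what blows the RAM budget in GIMP, ImageMagick can cut the bands out of already-exported slices instead; the geometry is widthxheight+xoffset+yoffset, and all the numbers and filenames here are only placeholders:)

  # cut the overlapping band out of each exported slice
  convert top-slice.png -crop 23000x2000+0+20000 +repage top-band.png
  convert bottom-slice.png -crop 23000x2000+0+0 +repage bottom-band.png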

If a friend with a more powerful box will let you do this on their machine, a USB stick and a copy of the portable version of GIMP might do the trick...

:o/

Ken Moffat
2017-05-05 22:52:16 UTC (over 7 years ago)

tips on working with gigantic files?

On Fri, May 05, 2017 at 06:41:03PM -0400, Steve Kinney wrote:

[...]

Yeah, the XCF files in 2.9 are big. I'm tempted to suggest adding a huge swap file - not sure if it would help (watching a system swap heavily is not pretty).

ĸen

I live in a city. I know sparrows from starlings.  After that
everything is a duck as far as I'm concerned.  -- Monstrous Regiment
Rick Strong
2017-05-06 03:36:22 UTC (over 7 years ago)

tips on working with gigantic files?

Just out of curiosity, why such a huge file? What do you intend to do with the finished product if you could get your computer to work (if you don't mind me asking)?

Rick S.

On Friday, May 05, 2017 6:01 PM, Casey Connor wrote:

[...]

Liam R E Quin
2017-05-06 04:48:36 UTC (over 7 years ago)

tips on working with gigantic files?

On Fri, 2017-05-05 at 15:01 -0700, Casey Connor wrote:

[...]

I need to flatten the image into one layer so I can have a hope of doing further processing on it, but just having the image open puts it too close to the RAM limit to make that possible: I can't flatten the image, I can't export as PNG, etc. Whatever I try, the RAM soon maxes out and the system grinds to a halt, necessitating a hard power cycle.

First, set GIMP's tile cache to about three quarters of physical memory (I think I have mine at 16GBytes right now actually; I have 32G of RAM).
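
(You can also set this in the config file rather than the Preferences dialog; the path below is the usual one for a 2.9 build but may differ on your install, and if a tile-cache-size line already exists you should edit it rather than append:)

  # ~12 GB tile cache, i.e. about 3/4 of a 16 GB machine
  echo '(tile-cache-size 12g)' >> ~/.config/GIMP/2.9/gimprc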

Next, add a swap partition or a large swap file to Linux. If you have the space, add 30GBytes or more. This will be slow, but it will stop the system from crashing.
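
(One way to set that up, assuming a filesystem where fallocate allocates real blocks -- on some filesystems you would use dd instead:)

  # create and enable a 30 GB swap file
  sudo fallocate -l 30G /swapfile
  sudo chmod 600 /swapfile
  sudo mkswap /swapfile
  sudo swapon /swapfile
  # undo later with: sudo swapoff /swapfile && sudo rm /swapfile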

As others have said, show the undo history and press the Trash icon at the bottom after every operation. If you make a mistake there will be no undo available... but it will save a lot of memory. (I prefer doing this to reconfiguring GIMP.)

To let GIMP fork the PNG export process, run this as root:

echo 1 > /proc/sys/vm/overcommit_memory

or, if you prefer (careful with the quotes here):

sudo sh -c "echo 1 > /proc/sys/vm/overcommit_memory"

Don't leave your system running like this long term. It lets processes allocate more memory than you have!
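
(The same switch is reachable through sysctl, which makes it easy to flip back afterwards; 0 is the kernel's default heuristic mode:)

  sudo sysctl vm.overcommit_memory=1   # allow overcommit for the export
  sudo sysctl vm.overcommit_memory=0   # restore the default when done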

it's just two layers, one with a layer mask. The layers aren't full canvas size, maybe that's it?

I can flatten the image for you here if you need it.

I work with large images all the time - e.g. scanning A3/tabloid at 2400dpi - although the layer mask will considerably increase the amount of memory needed.

Liam (ankh on IRC)

Liam R E Quin 
Web slave at www.fromoldbooks.org -
Words and Pictures from Old Books
Casey Connor
2017-05-06 05:41:31 UTC (over 7 years ago)

tips on working with gigantic files?

I don't think this will solve the problem you described, but I have found that clearing the undo history after every operation done on open files frees a lot of memory in cases where there is /almost/ too much data for the system to cope with.

Thanks, great tip -- As you say, I think it might not help here because I'm opening the file and immediately trying to export (i.e. no undo history) but that's great to keep in mind for the future.

Another possibility: Crop the images you are trying to merge down to just the bits that overlap, process those, then add the cropped parts back in.

Ah! Yes! I don't think I can combine them in gimp (due to RAM), and I don't think I can even crop the image now (same RAM overflow issue) but I bet I can crop to the overlap areas, do the overlap, export that, and then combine the three sections (top, overlap, and bottom) with the "montage" command. Thanks for the insight!
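
(Something like this is what I have in mind -- the filenames are placeholders, and convert with -append would do the same vertical stacking:)

  # stack the three exported strips vertically with no padding between them
  montage top.png overlap.png bottom.png -tile 1x3 -geometry +0+0 full.png
  # equivalent: convert top.png overlap.png bottom.png -append full.png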

@Ken & @Liam Re: swap -- I did try this with a 16GB swap enabled and it never seemed to swap; but I'll try it with overcommit enabled as Liam suggested (you implied that this will trigger GIMP to fork the PNG export?).

Currently, when it maxes out on RAM, everything slows to a crawl; as in, moving the mouse causes the pointer to update its position once every few seconds; it takes minutes just to move the mouse to the corner of the screen, etc. GUI interaction (e.g. trying to close a window or switch to a v-term) doesn't seem possible, no matter how long I wait. In other words, the system is brought to its knees. Is that what heavy swapping can look like, and what I should expect even if overcommit is enabled? I just want to know when I should let it go overnight and expect that it's slowly working through the file, and when it's actually just spinning its wheels and not getting anything done...
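
(For whatever it's worth, what I've been trying in order to tell the difference is watching the swap traffic while it runs, along these lines:)

  # sample memory/swap activity every 5 seconds; the si/so columns are
  # pages swapped in/out per second -- steady nonzero values suggest
  # slow progress rather than a total stall
  vmstat 5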

I will also play with tile cache size and so forth (it has been at 8GB; undo history was also at 8GB, but again, I was just opening and then exporting...)

@Rick:

Just out of curiosity, why such a huge file? What do you intend to do with the finished product if you could get your computer to work (if you don't mind me asking)?

Very fair question. :-) I just wanted to make a zoomable 'gigapixel' image for the heck of it. No true demand that it be so huge. Maybe I'll print it large-scale some day if it turns out well. For now I'm mainly just honing my skills on the techniques and learning what my computer can handle. (And the only reason it's sliced in two parts is that Hugin can't write out the image in one piece without running out of RAM.)

Thanks, everyone, for the help, -c

Casey Connor
2017-06-01 22:38:28 UTC (over 7 years ago)

tips on working with gigantic files?

Hey, just wanted to follow up on the gigapixel editing...

I tried the overcommit_memory thing, clearing undo history, making a big swap, changing the tile cache size, etc. Nothing seemed to help with my particular image on my particular system, but I appreciate the tips. I guess I just need a lot more memory. :-)

I wasn't able to do any editing or adjustment on the image, since most tools would crash out the program, but I was able to stitch it all together in a mostly satisfying way by piecing it out and using montage. I then put it online using OpenSeadragon.
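
(For anyone else trying this: the tiles OpenSeadragon displays can be generated with libvips, which streams the image instead of holding it all in RAM. This is just a sketch of the approach with placeholder filenames, not necessarily exactly what I ran:)

  # build a Deep Zoom (DZI) pyramid that OpenSeadragon can display;
  # writes nfalls.dzi plus an nfalls_files/ directory of tiles
  vips dzsave full.png nfalls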

I think for now I will not work with such large images. It's just too painful. :-) But it's nice to have some tools to work with medium size stuff.

Thanks for the help!

-Casey

P.S. - Here's the pic, if anyone's interested: http://caseyconnor.org/pub/image/osd/nfalls/

Slightly edited (small) version is here.