OutOfMemoryError
Hi, I have a project coming up. I work with exporting PNGs, pretty big ones. I have 16GB of RAM and a 4GHz 6-core/6-thread CPU in my PC, but I got a memory error. Is there any way to increase memory use in NodeBox? I can see in Task Manager that it uses only half (9GB available, 6GB in use at the moment).
Maybe I'm using NodeBox for images it's not supposed to handle, but it's my favorite program, and now I'm struggling. Photoshop and Illustrator had no problem at similar sizes: 1400mm x 2000mm / 16571 x 23658 px.
I wish there were an option for memory performance like in most other programs.
Support Staff 1 Posted by john on 10 Apr, 2019 05:02 AM
Hi again Isg,
I'm not sure about your first question. Here is a thread about increasing memory from back in 2016:
http://support.nodebox.net/discussions/support/310-out-of-memory-are-you-trying-to-process-and-infinite-list-with-a-csv-file
I don't know if that advice still applies in 2019. There may still be a limit due to the aging Java virtual machine in the bowels of NodeBox. I am adding Frederick to this discussion, but he's been too busy to help out in the forum lately.
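Since NodeBox sits on top of the Java virtual machine, advice about raising its memory limit usually comes down to the JVM's maximum-heap flag. As a hedged sketch only: if your build lets you launch the application jar directly, the standard JVM incantation looks like the following (the jar name, path, and the 8g value are assumptions for illustration, not the actual NodeBox install layout):

```shell
# Sketch: launch a JVM application with a larger maximum heap.
# "nodebox.jar" and "8g" are placeholders -- check your actual
# install for the real launcher name and pick a heap size that
# leaves room for the rest of your system.
java -Xmx8g -jar nodebox.jar
```

If the app is only started through a native launcher (as Frederik's note about needing a specific PC build suggests), this flag may be baked in and not user-adjustable.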
16571 x 23658 is pretty big (for NodeBox). There seems to be a 14400 x 14400 limit when exporting PDFs - I believe this is a PDF limit, not a NodeBox limit. I think the largest PNG export I've done was 14000 x 10500.
When it comes to memory errors, though, it's the number of objects, not the dimensions, that matters. I can draw millions of simple rectangles, but for more complex paths like text I hit the wall at around 30,000. If you are drawing a lot of text, that could be a problem. (If so, I have some workarounds to reduce the problem somewhat.)
As I said in my earlier note, it's much more common for memory errors to happen due to inefficiencies in your code. It's fairly easy in NodeBox to accidentally draw a million shapes on top of each other when you only meant to draw a thousand. You can try rendering one node at a time and looking at things in the Data view to spot that issue. But I think you are already aware of that.
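The "accidental million" failure mode happens because copy counts multiply through a chain of nodes. A back-of-envelope check in plain Python (a small model of the arithmetic, not the NodeBox API) shows how modest per-node counts explode:

```python
# Object counts multiply through a chain of copy-style nodes,
# so modest per-node counts explode quickly.

def total_shapes(copy_counts):
    """Total shapes after chaining copy nodes with the given counts."""
    total = 1
    for n in copy_counts:
        total *= n
    return total

# A 100-point grid feeding a 100-copy node that feeds another
# 100-copy node: 100 * 100 * 100 shapes.
print(total_shapes([100, 100, 100]))  # 1000000 -- a million, not a hundred
```

Checking the Data view count at each node in the chain is the quickest way to catch the node where the count jumps unexpectedly.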
If you add a 50% scale node at the bottom of your network and cut document size in half does that solve the memory issue? (Shouldn't matter for vectors, but might matter when converting to pixels for display in or out of NodeBox). If not, and if you don't have any inadvertent inefficiencies, you may just have too many objects. If you ungroup beneath your final combine node, how many paths do you have?
You can send me your NodeBox file if you want and I can see if I have any better luck on my Mac. Wish I could be more help.
John
Support Staff 2 Posted by john on 10 Apr, 2019 05:05 AM
Oh - one other thing to try. The next time you get a memory error, save your NodeBox doc, reboot your PC, relaunch NodeBox, and try again.
NodeBox does have some memory leaks which can crop up after you've been at it for a while. Sometimes rebooting helps.
Support Staff 3 Posted by Frederik De Ble... on 10 Apr, 2019 10:43 AM
Are you on Mac or PC? On Mac it's pretty easy to increase the memory size. On PC, I think I need to make a specific build.
NodeBox has been around for a while, so we've been steadily increasing its memory limits to keep up with current PCs, but the limit is still quite conservative. So I can create a specific "high memory" build for you if it's required.
But also, what John said :-) Make sure you don't have thousands of identical shapes overlapping when exporting a PDF.
Support Staff 4 Posted by john on 27 Apr, 2019 03:44 PM
Still there Isg?
One more thought re out of memory errors...
I just encountered the same thing myself, not because my image was too big, but because I was doing too much number crunching on too big of a file. I increased my application memory as per Frederick's instructions, but that only helped a little.
I did a careful study of what was causing the problem in my case. It all came down to slicing. It turns out that if you do repeated slicing of a large table sooner or later memory management / garbage collection kicks in. Up to that point the processing time was linear and reasonably fast as I increased the number of items I needed to process. But beyond a certain number of items the processing time started to increase exponentially and not long after that I hit the "out of memory" wall. (I was even able to watch the memory management kick in using a Mac's Activity Monitor.)
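A small cost model (my guess at the mechanism, not NodeBox internals) makes the nonlinear blow-up plausible: in most list implementations a slice copies its elements, so repeatedly slicing the remainder of a big table touches on the order of n-squared elements in total, while fixed-size chunks touch each row only once:

```python
# Model of why repeated slicing of a large table blows up:
# each slice copies its elements, so slicing the remaining rows
# once per item costs O(n^2) element copies overall.

def elements_copied_per_item(n):
    """Take rows[i:] once for each of n items: n + (n-1) + ... + 1 copies."""
    return sum(n - i for i in range(n))

def elements_copied_chunked(n, chunk_size):
    """Fixed-size chunks: each row is copied into exactly one chunk."""
    return n

n = 4800
print(elements_copied_per_item(n))       # 11522400 -- ~11.5 million copies
print(elements_copied_chunked(n, 800))   # 4800 -- one copy per row
```

Past the point where those transient copies exceed available heap, garbage collection starts thrashing, which matches the exponential-looking slowdown just before the "out of memory" wall.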
This probably does not apply to your situation, but the workaround I came up with would probably work for you too: break up your work into smaller pieces, then stitch them together.
In my case I processed the data in 6 chunks of 800 units at a time, saved each pass as a separate CSV, then read them all back in and stitched them into a single CSV with a combine node.
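Outside of NodeBox, the same chunk-and-stitch pattern can be sketched in a few lines of Python with the standard `csv` module. The file names, chunk size, and the per-row "processing" below are made up for illustration:

```python
# Sketch of the chunk-and-stitch workaround: process rows in
# fixed-size chunks, write each pass to its own CSV, then read the
# passes back and combine them into a single file.
import csv
import os
import tempfile

def process(row):
    return [int(row[0]) * 2]  # stand-in for the real number crunching

rows = [[i] for i in range(4800)]  # the "big table"
chunk_size = 800
workdir = tempfile.mkdtemp()
passes = []

# One pass per chunk, each saved as its own CSV.
for start in range(0, len(rows), chunk_size):
    path = os.path.join(workdir, f"pass_{start // chunk_size}.csv")
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(process(r) for r in rows[start:start + chunk_size])
    passes.append(path)

# Stitch the passes back into a single CSV (the "combine" step).
combined = os.path.join(workdir, "combined.csv")
with open(combined, "w", newline="") as out:
    writer = csv.writer(out)
    for path in passes:
        with open(path, newline="") as f:
            writer.writerows(csv.reader(f))

with open(combined, newline="") as f:
    print(sum(1 for _ in csv.reader(f)))  # 4800 -- every row survives the round trip
```

Each pass only ever holds one chunk's worth of intermediate data in memory, which is the whole point of the workaround.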
You could probably do something similar. Make the top half of your poster in one pass, the bottom half in a second pass, then stitch them together. You could do the stitching in or out of NodeBox.
If you can save each pass as an SVG, you can then load them all back in inside of NodeBox and combine them there. Once combined, you could then save the whole thing in whatever format you want.
If you need to save each half (or quarter or whatever) as a PNG you will then need to fit them back together in Photoshop or Sketch or whatever.
Stitching is a pain, but it's not too bad. It's there as a workaround if you need it.
John
5 Posted by isg on 27 Apr, 2019 04:03 PM
Hi! Thanks for all the answers, and sorry for the late follow-up! The way I fixed the issue was to rebuild some scatter nodes and lower some of the computation. But I also restarted as Frederik said, so maybe that worked too. It would be interesting to have a PC version that can utilize the full 16GB of RAM (and more) and more multithreaded processing.
As you mention, @john, the size itself really doesn't matter. Being able to work at whatever size you want is what I love about NodeBox, but yes, "too much number crunching on too big of a file" for sure. Of course, bigger sizes mean more information. I work with granulars and many scatterings in one node build, plus a lot of random math, then combine multiple node rows and export a PNG for print or for further work in other programs.
"break up your work into smaller pieces then stitch them together" This I'm trying to already as much as possible.
Really, thanks for all the answers and suggestions! I will follow the news for a version that addresses this, and will also try the save-SVG-per-pass / load-back method. Although these days I'm working on smaller sizes, max 2 meters, not 4 meters :) There's a big difference.