Out of memory error using mainframe copybook as a subfilter
Posted: Wed Nov 30, 2011 9:31 am
I have a number of mainframe files that I want to convert in a single batch using the AWESOME "Mainframe Copybook Filter" in TextPipe Pro. The files have a bunch of different formats, but I can easily determine the format from the filename, so I can use the right copybook filter. My filter list looks like this (I have more than 3 file types, but you get the idea; a rough sketch of the dispatch logic follows the list):
* Restrict to filenames matching [filename pattern A]
* -- Mainframe Copybook Filter [copybook A]
* Restrict to filenames matching [filename pattern B]
* -- Mainframe Copybook Filter [copybook B]
* Restrict to filenames matching [filename pattern C]
* -- Mainframe Copybook Filter [copybook C]
This works exactly as expected for small files (less than ~1 GB). However, for larger files, I get an "out of memory" error from TextPipe at or near the end of the Mainframe Copybook filter, even though Windows Performance Monitor shows that I still have around 2 GB of physical RAM free (memory usage does slowly increase as TPP processes the file, though). If I create a filter list that contains just a single Mainframe Copybook filter (NOT as a subfilter), I can process multi-GB files with no problem, and the memory usage stays virtually constant the entire time. I'm running Windows 7.
I don't understand why TPP is running out of memory in this situation. Is there another/better way to accomplish this? I have tried both TPP 8.9.6 and 8.9.9, and the "out of memory" error occurs in both versions. Thanks!
Kevin