Memory issue with huge directory structures
Posted: Tue Jun 01, 2010 4:13 am
I am testing the product, but we have MANY generated files in a very deep directory structure.
Number of files = above 100,000
Directory levels = 10 or higher
When running a simple Extract to search for a date such as "01-01-2002", memory usage climbs to the maximum: on a 4GB RAM box, about 3.5-3.6GB is occupied. It also takes a very long time just to build the list of files, before TextPipe even begins to search.
It seems it is caching the entire file structure and all file names, which seems odd. Maybe we are doing something wrong, I am not sure.
We use a wildcard like c:\directory\*.* with the recursive option enabled.
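
To make clear what I mean by "caching" versus streaming, here is a rough Python sketch of the two approaches (purely an illustration on my part; I have no idea how TextPipe actually enumerates files internally):

    import os
    from fnmatch import fnmatch

    def files_matching(root, pattern="*.*"):
        # Streaming: yield matching paths one at a time, so memory stays
        # roughly constant no matter how many files the tree contains.
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if fnmatch(name, pattern):
                    yield os.path.join(dirpath, name)

    # What I would expect: search each file as it is found.
    for path in files_matching(r"c:\directory"):
        pass  # open and search this file, then move on

    # What appears to be happening: every path is collected in RAM
    # before any searching starts, which is costly with 100,000+ files.
    all_paths = list(files_matching(r"c:\directory"))
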
If this is normal behaviour for TextPipe, then it is unfortunately not the product we need.
Thank you in advance for a quick reply.