Memory Issue when huge directory structures

Get help with installation and running here.

Moderators: DataMystic Support, Moderators

rumble
Posts: 1
Joined: Tue Jun 01, 2010 4:06 am

Memory Issue when huge directory structures

Post by rumble »

I am testing the product, but we have MANY generated files in a very deep directory structure.
Number of files = above 100,000
Directory levels = 10 or higher

When running a simple Extract to search for a date such as "01-01-2002", memory usage climbs to its maximum: on a 4GB RAM box, 3.5-3.6GB is occupied, and it takes a very long time just to build the list of files, before TextPipe even begins to search.
It seems to be caching the entire file structure and all the file names, which is odd. Maybe we are doing something wrong, I'm not sure.
We use a wildcard like c:\directory\*.* with recursion enabled.
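For illustration, the symptom matches the difference between eagerly collecting every matching path before any processing starts, and walking the tree lazily. A minimal Python sketch of the two approaches (a generic illustration, not TextPipe's actual code):

import os
from typing import Iterator

def collect_all(root: str) -> list[str]:
    """Eager approach: build the complete file list up front.
    Memory grows with the total number of files (100,000+ paths
    held at once), and no searching starts until the walk ends."""
    paths = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            paths.append(os.path.join(dirpath, name))
    return paths

def iter_files(root: str) -> Iterator[str]:
    """Lazy approach: yield one path at a time. Memory stays
    roughly constant and searching can begin immediately."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            yield os.path.join(dirpath, name)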

If this is normal behaviour for TextPipe, then it is unfortunately not the product we need.

Thank you for answering so quickly.
DataMystic Support
Site Admin
Posts: 2227
Joined: Mon Jun 30, 2003 12:32 pm
Location: Melbourne, Australia
Contact:

Re: Memory Issue when huge directory structures

Post by DataMystic Support »

Hmmm - it sounds like we need to handle filenames through a pipe as well as the data itself.
DataMystic Support
Site Admin
Posts: 2227
Joined: Mon Jun 30, 2003 12:32 pm
Location: Melbourne, Australia
Contact:

Re: Memory Issue when huge directory structures

Post by DataMystic Support »

We have rebuilt TextPipe so that it has a file-gathering engine and a text-processing engine working in parallel.

Text processing now starts immediately - no matter the size of the job.

The file engine sleeps if there are too many files pending, and will wake up again when most have been processed.
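In outline this is a producer/consumer design with a bounded queue supplying the backpressure. A minimal Python sketch of the idea (an illustration only, not TextPipe's actual implementation; PENDING_LIMIT, the example path, and the print stand-in are invented for the sketch):

import os
import queue
import threading

PENDING_LIMIT = 1000  # assumed cap on files queued but not yet processed

def gather_files(root: str, pending: "queue.Queue[str | None]") -> None:
    """File-gathering engine: walks the tree and enqueues paths.
    put() blocks when the queue is full, so the gatherer effectively
    sleeps until the processor has drained the backlog."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            pending.put(os.path.join(dirpath, name))
    pending.put(None)  # sentinel: no more files coming

def process_files(pending: "queue.Queue[str | None]") -> None:
    """Text-processing engine: consumes paths as soon as they arrive,
    so processing starts immediately regardless of job size."""
    while (path := pending.get()) is not None:
        print("searching", path)  # stand-in for the real search/extract filter

pending: "queue.Queue[str | None]" = queue.Queue(maxsize=PENDING_LIMIT)
gatherer = threading.Thread(target=gather_files, args=(r"c:\directory", pending))
gatherer.start()
process_files(pending)
gatherer.join()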

You can check out a technical preview of v8.6 here:
http://www.datamystic.com/textpipeprobeta.exe