As the pending queue size increases, the CPU time seems to increase to the point that it reaches 100%. Modifying (selecting and deleting) items in the queue can take a LONG time. Minutes even.
I suspect that the pending queue is maintained as a sorted list in memory, and that deleting an item from the top or middle means shifting the entire remaining structure forward.
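If that guess is right, the slowdown follows naturally: removing an item from the front of an array-backed list shifts every remaining entry, so clearing a queue of n items that way costs on the order of n-squared element moves. A minimal Python sketch of that cost, purely to illustrate the data-structure behaviour and not a claim about CoreFTP's actual internals:

import time

def drain_from_front(n):
    # Stand-in for the pending queue as a plain array-backed list.
    queue = list(range(n))
    start = time.perf_counter()
    while queue:
        queue.pop(0)   # delete from the head: every remaining item shifts forward
    return time.perf_counter() - start

for size in (1500, 7000):
    print(f"{size:>5} items drained in {drain_from_front(size):.3f} s")

A structure like collections.deque (or a linked list) makes the same head removal constant-time, which is why queue length wouldn't matter nearly as much if something like that were used.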
100% CPU load on transfer seems related to pending Q size
More data
I'm seeing high CPU utilization while it is processing when the queue is LONG, though I don't have a precise threshold for LONG. I was seeing heavy CPU utilization with a queue length of 7000+ during normal processing, but at 1500 the CPU is less than 10%. This is on a 2.4 GHz CPU with 1 GB of main memory.
BTW, the COREFTP process had 250 MB+ of physical memory allocated at that point.
Going to command line helps
It seems that the command line mode is the only one that doesn't try to build and maintain a "pending" queue of files, so switching to command line mode has solved my particular problem.
It also helps because I'm trying to back up a Windows user directory. With the GUI it's tempting to simply drag the top level of the user from c:/documents and settings/ to the target. That's quick and dirty, but it captures such large directories as the internet caches for IE and Firefox, and the temp directories. That's where the tens of thousands of files are. Those files are only caches, so there's no need to back them up. Worse, those files could be replaced with other files the very next day, and since CoreFTP doesn't have a mode to delete files on the target that are gone from the source, the directory size would just grow and grow.
In command mode it's relatively easy to copy only the real data file directories (14 in my case) to the remote location; the temp and cache directories are avoided. Yet the directory structure on the remote is such that one GUI drag would restore it back to the right place.
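Here's a rough Python sketch of the selection I mean: keep only the data directories directly under the profile and skip the cache/temp ones, then feed each path to whatever Core FTP command line you use. The profile path and the skip list are just examples for my setup, and the actual Core FTP command-line syntax is omitted since it depends on your site profile and version:

import os

USER_PROFILE = r"C:\Documents and Settings\myuser"   # example profile path
SKIP = {"Local Settings"}   # on XP this holds the IE/Firefox caches and the temp files

# Keep only the real data directories directly under the profile.
data_dirs = sorted(
    d for d in os.listdir(USER_PROFILE)
    if os.path.isdir(os.path.join(USER_PROFILE, d)) and d not in SKIP
)

# One transfer per directory keeps each pending queue small; plug each of
# these paths into your Core FTP command line.
for d in data_dirs:
    print(os.path.join(USER_PROFILE, d))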