Downloading Big Files - Best Settings? Potential Problems?

I am trying to back up some websites by downloading 50 gzips created as cPanel backups.
The download works fine on small files but dies on larger ones.
Some are quite large (350MB), and these are the most problematic... these downloads die after about 50MB. The status bar just stops (I've waited up to an hour in case it was just "thinking"). I can get the download to resume by disconnecting and redoing the download (dragging the same file to the location where the partial download went) and then clicking "Resume" in the "Overwrite?" dialog.
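As I understand it, that resume trick works because FTP can restart a transfer at a byte offset (the REST command). Here is a rough sketch of the same idea in Python's standard ftplib; the host, credentials, and filename are made up for illustration:

[code]
import os
from ftplib import FTP

# Hypothetical connection details, for illustration only
HOST, USER, PASSWORD = "ftp.example.com", "user", "secret"
REMOTE = LOCAL = "site-backup.tar.gz"

ftp = FTP(HOST, timeout=60)   # raise an error instead of hanging forever
ftp.login(USER, PASSWORD)
ftp.set_pasv(True)            # passive mode: client opens the data connection

# Pick up from wherever the previous attempt died
offset = os.path.getsize(LOCAL) if os.path.exists(LOCAL) else 0

with open(LOCAL, "ab") as f:
    # rest=offset sends "REST <offset>" so the server skips what we already have
    ftp.retrbinary("RETR " + REMOTE, f.write, rest=offset)

ftp.quit()
[/code]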
My question is about my settings: what are the optimal settings for keep-alive, passive mode, and timeout when it comes to large files?
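From what I've read, a stall like this often means a router or firewall silently drops the idle control connection during a long transfer, which is what keep-alive is supposed to prevent. A minimal sketch, using the same hypothetical server as above, of turning on TCP keepalives and a read timeout at the socket level:

[code]
import socket
from ftplib import FTP

ftp = FTP()
ftp.connect("ftp.example.com", 21, timeout=30)  # 30s read timeout
ftp.login("user", "secret")
ftp.set_pasv(True)

# Enable TCP keepalives on the control socket, so a NAT/firewall does
# not silently drop it while the data connection is busy for minutes
ftp.sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
[/code]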
Where else might there be an issue... firewall, Spy Sweeper, NOD32? I think it is odd that the 30MB files are not a problem but the big ones are.
I have a 512K DSL connection, and Core FTP Pro is running on a fast 3.2GHz box with 1GB RAM, a big fast drive, etc.
I don't know whether Windows Firewall is active or not.
Any input appreciated.