>-----Original Message-----
>From: owner-gftp-users@xxxxxxxx [mailto:owner-gftp-users@xxxxxxxx] On Behalf Of
>Alonso Acuña
>Sent: Friday, March 26, 2010 6:40 PM
>To: gftp-users@xxxxxxxx
>Subject: Re: [gftp] marketing: target use case and target audience of gftp
>
>
>> gFTP still doesn't handle transfers well when it's tens of thousands of
>> files and thousands of folders. It just spins for a while and quietly
>> exits/crashes.
>>
>> It's usable, especially for making SSH2 connections, but it doesn't do
>> large transfers at all.
>>
>I believe it is a good option for single large transfers because of its
>resume capability. I recently uploaded a 2.5 GB file over a slow link for
>36 hours, with one disconnection and resume.

It definitely is; for single large uploads it's great, even fantastic.

Have you, however, tried transferring, say, roughly 30,000-40,000 files in
about 4,000-10,000 folders in one go? The question may seem strange, but
this is rather common here when molecular modeling projects are handed over
from one PhD to another.

The only workable approach to transfers with folders and files in the (tens
of) thousands is to fetch them folder by folder. With 4,000 folders, or even
more in some cases(!), this is a daylong chore, as you need to start each
transfer manually.

Command-line sftp is a solution, of course, but it doesn't give the user a
good overview of what is about to be transferred; they prefer the GUI
point-and-click interface gFTP offers.

--
/Sorin
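For what it's worth, the folder-by-folder chore described above can be scripted outside the GUI. Below is a minimal sketch assuming lftp is installed; the user, host, remote path, and folder names are all hypothetical placeholders, not anything from this thread, and it only prints the commands (a dry run) rather than starting transfers:

```shell
# Dry-run sketch: emit one lftp "mirror" invocation per top-level folder,
# instead of starting each of the thousands of folder transfers by hand.
# user@host, /remote/..., and the folder names are assumptions for
# illustration only.
print_mirror_cmds() {
  for dir in "$@"; do
    printf 'lftp -e "mirror -R %s /remote/%s; quit" sftp://user@host\n' \
      "$dir" "$dir"
  done
}

# Example: two hypothetical project folders.
print_mirror_cmds projA projB
```

Dropping the `echo`-style dry run and executing each printed command would upload every folder unattended, at the cost of the visual overview the GUI gives.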