I have a Complete FTP server set up in a datacenter with 100 Mbps connectivity.
My dev machine is on a different ISP, a cable modem with 5-7 Mbps up / 20 Mbps down.
I am putting together an application that will primarily upload files over SFTP
using edtFTP.
After spending a lot of time testing transfer rates, I found that they vary
greatly from machine to machine, even though everything is on the same internet
connection (I'm using DU Meter to monitor the rates).
For example, a Win7 x64 machine could only upload at around 500 Kbps (not KB, that's right, half of 1 Mbps),
while a VMware virtual machine inside that same host could transfer at
5-7 Mbps, sharing the same bridged network card.
I have found that setting ftpConnection.TransferBufferSize fixes this issue;
with ftpConnection.TransferBufferSize = 2048, for example, the host machine can now
transfer at higher rates.
My question: is there a good method built into the DLL to determine the optimal
TransferBufferSize? Or what's the best way to handle this so I can ensure my app
uses the proper amount of bandwidth for my clients? I suppose I could write a method
to do some speed tests (rough sketch at the end of this post), but there must be a better way, yes?
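In case it clarifies what I mean by a speed test: upload the same small file with a few candidate buffer sizes and keep the fastest. This is only a sketch under my own assumptions, namely that TransferBufferSize can be changed between transfers on an open connection; the candidate sizes, method name, and test file are all my own invention, and I'd rather not ship this if the library already handles it.

```csharp
using System;
using System.Diagnostics;
using EnterpriseDT.Net.Ftp;

// Rough sketch of the speed-test fallback: time the same upload with several
// candidate buffer sizes and return the fastest. All names here (candidateSizes,
// the remote probe file name) are placeholders of my own.
static int FindBestTransferBufferSize(SecureFTPConnection ftpConnection, string localTestFile)
{
    int[] candidateSizes = { 2048, 4096, 8192, 16384, 32768, 65536 };
    int bestSize = candidateSizes[0];
    double bestSeconds = double.MaxValue;

    foreach (int size in candidateSizes)
    {
        // Assumption: the buffer size can be adjusted between transfers.
        ftpConnection.TransferBufferSize = size;

        Stopwatch timer = Stopwatch.StartNew();
        ftpConnection.UploadFile(localTestFile, "speedtest.tmp");
        timer.Stop();

        if (timer.Elapsed.TotalSeconds < bestSeconds)
        {
            bestSeconds = timer.Elapsed.TotalSeconds;
            bestSize = size;
        }
    }

    ftpConnection.DeleteFile("speedtest.tmp");  // clean up the probe file
    return bestSize;
}
```

Obviously a larger test file would smooth out the noise, but that burns my clients' bandwidth up front every time, which is exactly why I'm hoping there's something built in.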