Get large files from FTP with a Python library


I need to download large files (>30 GB per file) from an FTP server. I'm using ftplib from the Python standard library, but there are pitfalls: when I download a large file, I cannot use the connection anymore once the file finishes. I get an EOF error afterwards, the connection is closed (due to a timeout?), and each succeeding file fails with error 421.

From what I have read, there are two connections: a data channel and a control channel. The data channel seems to work correctly (I can download the file completely), but the control channel times out in the meantime. I have also read that ftplib (and other Python FTP libraries) are not well suited to large files and may only support files up to around 1 GB. There is a similar question on this topic here: How to download big file in Python via ftp (with monitoring & reconnect)? But it is not quite the same, because my files are huge in comparison.

My current code looks like this:

```python
import ftplib
import tempfile

ftp = ftplib.FTP_TLS()
ftp.connect(host=server, port=port)
ftp.login(user=user, passwd=password)
ftp.prot_p()
ftp.cwd(folder)

for name in ftp.nlst():
    fd, local_filename = tempfile.mkstemp()
    f = open(fd, "wb")
    ftp.retrbinary('RETR %s' % name, callback=f.write, blocksize=8192)
    f.close()
```

Is there a tweak, or another library, that I can use to support huge files?

If you experience issues with standard FTP, you can try using a different protocol that is designed to handle such large files.

A number of suitable solutions exist. rsync is a good way to start.
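If you have to stay with FTP, one common workaround for the dying control channel is to reconnect and resume the transfer with the REST command, which `ftplib.FTP.retrbinary` exposes via its `rest` parameter. Below is a rough, untested-against-your-server sketch of that idea; the `make_ftp` callable (a fresh, logged-in connection already changed into the right folder) is an assumption of this sketch, not part of ftplib:

```python
import ftplib
import os


def download_with_resume(make_ftp, remote_name, local_path,
                         blocksize=8192, max_retries=5):
    """Download remote_name to local_path, reconnecting and resuming
    after each dropped connection.

    make_ftp is assumed to be a callable returning a fresh, logged-in
    FTP/FTP_TLS object already cwd'd to the correct folder.
    """
    for attempt in range(max_retries):
        # Resume from however many bytes are already on disk.
        offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0
        ftp = make_ftp()
        try:
            with open(local_path, "ab") as f:
                # rest=offset makes ftplib send a REST command, so the
                # server starts transmitting from that byte offset.
                ftp.retrbinary('RETR %s' % remote_name,
                               callback=f.write,
                               blocksize=blocksize,
                               rest=offset)
            return  # transfer completed without an error
        except (ftplib.error_temp, EOFError, OSError):
            # 421 / EOF / dropped connection: loop, reconnect, resume.
            continue
        finally:
            try:
                ftp.close()
            except OSError:
                pass
    raise RuntimeError("giving up on %s after %d attempts"
                       % (remote_name, max_retries))
```

Note that this only helps if the server actually supports REST for resumed transfers; if it does not, switching protocols (rsync, SFTP) remains the more robust option.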
