File copy times out when file is too big

File copy times out when the file is too big, due to a problem with the
paramiko write implementation.

The proposed fix comes in two parts:
 1) Change the paramiko file write to use putfo;
 2) Increase the default file copy timeout.

Change-Id: I9e9d2873d95923cbd8c4729b3a674dfb1b8c2ec1
Story: #1705762
Telles Nobrega 2018-03-13 00:01:03 -03:00
parent 3734ef7f5e
commit 68b447d7f9
1 changed file with 3 additions and 4 deletions


@@ -73,7 +73,7 @@ ssh_config_options = [
         'ssh_timeout_interactive', default=1800, min=1,
         help="Overrides timeout for interactive ssh operations, in seconds"),
     cfg.IntOpt(
-        'ssh_timeout_files', default=120, min=1,
+        'ssh_timeout_files', default=600, min=1,
         help="Overrides timeout for ssh operations with files, in seconds"),
 ]
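As a minimal, standalone sketch of what the second part means for deployments (illustrative only; sahara registers and consumes this option elsewhere in ssh_remote.py), a configuration that does not override ssh_timeout_files now waits up to 600 seconds for ssh file operations:

# Illustrative sketch, not sahara's actual option wiring.
from oslo_config import cfg

opts = [
    cfg.IntOpt(
        'ssh_timeout_files', default=600, min=1,
        help="Overrides timeout for ssh operations with files, in seconds"),
]

conf = cfg.ConfigOpts()
conf.register_opts(opts)
conf([])  # no config files or CLI args, so the default applies
print(conf.ssh_timeout_files)  # 600, unless overridden in sahara.conf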
@@ -270,9 +270,8 @@ def _get_http_client(host, port, proxy_command=None, gateway_host=None,


 def _write_fl(sftp, remote_file, data):
-    fl = sftp.file(remote_file, 'w')
-    fl.write(data)
-    fl.close()
+    write_data = paramiko.py3compat.StringIO(data)
+    sftp.putfo(write_data, remote_file)


 def _append_fl(sftp, remote_file, data):
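To make the first part concrete, here is a standalone sketch of the putfo-based write outside of sahara; the host, credentials, payload and remote path are hypothetical. SFTPClient.putfo() reads from any file-like object and streams it to the remote side in pipelined chunks, which is intended to be considerably faster for large payloads than pushing the whole string through a single blocking SFTPFile.write() call.

# Standalone illustration of the putfo pattern adopted above; the host,
# credentials, payload and remote path are hypothetical.
import io

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('cluster-node.example.com', username='ubuntu',
               key_filename='/path/to/private_key')
sftp = client.open_sftp()
try:
    data = 'contents of a potentially very large file'
    # putfo() accepts any file-like object; the commit wraps the string in
    # paramiko.py3compat.StringIO, while a BytesIO of the encoded payload
    # behaves the same way here.
    sftp.putfo(io.BytesIO(data.encode('utf-8')), '/tmp/destination-file')
finally:
    sftp.close()
    client.close()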